πŸ“‹ Model Description


license: mit
base_model:
  • moonshotai/Kimi-Dev-72B
tags:
  • code
  • unsloth
  • swebench
  • software
  • issue-resolving

Unsloth Dynamic 2.0 quantization achieves superior accuracy and outperforms other leading quants.

We introduce Kimi-Dev-72B, our new open-source coding LLM for software engineering tasks. Kimi-Dev-72B achieves a new state-of-the-art on SWE-bench Verified among open-source models.

  • Kimi-Dev-72B scores 60.4% on SWE-bench Verified, surpassing the runner-up and setting a new state-of-the-art result among open-source models.
  • Kimi-Dev-72B is optimized via large-scale reinforcement learning. It autonomously patches real repositories in Docker and gains rewards only when the entire test suite passes. This ensures correct and robust solutions, aligning with real-world development standards.
  • Kimi-Dev-72B is available for download and deployment on Hugging Face and GitHub. We welcome developers and researchers to explore its capabilities and contribute to development.


Figure: Performance of Open-source Models on SWE-bench Verified.

Quick Start

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "moonshotai/Kimi-Dev-72B"

# Load the model and tokenizer; device_map="auto" spreads the weights across available GPUs.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

prompt = "Give me a short introduction to large language models."
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": prompt}
]

# Turn the message list into a single prompt string using the model's chat template.
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

generated_ids = model.generate(
    **model_inputs,
    max_new_tokens=512
)
# Strip the prompt tokens so only the newly generated tokens remain.
generated_ids = [
    output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]

response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(response)
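At 72B parameters, loading the model in bf16 requires several high-memory GPUs. A minimal sketch of loading it in 4-bit with bitsandbytes instead, assuming bitsandbytes is installed and the combined GPU memory is sufficient (the quantization settings shown are common defaults, not values verified for this model):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# 4-bit NF4 quantization config (illustrative defaults).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "moonshotai/Kimi-Dev-72B",
    quantization_config=bnb_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("moonshotai/Kimi-Dev-72B")

The rest of the Quick Start snippet above works unchanged with the quantized model.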

Citation

@misc{kimidev72b_2025,
  title        = {Introducing Kimi-Dev: A Strong and Open-source Coding LLM for Issue Resolution},
  author       = {{Kimi-Dev Team}},
  year         = {2025},
  month        = {June},
  url          = {https://www.moonshot.cn/Kimi-Dev}
}

πŸ“‚ GGUF File List

πŸ“ Filename πŸ“¦ Size ⚑ Download
Kimi-Dev-72B-IQ4_NL.gguf
LFS Q4
38.48 GB Download
Kimi-Dev-72B-IQ4_XS.gguf
LFS Q4
37.02 GB Download
Kimi-Dev-72B-Q3_K_M.gguf
LFS Q3
35.11 GB Download
Kimi-Dev-72B-Q3_K_S.gguf
LFS Q3
32.12 GB Download
Kimi-Dev-72B-Q4_0.gguf
Recommended LFS Q4
38.54 GB Download
Kimi-Dev-72B-Q4_1.gguf
LFS Q4
42.56 GB Download
Kimi-Dev-72B-UD-IQ1_M.gguf
LFS
22.35 GB Download
Kimi-Dev-72B-UD-IQ1_S.gguf
LFS
21.47 GB Download
Kimi-Dev-72B-UD-IQ2_M.gguf
LFS Q2
27.56 GB Download
Kimi-Dev-72B-UD-IQ2_XXS.gguf
LFS Q2
23.94 GB Download
Kimi-Dev-72B-UD-IQ3_XXS.gguf
LFS Q3
29.67 GB Download
Kimi-Dev-72B-UD-Q2_K_XL.gguf
LFS Q2
28.25 GB Download
Kimi-Dev-72B-UD-Q3_K_XL.gguf
LFS Q3
35.67 GB Download