Mistral Expands Open-Source Coding Models with Devstral 2507 Lineup
Mistral AI has launched two new AI coding models, Devstral Small 1.1 and Devstral Medium 2507, targeting developers and enterprise AI teams. Both post competitive results on the SWE-Bench Verified coding benchmark, with Devstral Medium exceeding some GPT-4.1 variants. With long context windows, function calling, and an open-source Small variant, Mistral is sharpening its position in the code-generation market.
Mistral AI has introduced an upgraded coding model suite focused on developer workflows, open-source deployment, and affordability. Devstral Small 1.1 is a 24-billion-parameter model released under the permissive Apache 2.0 license. It features a 128,000-token context window and function-calling capabilities, and scores 53.6 percent on the SWE-Bench Verified benchmark, placing it above earlier Mistral models and many competing open-source code-generation models. The Small variant is available in multiple quantized GGUF formats on Hugging Face, suited to efficient local inference with popular runtimes such as llama.cpp and LM Studio.
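For readers who want to try the open weights locally, here is a minimal sketch using llama-cpp-python and huggingface_hub. The repository ID and GGUF filename are illustrative assumptions, not confirmed names; substitute whichever quantization you actually find on the model's Hugging Face page.

```python
# Minimal local-inference sketch for a GGUF build of Devstral Small 1.1.
# Assumptions: llama-cpp-python and huggingface_hub are installed, and the
# repo_id / filename below match a real quantization on Hugging Face
# (check the actual listing before running).
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one quantized GGUF file (repo and filename are placeholders).
model_path = hf_hub_download(
    repo_id="mistralai/Devstral-Small-2507_gguf",   # assumed repo name
    filename="Devstral-Small-2507-Q4_K_M.gguf",     # assumed quantization
)

# Load the model; n_ctx can be raised toward the advertised 128k window
# if memory allows, and n_gpu_layers=-1 offloads all layers to the GPU.
llm = Llama(model_path=model_path, n_ctx=32768, n_gpu_layers=-1)

response = llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": "Write a Python function that parses a CSV file "
                       "and returns its rows as dictionaries.",
        }
    ],
    max_tokens=512,
)
print(response["choices"][0]["message"]["content"])
```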
The new Devstral Medium 2507 model is exclusive to Mistral's API and aims to compete directly with top-tier coding models. It delivers 61.6 percent on SWE-Bench Verified, outperforming commercial models such as GPT-4.1 mini and Gemini 2.5 Pro at a fraction of their price. Mistral lists Medium at $0.40 per million input tokens and $2.00 per million output tokens, while Small is priced even lower at $0.10 and $0.30, respectively.
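As a rough illustration of the hosted option, the sketch below calls the API with the official mistralai Python client and estimates the cost of a single request from the list prices quoted above. The model identifier devstral-medium-2507 and the usage-field names are assumptions; verify both against Mistral's API documentation.

```python
# Sketch of a Devstral Medium call via Mistral's hosted API, plus a rough
# per-request cost estimate from the quoted list prices. The model name and
# the usage field names are assumptions, not confirmed values.
import os

from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

resp = client.chat.complete(
    model="devstral-medium-2507",  # assumed model identifier
    messages=[
        {
            "role": "user",
            "content": "Refactor this function to remove the nested loops: ...",
        }
    ],
)
print(resp.choices[0].message.content)

# Back-of-the-envelope cost at $0.40 / 1M input and $2.00 / 1M output tokens.
usage = resp.usage
cost = usage.prompt_tokens / 1e6 * 0.40 + usage.completion_tokens / 1e6 * 2.00
print(f"Approximate request cost: ${cost:.6f}")
```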
With this release, Mistral is clearly targeting enterprise adoption in the coding domain by offering high-performance models with transparency and flexibility. The open availability of Devstral Small 1.1 makes it attractive to AI engineers building autonomous code agents and LLM-powered developer tools. Meanwhile, the Devstral Medium API is positioned as a cost-effective challenger for organizations seeking high coding accuracy without the financial overhead of closed models. This expansion signals Mistral's growing ambition to anchor itself in the competitive AI coding ecosystem.