by DeepSeek
DeepSeek Coder V2 236B is a 236B-parameter Mixture-of-Experts (MoE) model with 21B active parameters per token, trained on 2T tokens of code. It scores 65.8% on SWE-bench, 90.2% on HumanEval, and 88.4% on MBPP.
| Parameters | Architecture | Context | Provider |
| --- | --- | --- | --- |
| 236B (MoE) | Mixture of Experts | 128K | DeepSeek |
Drop-in replacement for the OpenAI API: just change the base URL (see the sketch after this feature list).
Only pay for actual GPU compute time. No idle costs.
99.9% uptime SLA, SOC 2 compliant, dedicated support.
Scales from zero to thousands of requests automatically.
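Because the endpoint is OpenAI-compatible, the official OpenAI Python SDK can be pointed at it by changing the base URL. This is a minimal sketch only: the base URL and model identifier below are placeholders, not Fleek's actual values, so substitute the ones Fleek provides.

```python
# Minimal sketch: pointing the official OpenAI Python SDK at an
# OpenAI-compatible endpoint. The base_url and model name are
# placeholders, not Fleek's actual values.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example-host.com/v1",  # hypothetical endpoint
    api_key="YOUR_API_KEY",
)

resp = client.chat.completions.create(
    model="deepseek-coder-v2",  # hypothetical model identifier
    messages=[
        {"role": "user", "content": "Write a function that reverses a string."}
    ],
)
print(resp.choices[0].message.content)
```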
See how much you'd save running DeepSeek Coder V2 on Fleek
| Spec | Value |
| --- | --- |
| Model Name | DeepSeek Coder V2 |
| Total Parameters | 236B (MoE) |
| Active Parameters | 21B |
| Architecture | Mixture of Experts |
| Context Length | 128K tokens |
| Inference Speed | 25,000 tokens/sec |
| Provider | DeepSeek |
| Release Date | Aug 15, 2025 |
| License | MIT |
| HuggingFace | https://huggingface.co/deepseek-ai/DeepSeek-Coder-V2-236B |
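If you want to run the open weights yourself rather than through Fleek, the sketch below shows loading the checkpoint listed above with Hugging Face transformers. It only illustrates the API calls, using the repo ID from the table; serving a 236B MoE locally requires multi-GPU hardware, and the prompt is purely illustrative.

```python
# Minimal sketch: loading the checkpoint from the table above with
# Hugging Face transformers. Assumes sufficient multi-GPU hardware.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-Coder-V2-236B"  # repo ID as listed above

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",      # take bf16/fp16 from the checkpoint config
    device_map="auto",       # shard layers across available GPUs
    trust_remote_code=True,  # DeepSeek V2 repos ship custom model code
)

inputs = tokenizer("# Write a quicksort in Python\n", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```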
| Benchmark | Score | Description |
| --- | --- | --- |
| SWE-bench | 65.8% | Software engineering benchmark based on real GitHub issues |
| HumanEval | 90.2% | OpenAI code generation benchmark |
| MBPP | 88.4% | Mostly Basic Python Problems |
Click any benchmark to view the official leaderboard. Rankings are among open-source models.
Join the waitlist for early access. Start free with $5 in credits.