
    DeepSeek Coder V2

    236B MoE coding model that rivals GPT-4 Turbo on code tasks.

    Tags: open-source, code, moe, coding-specialist, 128k-context

    Architecture: deepseek
    Parameters: 236B
    Context Length: 131,072 tokens
    License: DeepSeek License

    About DeepSeek Coder V2

    DeepSeek Coder V2 is a 236B-parameter Mixture-of-Experts (MoE) model with 21B active parameters, specialized for code. It supports 338 programming languages and a 128K-token context window for working with large codebases.
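    For readers evaluating the model, the minimal sketch below shows one way to run a code-completion prompt with the Hugging Face transformers library. The model ID, dtype, and generation settings are assumptions for illustration rather than details from this listing, and the full 236B checkpoint requires substantial multi-GPU hardware to run as written.

    # Minimal sketch (assumptions): querying DeepSeek Coder V2 via Hugging Face transformers.
    # "deepseek-ai/DeepSeek-Coder-V2-Instruct" is an assumed Hub identifier; the 236B MoE
    # weights need several high-memory GPUs, so smaller or quantized variants may be more
    # practical for local experiments.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_ID = "deepseek-ai/DeepSeek-Coder-V2-Instruct"  # assumed model ID

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # use the checkpoint's native precision
        device_map="auto",    # shard the 236B-total / 21B-active MoE weights across GPUs
        trust_remote_code=True,
    )

    prompt = "# Write a Python function that checks whether a string is a palindrome\n"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)

    # Print only the newly generated tokens, not the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    print(tokenizer.decode(new_tokens, skip_special_tokens=True))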

    Author: Community
    Category: code
    Downloads: 987,654
