LLM TrustAI Model Discovery Platform

    Mistral 7B v0.3

    Efficient 7B parameter model with sliding window attention. Excellent for local deployment.

    Tags: open-source, local-friendly, efficient, apache-2.0

    Architecture: mistral
    Parameters: 7B
    Context Length: 32,768 tokens
    License: Apache 2.0
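In practice, the 32,768-token context length listed above bounds the prompt plus the generated continuation together. A minimal sketch of that budgeting check; `fits_context` is a hypothetical helper, not part of any Mistral API:

```python
CONTEXT_LENGTH = 32_768  # context length from the spec above

def fits_context(prompt_tokens: int, max_new_tokens: int,
                 context_length: int = CONTEXT_LENGTH) -> bool:
    # The prompt and the requested continuation share one window,
    # so their sum must not exceed the model's context length.
    return prompt_tokens + max_new_tokens <= context_length

fits_context(30_000, 2_000)  # True: 32,000 <= 32,768
fits_context(31_000, 2_000)  # False: 33,000 > 32,768
```

When a prompt fails this check, the usual options are truncating the prompt or lowering the generation budget.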

    About Mistral 7B v0.3

    Mistral 7B v0.3 is a powerful yet efficient model using grouped-query attention and sliding window attention. It outperforms LLaMA 2 13B on most benchmarks.
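The sliding-window attention mentioned above restricts each token to attending over only the most recent W tokens rather than the entire prefix, which keeps per-token attention cost constant as the sequence grows. A minimal sketch of the resulting mask, with a toy window size (the original Mistral 7B paper used W = 4,096):

```python
def sliding_window_mask(seq_len: int, window: int) -> list[list[int]]:
    # Causal sliding-window mask: token i may attend to token j
    # only when j is in the last `window` positions up to i,
    # i.e. 0 <= i - j < window.
    return [
        [1 if 0 <= i - j < window else 0 for j in range(seq_len)]
        for i in range(seq_len)
    ]

mask = sliding_window_mask(6, 3)
# Row 0 attends only to itself; row 4 attends to positions 2..4.
```

Stacking L such layers still lets information flow across roughly L × W positions, since each layer moves context forward by up to W tokens.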

    Author: Community
    Category: text generation
    Downloads: 9,823,451
    License: Apache 2.0