LLM TrustAI Model Discovery Platform

    StarCoder2 15B


BigCode's 15B-parameter code model, trained on 600+ programming languages.

Tags: open-source, code, fill-in-middle, multilingual-code

Architecture: starcoder
Parameters: 15B
Context Length: 16,384 tokens
License: BigCode OpenRAIL-M

    About StarCoder2 15B

StarCoder2 15B is trained on The Stack v2, a dataset covering 619 programming languages. It supports fill-in-the-middle completion and a 16,384-token context window.
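Fill-in-the-middle works by rearranging the code around the gap into a single prompt with special sentinel tokens, so the model generates the missing span last. A minimal sketch of that prompt construction is below; the `<fim_prefix>`/`<fim_suffix>`/`<fim_middle>` token names follow the BigCode convention used by the StarCoder family, but you should verify them against the model's own tokenizer before relying on them.

```python
# Sketch of fill-in-the-middle (FIM) prompt construction for a
# StarCoder-family model. Token names assume the BigCode convention;
# check the model tokenizer's special tokens before use.

FIM_PREFIX = "<fim_prefix>"
FIM_SUFFIX = "<fim_suffix>"
FIM_MIDDLE = "<fim_middle>"


def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Wrap the code before and after the gap; the model is expected
    to generate the missing middle after the <fim_middle> token."""
    return f"{FIM_PREFIX}{prefix}{FIM_SUFFIX}{suffix}{FIM_MIDDLE}"


# Example: ask the model to fill in the body of `add`.
prompt = build_fim_prompt(
    prefix="def add(a, b):\n    return ",
    suffix="\n\nprint(add(2, 3))\n",
)
print(prompt)
```

The resulting string would then be passed to the model's generate call (e.g. via the Hugging Face `transformers` library), with decoding stopped at the end-of-text token to recover only the infilled middle.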

Author: Community
Category: code
Downloads: 456,789
