Mixtral 8x22B

by Mistral AI

A sparse Mixture-of-Experts model that routes each token through only a subset of its experts, activating just 39B of its 141B parameters per token.
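Concretely, the sparsity comes from routing: a small gating network scores eight expert feed-forward blocks for each token and only the top two run. Below is a minimal PyTorch sketch of that top-2 routing; the dimensions and layer names are illustrative, not Mixtral's actual implementation.

```python
# Minimal sketch of Mixtral-style sparse top-2 expert routing.
# All names and dimensions here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    def __init__(self, dim=512, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, num_experts, bias=False)  # router
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.SiLU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x):                                    # x: (tokens, dim)
        weights, chosen = self.gate(x).topk(self.top_k, -1)  # pick 2 experts per token
        weights = F.softmax(weights, dim=-1)                 # normalize over the chosen 2
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e                  # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out                                           # only 2 of 8 experts ran per token

print(SparseMoELayer()(torch.randn(16, 512)).shape)  # torch.Size([16, 512])
```

This is why the card lists 39B active out of 141B total: every expert's weights must sit in memory, but only roughly a quarter of them participate in any given token's forward pass.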

Technical Specifications

Parameters

141B total (39B active per token)

Context Window

64K tokens

Category

Open Source

Released

April 2024

Pricing Tier

Free

Capabilities

Text, Code, Multilingual, Function Calling
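Function calling is exposed through the instruct variant's chat template. Here is a hedged sketch using Hugging Face transformers, assuming the repository's chat template accepts the tools= argument; get_weather is a hypothetical tool, and the exact tool-call output format is model-specific.

```python
# Sketch of function calling with the instruct variant via the
# transformers chat template. get_weather is a hypothetical tool.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x22B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

def get_weather(city: str) -> str:
    """Get the current weather for a city.

    Args:
        city: Name of the city.
    """
    ...

messages = [{"role": "user", "content": "What's the weather in Paris?"}]
inputs = tokenizer.apply_chat_template(
    messages,
    tools=[get_weather],          # schema derived from the signature and docstring
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)
output = model.generate(inputs, max_new_tokens=128)
# Expect a tool-call block in the completion, to be parsed and executed by your code.
print(tokenizer.decode(output[0][inputs.shape[-1]:]))
```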

Pricing Details

Open-weight under Apache 2.0. Self-hosting only.

Pros

  • Efficient MoE architecture
  • Strong multilingual performance
  • Fully open with permissive license
  • Good cost-to-performance ratio

Cons

  • Smaller context window (64K) than newer frontier models
  • No vision capabilities
  • Requires significant compute to self-host
  • Superseded by Mistral Large 3

Best Use Cases

  • Multilingual applications
  • Code generation
  • Self-hosted enterprise deployment (see the sketch after this list)
  • Research
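For the self-hosted deployment case, a minimal sketch using vLLM with tensor parallelism; the GPU count and sampling settings are illustrative and depend on your hardware (the 141B weights occupy roughly 280 GB at 16-bit precision). The French prompt doubles as a check on the multilingual and code-generation use cases above.

```python
# Minimal self-hosting sketch with vLLM. tensor_parallel_size=8 is an
# assumption about available GPUs, not a requirement of the model.
from vllm import LLM, SamplingParams

llm = LLM(
    model="mistralai/Mixtral-8x22B-Instruct-v0.1",
    tensor_parallel_size=8,          # shard the expert weights across GPUs
)
params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["Écris une fonction Python qui inverse une chaîne."], params)
print(outputs[0].outputs[0].text)
```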
