Try Mixtral 8x22B in the Workbench
Run this model interactively, tune parameters, and compare outputs.
open-mixtral-8x22b
Mixtral-8x22B is an LLM that excels at multilingual tasks across English, French, Italian, German, and Spanish, and offers strong capabilities in mathematics and coding. It uses a Sparse Mixture of Experts architecture, delivering efficient performance with 39 billion active parameters out of 141 billion total.
Noteworthy use cases for Mixtral-8x22B include multilingual workloads and complex mathematical computation.
| Metric | Value |
|---|---|
| Parameter Count | 141 billion |
| Mixture of Experts | Yes |
| Active Parameter Count | 39 billion |
| Context Length | 64,000 tokens |
| Multilingual | Yes |
| Quantized | Unknown |
Example request
- Minimal
- Basic parameters
- All parameters
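As a rough illustration of the Minimal variant, the sketch below sends a chat-completion request for open-mixtral-8x22b over HTTP. The base URL, path, authentication scheme, and payload fields are assumptions, not the platform's documented API; check them against the model's json_request_schema before use.

```python
# Hypothetical minimal request for open-mixtral-8x22b.
# The base URL, endpoint path, and field names below are assumptions;
# consult the json_request_schema from the models endpoint for the exact shape.
import os
import requests

BASE_URL = "https://api.example.com/v1"   # assumption: replace with your provider's base URL
API_KEY = os.environ["API_KEY"]           # assumption: bearer-token auth

response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "open-mixtral-8x22b",
        "messages": [{"role": "user", "content": "Summarize this paragraph in French."}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```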
Fetch model details
The models endpoint returns the full model object, including its json_request_schema.
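As an illustration, a model-details lookup might look like the following sketch. The base URL and path are assumptions, and the response is assumed to carry the json_request_schema field described above.

```python
# Hypothetical lookup of the full model object for open-mixtral-8x22b.
# The base URL and endpoint path are assumptions.
import os
import requests

BASE_URL = "https://api.example.com/v1"   # assumption
API_KEY = os.environ["API_KEY"]           # assumption: bearer-token auth

resp = requests.get(
    f"{BASE_URL}/models/open-mixtral-8x22b",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
resp.raise_for_status()
model = resp.json()
# Inspect the request schema the endpoint is assumed to return.
print(model.get("json_request_schema"))
```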