# Mixtral 8x7B (`open-mixtral-8x7b`)
Mixtral 8x7B is a sparse mixture-of-experts (MoE) large language model: each token activates only a fraction of the total parameters, keeping inference cost low relative to model capacity. Its multilingual capabilities and cost-performance ratio make it a strong contender against models such as GPT-3.5.
| Metric | Value |
|---|---|
| Total parameter count | 47 billion |
| Mixture of experts | Yes (8 experts, top-2 routing) |
| Active parameters per token | 13 billion |
| Context length | 32k tokens |
| Multilingual | Yes |
| Quantized | Unknown |
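To make the gap between total and active parameters concrete, here is an illustrative sketch of top-2 expert routing, the gating scheme Mixtral is reported to use. The linear "experts", dimensions, and router weights are stand-ins for illustration, not the real architecture:

```python
import numpy as np

def top2_moe_forward(x, experts, router_w):
    """Route one token through the 2 highest-scoring experts."""
    logits = router_w @ x                       # one score per expert
    top2 = np.argsort(logits)[-2:]              # indices of the two best experts
    weights = np.exp(logits[top2] - logits[top2].max())
    weights /= weights.sum()                    # softmax over the selected pair
    # Only 2 of the 8 expert FFNs run per token -- this is why the active
    # parameter count (~13B) is far below the total (~47B).
    return sum(w * experts[i](x) for w, i in zip(weights, top2))

rng = np.random.default_rng(0)
d, n_experts = 16, 8
# Stand-in "experts": plain linear maps instead of real FFN blocks.
experts = [(lambda W: (lambda v: W @ v))(rng.normal(size=(d, d)))
           for _ in range(n_experts)]
router_w = rng.normal(size=(n_experts, d))
print(top2_moe_forward(rng.normal(size=d), experts, router_w).shape)  # (16,)
```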
## Example request

Requests come in three variants; the sketch after this list covers the first two:

- Minimal
- Basic parameters
- All parameters
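A hedged sketch of the minimal and basic-parameter variants, assuming the Mistral chat-completions endpoint at `https://api.mistral.ai/v1/chat/completions`, an API key in a `MISTRAL_API_KEY` environment variable, and sampling parameter names (`temperature`, `top_p`, `max_tokens`) that follow the common chat-completions convention; adjust for your provider:

```python
import os
import requests

API_URL = "https://api.mistral.ai/v1/chat/completions"  # assumed endpoint
headers = {
    "Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}",  # assumed env var
    "Content-Type": "application/json",
}

# Minimal request: model and messages only.
minimal = {
    "model": "open-mixtral-8x7b",
    "messages": [
        {"role": "user", "content": "Summarize mixture-of-experts in one sentence."}
    ],
}

# Basic parameters: common sampling controls layered on top of the minimal body.
basic = {
    **minimal,
    "temperature": 0.7,
    "top_p": 0.9,
    "max_tokens": 256,
}

resp = requests.post(API_URL, headers=headers, json=basic, timeout=30)
resp.raise_for_status()
# Response shape assumed to follow the chat-completions convention.
print(resp.json()["choices"][0]["message"]["content"])
```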
## Fetch model details

The models endpoint returns the full model object, including its `json_request_schema`.
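A minimal sketch of retrieving the model object, assuming a REST route of the form `GET /v1/models/{model_id}` and the same API-key convention as above; `json_request_schema` is the field named in this section:

```python
import os
import requests

BASE_URL = "https://api.mistral.ai/v1"  # assumed; substitute your provider's base URL

resp = requests.get(
    f"{BASE_URL}/models/open-mixtral-8x7b",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    timeout=30,
)
resp.raise_for_status()
model = resp.json()
# Full model object; per this section it includes json_request_schema.
print(model.get("json_request_schema"))
```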