
Try Mixtral 8x22B in the Workbench

Run this model interactively, tune parameters, and compare outputs.
Model ID: open-mixtral-8x22b

Mixtral-8x22B is an LLM that excels at multilingual tasks across English, French, Italian, German, and Spanish, and has strong capabilities in mathematics and coding. It uses a Sparse Mixture of Experts architecture, offering efficient performance with 39 billion active parameters out of 141 billion total. Noteworthy use cases for Mixtral-8x22B include handling multilingual tasks and performing complex mathematical computations.
Metric                  Value
Parameter Count         141 billion
Mixture of Experts      Yes
Active Parameter Count  39 billion
Context Length          64,000 tokens
Multilingual            Yes
Quantized*              Unknown
*Quantization is specific to the inference provider, and the model may be offered at different quantization levels by other providers.

Example request

Use the Workbench as a request builder: configure parameters for this model in the UI, then open the API tab to copy the exact cURL or Python call.
curl -X POST https://hub.oxen.ai/api/ai/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OXEN_API_KEY" \
  -d '{
  "model": "open-mixtral-8x22b",
  "messages": [
    {
      "role": "user",
      "content": "Hello, what can you do?"
    }
  ]
}'
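
The same request can also be made from Python with any HTTP client. Below is a minimal sketch using the requests library; the endpoint and payload mirror the cURL call above, and the response parsing assumes an OpenAI-style shape with a choices list (per the request-body note further down).

import os
import requests

# Same endpoint and payload as the cURL example above.
url = "https://hub.oxen.ai/api/ai/chat/completions"
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {os.environ['OXEN_API_KEY']}",
}
payload = {
    "model": "open-mixtral-8x22b",
    "messages": [
        {"role": "user", "content": "Hello, what can you do?"}
    ],
}

response = requests.post(url, headers=headers, json=payload)
response.raise_for_status()
# Assumes an OpenAI-style response with a `choices` list.
print(response.json()["choices"][0]["message"]["content"])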

Fetch model details

The models endpoint returns the full model object, including its json_request_schema.
curl -H "Authorization: Bearer $OXEN_API_KEY" https://hub.oxen.ai/api/ai/models/open-mixtral-8x22b
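
To inspect the schema programmatically, the same call can be sketched in Python. This assumes the response is a JSON object containing the json_request_schema field described above; other fields in the model object may vary.

import os
import requests

# Fetch the full model object for open-mixtral-8x22b.
url = "https://hub.oxen.ai/api/ai/models/open-mixtral-8x22b"
headers = {"Authorization": f"Bearer {os.environ['OXEN_API_KEY']}"}

response = requests.get(url, headers=headers)
response.raise_for_status()
model = response.json()

# json_request_schema describes the accepted request body
# (field name taken from the description above).
print(model.get("json_request_schema"))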

Request parameters

This model follows the standard OpenAI chat completions request body. See the chat completions reference for the full parameter list.
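
As an illustration, commonly used OpenAI-style parameters such as temperature, max_tokens, and top_p are passed directly in the request body. This is a sketch, not an exhaustive list; consult the chat completions reference for the parameters this endpoint actually supports.

import os
import requests

# Illustrative only: temperature, max_tokens, and top_p are standard
# OpenAI chat-completions parameters; see the chat completions
# reference for the full list supported by this endpoint.
payload = {
    "model": "open-mixtral-8x22b",
    "messages": [
        {"role": "user", "content": "Summarize the Mixture of Experts architecture in one sentence."}
    ],
    "temperature": 0.2,
    "max_tokens": 256,
    "top_p": 0.9,
}

response = requests.post(
    "https://hub.oxen.ai/api/ai/chat/completions",
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {os.environ['OXEN_API_KEY']}",
    },
    json=payload,
)
response.raise_for_status()
print(response.json())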