Try Mixtral 8x7B in the Workbench

Run this model interactively, tune parameters, and compare outputs.
Model ID: open-mixtral-8x7b

Mixtral 8x7B is a sparse mixture-of-experts LLM from Mistral AI. By activating only a subset of its parameters for each token, it delivers strong multilingual capability and cost-performance efficiency, making it a strong contender against models like GPT-3.5.
Parameter Count: 47 billion
Mixture of Experts: Yes
Active Parameter Count: 13 billion
Context Length: Unknown
Multilingual: Yes
Quantized*: Unknown

*Quantization is specific to the inference provider and may vary.

Example request

Use the Workbench as a request builder: configure parameters for this model in the UI, then open the API tab to copy the exact cURL or Python call.
curl -X POST https://hub.oxen.ai/api/ai/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OXEN_API_KEY" \
  -d '{
  "model": "open-mixtral-8x7b",
  "messages": [
    {
      "role": "user",
      "content": "Hello, what can you do?"
    }
  ]
}'
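The same request can be made from Python with only the standard library. A minimal sketch, using the endpoint, model ID, and body from the cURL example above; the call is only sent when OXEN_API_KEY is set in the environment:

```python
import json
import os
import urllib.request

API_URL = "https://hub.oxen.ai/api/ai/chat/completions"

# Same request body as the cURL example above.
payload = {
    "model": "open-mixtral-8x7b",
    "messages": [
        {"role": "user", "content": "Hello, what can you do?"}
    ],
}

api_key = os.environ.get("OXEN_API_KEY")
if api_key:
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
        # In the OpenAI-style response shape, the assistant text
        # lives at choices[0].message.content.
        print(reply["choices"][0]["message"]["content"])
```

Reading the key from the environment (rather than hardcoding it) mirrors the `$OXEN_API_KEY` variable in the cURL example.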

Fetch model details

The models endpoint returns the full model object, including its json_request_schema.
curl -H "Authorization: Bearer $OXEN_API_KEY" https://hub.oxen.ai/api/ai/models/open-mixtral-8x7b
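The same lookup in Python, pulling json_request_schema out of the response. This is a sketch: the field name comes from the text above, but the exact nesting of the response object is an assumption, and the request is only sent when OXEN_API_KEY is set:

```python
import json
import os
import urllib.request

MODEL_ID = "open-mixtral-8x7b"
MODELS_URL = f"https://hub.oxen.ai/api/ai/models/{MODEL_ID}"

api_key = os.environ.get("OXEN_API_KEY")
if api_key:
    req = urllib.request.Request(
        MODELS_URL,
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        model = json.load(resp)
    # json_request_schema describes the accepted request body
    # (assumes the field sits at the top level of the response).
    print(json.dumps(model.get("json_request_schema"), indent=2))
```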

Request parameters

This model follows the standard OpenAI chat completions request body. See the chat completions reference for the full parameter list.
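As an illustration, a request body using common fields from the standard OpenAI chat completions schema. The parameter names are standard OpenAI fields, not specific to this model; whether each one is honored is up to the inference provider, so treat this as a sketch:

```python
# Common OpenAI-style chat completions parameters (values are
# illustrative; provider support for each field may vary).
payload = {
    "model": "open-mixtral-8x7b",
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize mixture-of-experts in one sentence."},
    ],
    "temperature": 0.7,  # sampling randomness; lower is more deterministic
    "max_tokens": 256,   # upper bound on generated tokens
    "top_p": 0.9,        # nucleus sampling cutoff
    "stream": False,     # set True for server-sent event streaming
}
```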