Model ID: `ministral-8b-latest`
Ministral 8B is a Small Language Model (SLM) designed for edge computing and on-device applications. It excels in efficient inference, knowledge retrieval, common-sense reasoning, and multilingual understanding, making it ideal for low-latency tasks and resource-constrained environments.
Other noteworthy features of Ministral 8B include native function-calling support and interleaved sliding-window attention for faster, more memory-efficient inference.
| Metric | Value |
|---|---|
| Parameter Count | 8 billion |
| Mixture of Experts | No |
| Context Length | 128,000 tokens |
| Multilingual | Yes |
| Quantized | Unknown |
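The native function-calling support mentioned above is typically exercised by attaching a `tools` array to a chat request. The payload below is an illustrative sketch following the common JSON-Schema tool convention; the `get_weather` tool and the exact field names are assumptions, not confirmed API details:

```python
import json

# Hypothetical chat request exercising native function calling.
# Field names follow the widespread OpenAI-style convention and
# may differ from this provider's actual schema.
payload = {
    "model": "ministral-8b-latest",
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # illustrative tool name
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"type": "string"}
                    },
                    "required": ["city"],
                },
            },
        }
    ],
}

print(json.dumps(payload, indent=2))
```

The model can then respond with a tool call naming `get_weather` and its arguments instead of plain text, which your application executes before continuing the conversation.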
Example request
- Minimal
- Basic parameters
- All parameters
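A minimal request with a couple of basic parameters might look like the following sketch. The base URL, header names, and `API_KEY` environment variable are assumptions modeled on common OpenAI-style chat APIs; consult the provider's reference for the exact endpoint:

```python
import json
import os
import urllib.request

# Hypothetical endpoint; substitute the provider's real base URL.
API_URL = "https://api.example.com/v1/chat/completions"

payload = {
    "model": "ministral-8b-latest",
    "messages": [
        {"role": "user", "content": "Summarize edge computing in one sentence."}
    ],
    "max_tokens": 128,   # basic parameter: cap the response length
    "temperature": 0.7,  # basic parameter: sampling randomness
}

api_key = os.environ.get("API_KEY")  # assumed env var name
if api_key:
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))
```

Dropping `max_tokens` and `temperature` yields the minimal variant; the all-parameters variant adds whatever extra sampling and formatting options the model's request schema exposes.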
Fetch model details
The models endpoint returns the full model object, including its `json_request_schema`.
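Fetching the model object could be sketched as follows, assuming an OpenAI-style `GET /v1/models/{id}` route; the actual path, base URL, and authentication scheme are assumptions and may differ:

```python
import json
import os
import urllib.request

API_BASE = "https://api.example.com/v1"  # hypothetical base URL
MODEL_ID = "ministral-8b-latest"

api_key = os.environ.get("API_KEY")  # assumed env var name
if api_key:
    req = urllib.request.Request(
        f"{API_BASE}/models/{MODEL_ID}",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        model = json.load(resp)
    # The returned object includes json_request_schema, describing
    # the request parameters the model accepts.
    print(model.get("json_request_schema"))
```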