`ministral-3b-latest`
Ministral 3B is a 3B parameter LLM optimized for on-device and edge computing. It excels in knowledge, commonsense reasoning, and function-calling, outperforming larger models like Mistral 7B on most benchmarks. Supporting up to 128k context length, it’s ideal for orchestrating agentic workflows and specialist tasks with efficient inference.
| Metric | Value |
|---|---|
| Parameter Count | 3 billion |
| Mixture of Experts | No |
| Context Length | 128,000 tokens |
| Multilingual | Yes |
| Quantized* | Unknown |
Example request
- Minimal
- Basic parameters
- All parameters
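The request variants above can be sketched in Python. This is a minimal sketch, assuming the standard chat-completions endpoint `https://api.mistral.ai/v1/chat/completions` and an API key in the `MISTRAL_API_KEY` environment variable; parameter names such as `temperature` and `max_tokens` follow common chat-API conventions and should be checked against the model's `json_request_schema`.

```python
import json
import os
import urllib.request

API_URL = "https://api.mistral.ai/v1/chat/completions"  # assumed endpoint

# Minimal request: only the model and messages are required.
payload = {
    "model": "ministral-3b-latest",
    "messages": [{"role": "user", "content": "What is edge computing?"}],
}

# Basic parameters: optional sampling controls layered on top of the minimal body.
payload.update({"temperature": 0.7, "max_tokens": 256})


def send(payload: dict) -> dict:
    """POST the payload; requires MISTRAL_API_KEY to be set."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The "all parameters" variant extends the same body with every field the schema permits; the payload shape is identical, only the number of optional keys changes.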
Fetch model details
The models endpoint returns the full model object, including its `json_request_schema`.
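A short sketch of fetching those details, assuming a `GET /v1/models/{model_id}` route at `https://api.mistral.ai` and the same `MISTRAL_API_KEY` environment variable as above:

```python
import json
import os
import urllib.request

# Assumed route: GET the model object by its ID.
MODELS_URL = "https://api.mistral.ai/v1/models/ministral-3b-latest"


def fetch_model_details() -> dict:
    """Retrieve the full model object, including its json_request_schema."""
    req = urllib.request.Request(
        MODELS_URL,
        headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Inspecting the returned `json_request_schema` is the reliable way to see which request parameters this model version actually accepts.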