DBRX Base
LLM
Base pretrained DBRX model without instruction tuning. 132B total parameters, 36B active (MoE 16 experts, top-4). Pretrained on 12T tokens, 32K context window.
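The "36B active" figure follows from the mixture-of-experts design: for each token, a router scores all 16 experts and only the top-4 are run, with their outputs mixed by softmax-normalized gate weights. A minimal NumPy sketch of that top-k routing step (illustrative only, not the actual DBRX implementation; the function name and random logits are assumptions):

```python
import numpy as np

def moe_top_k_route(router_logits, k=4):
    """Pick the top-k experts for one token and softmax their logits.

    Sketch of top-k MoE routing as described for DBRX:
    16 experts total, 4 active per token.
    """
    top_k = np.argsort(router_logits)[::-1][:k]   # indices of the k largest logits
    selected = router_logits[top_k]
    weights = np.exp(selected - selected.max())   # numerically stable softmax
    weights /= weights.sum()                      # gate weights over the chosen experts
    return top_k, weights

# One token's router logits over 16 experts (random, for illustration).
rng = np.random.default_rng(0)
logits = rng.standard_normal(16)
experts, gate = moe_top_k_route(logits, k=4)
```

Only the 4 selected experts' feed-forward blocks execute for that token, which is why the active parameter count (36B) is far below the total (132B).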
Technical specification
Context window: 32K tokens
Parameters: 132B total, 36B active (MoE, 16 experts, top-4)
License: Databricks Open Model License
Weights access: Open weights
Knowledge cutoff: December 2023
Last updated: May 4, 2026
Modalities
Input: Text
Output: Text, Code
Capabilities
Coding
Reasoning
Long context
Multilingual
