DBRX MoE-B
Research
Mar 27, 2024


LLM

The mid-size member of the DBRX family: 23.5B total parameters, 6.6B active per token. Used internally to study MoE training efficiency. Achieves 45.5% on the Databricks Gauntlet with 1.7x fewer FLOPs than LLaMA2-13B (13B active parameters).
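The gap between total and active parameters above comes from mixture-of-experts routing: each token is sent to only a few experts, so only those experts' weights contribute parameters and FLOPs for that token. A minimal NumPy sketch of top-k routing, using a hypothetical expert count and layer size (not MoE-B's actual configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical MoE feed-forward layer; sizes are illustrative only,
# not DBRX MoE-B's real configuration.
d_model, d_ff = 8, 16
n_experts, top_k = 8, 2

# One weight pair per expert: total parameters grow with n_experts.
W_in = rng.standard_normal((n_experts, d_model, d_ff))
W_out = rng.standard_normal((n_experts, d_ff, d_model))
W_router = rng.standard_normal((d_model, n_experts))

def moe_forward(x):
    """Route one token vector to its top-k experts and mix their outputs."""
    logits = x @ W_router
    chosen = np.argsort(logits)[-top_k:]   # indices of the top-k experts
    weights = np.exp(logits[chosen])
    weights /= weights.sum()               # softmax over the chosen experts
    # Only the chosen experts' weights are used for this token's FLOPs.
    out = sum(w * (np.maximum(x @ W_in[e], 0) @ W_out[e])
              for w, e in zip(weights, chosen))
    return out, chosen

out, chosen = moe_forward(rng.standard_normal(d_model))

per_expert = d_model * d_ff + d_ff * d_model
total_expert_params = n_experts * per_expert
active_expert_params = top_k * per_expert
print(f"active/total expert params: {active_expert_params / total_expert_params:.2f}")
```

With 2 of 8 experts active, only a quarter of the expert parameters touch each token, which is the same mechanism behind MoE-B's 6.6B-active-of-23.5B-total ratio.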

Technical specification

Context window
Not publicly disclosed
Parameters
23.5B total / 6.6B active
License
Databricks internal / research
Tools
No
Fine-tuning
No
Weights access
Closed
Hardware requirements
Internal Databricks research model; no public checkpoint available.
Last updated: May 4, 2026

Modalities

Input
Text
Output
Text
Code
