Robots Atlas
DBRX MoE-A
Research
Mar 27, 2024

DBRX MoE-A

LLM

Smallest DBRX family member: 7.7B total parameters, 2.2B active. Used internally by Databricks to study MoE training efficiency. Achieves 30.5% on the Databricks Gauntlet with 3.7x fewer FLOPs than MPT-7B.
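A quick arithmetic sketch of where the MoE efficiency comes from, using the parameter counts quoted above. In a mixture-of-experts model, only the routed "active" parameters participate in each forward pass, so per-token compute scales with the active count, not the total. (The quoted 3.7x FLOP figure is Databricks' own accounting and is not reproduced here.)

```python
# Figures from the model description above.
TOTAL_PARAMS = 7.7e9   # all experts combined
ACTIVE_PARAMS = 2.2e9  # parameters used per token

# Fraction of the model that actually runs per token.
active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS
print(f"Active fraction: {active_fraction:.0%}")  # ~29%
```

Because matrix-multiply FLOPs are roughly proportional to the parameters touched, a token passes through only about 29% of the model's weights, which is the source of the training-efficiency gains being studied.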

Technical specification

Context window
Not publicly disclosed
Parameters
7.7B total / 2.2B active
License
Databricks internal / research
Tools
No
Fine-tuning
No
Weights access
Closed
Hardware requirements
Internal Databricks research model; no public checkpoint available.
Last updated: May 4, 2026

Modalities

Input
Text
Output
Text
Code

Architecture and technologies
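No architectural details have been published for this internal model beyond it being a mixture-of-experts LLM. As background only, here is a minimal top-k expert-routing sketch, the mechanism that lets 2.2B of 7.7B parameters be active per token. All names and shapes are hypothetical; this is not DBRX's actual implementation.

```python
import numpy as np

def moe_layer(x, gate_w, experts, k=2):
    """Illustrative top-k MoE routing (hypothetical, not DBRX code).

    x: (d,) token hidden state
    gate_w: (d, n_experts) router weights
    experts: list of callables, each mapping (d,) -> (d,)
    """
    logits = x @ gate_w
    top_k = np.argsort(logits)[-k:]               # indices of the k best experts
    probs = np.exp(logits[top_k] - logits[top_k].max())
    probs /= probs.sum()                          # softmax over selected experts only
    # Only k experts run per token: this is why active parameter
    # count (and FLOPs) is far below the total parameter count.
    return sum(p * experts[i](x) for p, i in zip(probs, top_k))

# Toy usage: 4 experts of dimension 8, only 2 run per token.
rng = np.random.default_rng(0)
d, n = 8, 4
experts = [lambda v, W=rng.standard_normal((d, d)) / d: v @ W for _ in range(n)]
out = moe_layer(rng.standard_normal(d), rng.standard_normal((d, n)), experts)
print(out.shape)
```

The design point this illustrates: total capacity grows with the number of experts, while per-token cost is fixed by k, which is the trade-off a model like this is built to study.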

Applications