DBRX is a general-purpose large language model (LLM) built by Databricks' Mosaic research team. At its release, it outperformed existing open-source models on standard benchmarks. It uses a fine-grained Mixture-of-Experts (MoE) architecture with 132 billion total parameters, of which roughly 36 billion are active for any given input, and shows strong language understanding, programming, mathematical, and logical reasoning capabilities. DBRX aims to advance the development of high-quality open-source LLMs and to make it easy for enterprises to customize the model on their own data. Databricks offers enterprise users the ability to use DBRX interactively, to leverage its long-context capabilities in retrieval-augmented generation (RAG) systems, and to build customized DBRX models from their own data.
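To make the MoE idea concrete, here is a minimal, illustrative sketch of top-k expert routing in plain NumPy. This is not DBRX's implementation; it only shows the core mechanism by which an MoE layer activates a small subset of its parameters per input (DBRX routes each token to 4 of 16 experts). All names and shapes here are invented for the example.

```python
import numpy as np

def moe_forward(x, expert_weights, gate_weights, k=4):
    """Toy Mixture-of-Experts layer: score all experts with a gating
    network, keep only the top-k, and combine those experts' outputs
    weighted by a softmax over their gate scores. Each "expert" here
    is just a linear map; real experts are full feed-forward blocks."""
    scores = x @ gate_weights                    # one score per expert
    topk = np.argsort(scores)[-k:]               # indices of the k best experts
    w = np.exp(scores[topk] - scores[topk].max())
    w /= w.sum()                                 # softmax over the selected experts
    # Only the k selected experts do any work -- the rest stay inactive,
    # which is why active parameters are far fewer than total parameters.
    return sum(wi * (x @ expert_weights[i]) for wi, i in zip(w, topk))

# Hypothetical toy sizes: 16 experts, hidden dimension 8.
rng = np.random.default_rng(0)
x = rng.normal(size=8)
experts = rng.normal(size=(16, 8, 8))
gates = rng.normal(size=(8, 16))
y = moe_forward(x, experts, gates, k=4)
```

Because only `k` of the 16 expert matrices are touched per input, compute cost scales with active parameters rather than total parameters, which is the trade-off DBRX's architecture exploits.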