
December 17, 2025

Open Release of Grok-1

We are releasing the weights and architecture of our 314 billion parameter Mixture-of-Experts model Grok-1.


TL;DR

  • Grok-1 base model weights and network architecture released by xAI.
  • Grok-1 is a 314 billion parameter Mixture-of-Experts model.
  • The model was trained from scratch by xAI; pre-training concluded in October 2023.
  • The released model is a raw base checkpoint, not fine-tuned for specific tasks.
  • Weights and architecture are available under the Apache 2.0 license.
  • Instructions for use are available on GitHub.
  • A given token activates 25% of the model's weights (see the sketch after this list).
  • The model was trained using JAX and Rust.
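To make the 25% figure concrete: in a Mixture-of-Experts layer, a router scores each token against every expert and applies only a few of the expert feed-forward blocks, so most of the layer's weights sit idle for that token. The sketch below is a toy illustration in JAX (the framework the announcement names), not the actual Grok-1 implementation; the layer sizes are invented, and the 8-experts / 2-active-per-token configuration reflects Grok-1's published model card, which is roughly where the 25% active-weight figure comes from.

```python
# Minimal sketch of top-k expert routing in JAX (illustrative only, not Grok-1 code).
import jax
import jax.numpy as jnp

NUM_EXPERTS = 8   # total experts in the layer (per the Grok-1 model card)
TOP_K = 2         # experts activated per token (per the Grok-1 model card)
D_MODEL = 64      # toy hidden size for the sketch
D_FF = 256        # toy feed-forward size per expert

def init_params(key):
    k_router, k_w1, k_w2 = jax.random.split(key, 3)
    return {
        # Router projects each token onto one logit per expert.
        "router": jax.random.normal(k_router, (D_MODEL, NUM_EXPERTS)) * 0.02,
        # Stacked per-expert feed-forward weights.
        "w1": jax.random.normal(k_w1, (NUM_EXPERTS, D_MODEL, D_FF)) * 0.02,
        "w2": jax.random.normal(k_w2, (NUM_EXPERTS, D_FF, D_MODEL)) * 0.02,
    }

def moe_layer(params, x):
    """x: (num_tokens, D_MODEL) -> (num_tokens, D_MODEL)."""
    # 1. Route: score every expert for every token, keep only the top-k.
    logits = x @ params["router"]                     # (tokens, experts)
    top_vals, top_idx = jax.lax.top_k(logits, TOP_K)  # (tokens, TOP_K)
    gates = jax.nn.softmax(top_vals, axis=-1)         # mixing weights

    # 2. Apply only the selected experts to each token and mix their outputs.
    def per_token(token, idx, gate):
        def per_expert(i, g):
            h = jax.nn.gelu(token @ params["w1"][i])  # (D_FF,)
            return g * (h @ params["w2"][i])          # (D_MODEL,)
        outs = jax.vmap(per_expert)(idx, gate)        # (TOP_K, D_MODEL)
        return outs.sum(axis=0)

    return jax.vmap(per_token)(x, top_idx, gates)

key = jax.random.PRNGKey(0)
params = init_params(key)
tokens = jax.random.normal(key, (4, D_MODEL))
print(moe_layer(params, tokens).shape)  # (4, 64)
```

With 2 of 8 experts active, each token touches roughly a quarter of the expert weights in such a layer, which is the intuition behind the 25% active-weight figure; shared components like attention and embeddings are always active, so the exact fraction depends on how the full model is counted.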
