xAI open-sources Grok 2 model on Hugging Face

Elon Musk’s AI company xAI has released Grok 2 as open source on Hugging Face. The model, with an estimated 270 billion parameters, comes under a custom license that permits commercial use but bars using it to train or improve other AI models. Musk also committed to open-sourcing Grok 3 in about six months and reiterated xAI’s target of scaling compute to the equivalent of 50 million Nvidia H100 GPUs within five years.

August 25, 2025 (updated August 28, 2025)
Georg S. Kuklick

xAI has made its Grok 2 model openly available, describing it as the company’s top-performing system from last year. The release was confirmed by Elon Musk on X, who said the model would serve as a foundation for wider research and development. The announcement also included a commitment to release Grok 3 as open source within about six months.

Grok 2 is reported to have 270 billion parameters and is built as a mixture-of-experts model: only about 115 billion parameters are activated for each inference pass, with two of its eight expert modules engaged at a time. The model weights, split across 42 files and totaling around 500 gigabytes, are now hosted on Hugging Face.
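For readers who want to pull the checkpoint themselves, the following is a minimal sketch of how the weight files could be fetched with the huggingface_hub client. The repository id "xai-org/grok-2" and the local directory are assumptions for illustration; check the actual model card for the correct id, license terms, and disk requirements before downloading roughly 500 gigabytes.

```python
# Minimal sketch: fetching the released Grok 2 weights from Hugging Face.
# The repo id and local directory below are assumptions for illustration.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="xai-org/grok-2",      # assumed repository id -- verify on the model card
    local_dir="./grok-2-weights",  # destination; needs roughly 500 GB of free space
)
print(f"Checkpoint files downloaded to: {local_path}")
```

Serving a checkpoint of this size also requires a multi-GPU inference setup; the download alone does not make the model runnable on a single consumer machine.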

The release comes with a new “Grok 2 Community License Agreement.” The license allows both commercial and non-commercial use of the model while barring its use for training or improving other AI systems. This structure gives developers freedom to deploy the model in production settings while maintaining guardrails around competitive reuse.

xAI’s move contrasts with the closed distribution of large models from competitors such as OpenAI. By placing Grok 2 in the open, the company gives researchers, startups, and enterprises direct access to a model of this scale. The decision could accelerate experimentation in areas ranging from enterprise automation to consumer applications.

Alongside the release, Musk restated xAI’s long-term infrastructure goal. The company is targeting compute capacity equivalent to 50 million Nvidia H100 GPUs over the next five years. This ambition underscores the increasing competition among AI companies to secure hardware at scale and signals a strategy to support even larger models in the future.
