ByteDance releases Seed-OSS-36B with 512K token context

ByteDance has released Seed-OSS-36B, an open-source large language model with a native context length of 512,000 tokens. The model achieves leading open-source results in math, reasoning, and coding tasks while also supporting efficient deployment through quantization.

August 22, 2025 (updated August 28, 2025)
Georg S. Kuklick

ByteDance’s Seed Team published the Seed-OSS-36B family under the permissive Apache-2.0 license, making it available to developers and enterprises for research and commercial use. The models handle long documents natively, with context support extending to half a million tokens, so users can process entire books or large codebases without fine-tuning or external retrieval pipelines.

The instruct-tuned version, Seed-OSS-36B-Instruct, posts top scores among open-source models on reasoning, math, and coding benchmarks. According to ByteDance, it also performs competitively on general natural language understanding while remaining efficient in long-context scenarios.
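To illustrate what a basic long-context run looks like, here is a minimal sketch using the standard Hugging Face Transformers API. The Hub model ID, the input file, and the loading options are assumptions for illustration, not ByteDance's official quickstart; consult the model card for the recommended settings.

```python
# Minimal sketch: generating from Seed-OSS-36B-Instruct with Transformers.
# The model ID below is assumed; check the Hugging Face Hub for the official name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ByteDance-Seed/Seed-OSS-36B-Instruct"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # unquantized 36B weights need substantial GPU memory
    device_map="auto",
)

# A long document goes straight into the prompt; with a 512K-token window
# there is no need for chunking or a retrieval pipeline.
with open("annual_report.txt") as f:  # placeholder input file
    document = f.read()

messages = [{"role": "user", "content": f"Summarize the key findings:\n\n{document}"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```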

Deployment options include 4-bit and 8-bit quantization, supported by Hugging Face Transformers and vLLM. Quantization reduces memory requirements and makes the model accessible in a wider range of production environments.
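A sketch of a 4-bit load through the bitsandbytes integration in Transformers follows. The model ID and the specific quantization settings are illustrative assumptions; ByteDance may also publish pre-quantized checkpoints with their own recommended configuration.

```python
# Minimal sketch: 4-bit quantized loading with bitsandbytes via Transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "ByteDance-Seed/Seed-OSS-36B-Instruct"  # assumed Hub ID

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                   # 8-bit is available via load_in_8bit=True instead
    bnb_4bit_quant_type="nf4",           # illustrative choice of quantization scheme
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",                   # spreads layers across available GPUs
)
```

For serving, the same checkpoint can in principle be pointed at vLLM's OpenAI-compatible server (for example, `vllm serve ByteDance-Seed/Seed-OSS-36B-Instruct`), though the exact flags for long-context and quantized deployment depend on the vLLM version.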

The release positions ByteDance as a major contributor to the open-source LLM ecosystem. For researchers, it offers a high-capacity model for experiments in reasoning and long-context workflows. For enterprises, it reduces infrastructure hurdles while extending the scope of AI applications such as document analysis, legal review, and large-scale code reasoning.
