
ByteDance releases Seed-OSS-36B with 512K token context


ByteDance has released Seed-OSS-36B, an open-source large language model with a native context length of 512,000 tokens. The model achieves leading open-source results in math, reasoning, and coding tasks while also supporting efficient deployment through quantization.

Georg S. Kuklick

August 22, 2025

ByteDance’s Seed Team published the Seed-OSS-36B family under the permissive Apache-2.0 license, making it freely available to developers and enterprises for both research and commercial use. The models are trained to handle long documents natively, extending context support to half a million tokens. This capacity allows users to process entire books or large codebases without fine-tuning or external retrieval setups.
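
As a rough illustration of what such a long-context workload might look like, the following sketch serves the model with vLLM and a window sized to the figures quoted above. The Hugging Face repo ID, the exact context setting, and the input file are assumptions for demonstration, not an official ByteDance recipe.

```python
# Minimal sketch: feeding a very long document to Seed-OSS-36B-Instruct via vLLM.
# Assumptions: the repo ID and the ~512K max_model_len follow the figures cited
# in this article; the hardware must be able to hold the resulting KV cache.
from vllm import LLM, SamplingParams

llm = LLM(
    model="ByteDance-Seed/Seed-OSS-36B-Instruct",  # assumed Hugging Face repo ID
    max_model_len=524288,                          # ~512K-token context window
)

with open("entire_book.txt") as f:                 # hypothetical long input
    document = f.read()

params = SamplingParams(temperature=0.2, max_tokens=512)
outputs = llm.generate(
    [f"Summarize the key arguments of the following text:\n\n{document}"],
    params,
)
print(outputs[0].outputs[0].text)
```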

The instruct-tuned version, Seed-OSS-36B-Instruct, scores at the top of open-source leaderboards for reasoning, math, and code benchmarks. According to ByteDance, it also performs competitively on general natural language understanding while maintaining efficiency in long-context scenarios.

Deployment options include 4-bit and 8-bit quantization supported by Hugging Face Transformers and vLLM. These options reduce memory requirements and make the model accessible to a wider range of production environments.
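
For a sense of what 4-bit deployment might look like with Transformers and bitsandbytes, here is a minimal sketch; the repo ID, prompt, and generation settings are assumptions for illustration rather than ByteDance's documented setup, so check the official model card before relying on them.

```python
# Minimal sketch: loading Seed-OSS-36B-Instruct in 4-bit with Transformers + bitsandbytes.
# The repo ID below is assumed based on this article; verify against the official model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "ByteDance-Seed/Seed-OSS-36B-Instruct"  # assumed repo ID

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # 4-bit weights to cut memory use
    bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bf16 for output quality
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",                      # spread layers across available GPUs
)

inputs = tokenizer(
    "Explain the advantages of a 512K-token context window.",
    return_tensors="pt",
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```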

The release positions ByteDance as a major contributor to the open-source LLM ecosystem. For researchers, it offers a high-capacity model for experiments in reasoning and long-context workflows. For enterprises, it reduces infrastructure hurdles while extending the scope of AI applications such as document analysis, legal review, and large-scale code reasoning.
