Alibaba’s Qwen Team Unleashes Qwen-MT, Teases Wan 2.2 in Breakneck Model Rollout
Qwen‑MT brings instruction-tuned translation across 92 languages, built on Qwen3. With daily model drops and Wan 2.2 on the horizon, Qwen is pushing a relentless pace in open AI development.
Georg S. Kuklick
•
July 24, 2025

Alibaba's Qwen team has added yet another model to its expanding lineup. Qwen‑MT is a multilingual machine translation model based on the Qwen3 architecture, covering 92 languages with variants optimized for both instruction-following and real-time translation. The release includes a Turbo version, with reported results on standard translation benchmarks such as FLORES-101 surpassing established systems.
🚀 Introducing Qwen3-MT – our most powerful translation model yet!
Trained on trillions of multilingual tokens, it supports 92+ languages—covering 95%+ of the world’s population. 🌍✨
🔑 Why Qwen3-MT?
✅ Top-tier translation quality
✅ Customizable: terminology control, domain…
— Qwen (@Alibaba_Qwen) July 24, 2025
This is Alibaba’s third major model drop in a week, following Qwen3‑Coder and a Thinking variant. The team also confirmed that its next-gen video model, Wan 2.2, is coming soon. That puts Qwen on a model-a-day pace, signaling a rapid open-source counter to proprietary LLM platforms.
🚨Announcing the open-source release of Wan2.2.
Stay tuned.
— Tongyi Lab (@Ali_TongyiLab) July 25, 2025
Qwen‑MT reflects Alibaba’s push to build modular, interoperable models tuned for real-world use. For global businesses, this means access to an open translation engine with strong instruction-following and long-context capabilities. For developers and AI builders, Qwen’s pace is a signal: the frontier is open, and moving fast.
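For developers who want to kick the tires, Qwen models are typically served through Alibaba Cloud Model Studio's OpenAI-compatible endpoint. The sketch below shows what a Qwen‑MT Turbo call might look like; note that the endpoint URL, the `qwen-mt-turbo` model name, and the `translation_options` field are assumptions based on DashScope's published conventions, not details confirmed in this article — check the official docs before relying on them.

```python
import os

# Assumed endpoint and model identifier (verify against Alibaba Cloud
# Model Studio documentation before use).
BASE_URL = "https://dashscope-intl.aliyuncs.com/compatible-mode/v1"
MODEL = "qwen-mt-turbo"


def build_translation_request(text: str, source_lang: str, target_lang: str) -> dict:
    """Assemble a chat-completions payload for a translation call."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": text}],
        # Hypothetical DashScope-specific translation controls, passed
        # through the OpenAI SDK's extra_body escape hatch:
        "extra_body": {
            "translation_options": {
                "source_lang": source_lang,
                "target_lang": target_lang,
            }
        },
    }


if os.environ.get("DASHSCOPE_API_KEY"):  # only hit the network when a key is set
    from openai import OpenAI  # pip install openai

    client = OpenAI(api_key=os.environ["DASHSCOPE_API_KEY"], base_url=BASE_URL)
    req = build_translation_request("我看到这个视频后很感动。", "Chinese", "English")
    resp = client.chat.completions.create(**req)
    print(resp.choices[0].message.content)
```

Because the endpoint speaks the OpenAI wire format, the same payload-building code works against any compatible gateway; only the base URL and model name change.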