Alibaba Launches Qwen3-Coder: 480B Parameter Programming AI Model Leads a New Era of Open-Source Technology

July 27, 2025
Alibaba Qwen3
3 min

News Summary

The Alibaba Qwen team officially open-sourced its latest programming AI model, Qwen3-Coder-480B-A35B-Instruct, on July 22, 2025. It is a Mixture-of-Experts (MoE) model with 480 billion total parameters and 35 billion active parameters, achieving leading results among open-source models on code generation and agentic programming tasks, with performance comparable to Anthropic's Claude Sonnet 4.

Technological Breakthroughs Leading the Industry

Qwen3-Coder adopts a Mixture-of-Experts (MoE) architecture with 480B total parameters and 35B active parameters. It natively supports a 256K-token context window, which can be extended to 1M tokens via YaRN. This enables it to handle large codebases and complex, long-horizon programming tasks, giving developers powerful code understanding and generation capabilities.
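
For readers who want to try the long-context setup, the sketch below shows one plausible way to load the open weights with Hugging Face transformers and override the RoPE scaling to enable YaRN. The rope_scaling field names, the scaling factor of 4.0, and the 262,144-token base window are assumptions following common Hugging Face conventions rather than confirmed settings from the announcement; check the official model card before running.

```python
# Hedged sketch: loading Qwen3-Coder with a YaRN rope_scaling override.
# Field names and values are assumptions; consult the official model card.
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Qwen/Qwen3-Coder-480B-A35B-Instruct"

config = AutoConfig.from_pretrained(MODEL_ID)
config.rope_scaling = {
    "rope_type": "yarn",
    "factor": 4.0,                               # assumed: 256K x 4 ~= 1M tokens
    "original_max_position_embeddings": 262144,  # assumed native 256K window
}

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    config=config,
    torch_dtype="auto",   # use the checkpoint's native dtype
    device_map="auto",    # a 480B MoE needs a multi-GPU server
)
```

Serving frameworks generally expose equivalent RoPE-scaling options, so the same extension can usually be applied at deployment time as well.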

The model was pre-trained on 7.5 trillion tokens, of which roughly 70% is code, and it supports 92 programming languages, including Python, JavaScript, Java, and C++. Through large-scale pre-training and reinforcement learning, Qwen3-Coder performs strongly on core tasks such as code generation, code repair, and code completion; a brief usage sketch follows.
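
To make the code-repair capability concrete, here is a minimal, hedged sketch built on the standard transformers chat-template API. The model ID mirrors the name above, the buggy average function is invented for illustration, and actually running a 480B MoE requires a sizeable multi-GPU server.

```python
# Hedged sketch: asking Qwen3-Coder to repair a buggy function.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Qwen/Qwen3-Coder-480B-A35B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype="auto", device_map="auto"
)

# The snippet below has an off-by-one bug (it skips the first element).
buggy_code = (
    "def average(xs):\n"
    "    total = 0\n"
    "    for i in range(1, len(xs)):\n"
    "        total += xs[i]\n"
    "    return total / len(xs)\n"
)
messages = [
    {"role": "user",
     "content": f"Fix the bug in this function and explain the fix:\n\n{buggy_code}"}
]

input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```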

Record-Breaking Performance Benchmarks

Qwen3-Coder has demonstrated remarkable performance in multiple authoritative evaluations. On SWE-Bench Verified, the model surpassed several competitors, including Kimi K2 and GPT-4.1, showing exceptional capability at resolving real-world software issues.

On Agentic Coding, Agentic Browser-Use, and Agentic Tool-Use tasks, Qwen3-Coder achieves state-of-the-art results among open-source models, comparable to Claude Sonnet 4.
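
To illustrate what agentic tool use looks like in practice, the sketch below wires a single function-calling tool into an OpenAI-compatible chat-completions call. The base_url, the DASHSCOPE_API_KEY environment variable, the hosted model name, and the run_tests tool are all illustrative assumptions rather than details from this article.

```python
# Hedged sketch: agentic tool use via an OpenAI-compatible endpoint.
# Endpoint, credentials, model name, and the tool itself are assumptions.
import json
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DASHSCOPE_API_KEY"],
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
)

tools = [{
    "type": "function",
    "function": {
        "name": "run_tests",
        "description": "Run the project's test suite and return any failures.",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "Directory containing the tests."}
            },
            "required": ["path"],
        },
    },
}]

response = client.chat.completions.create(
    model="qwen3-coder-plus",  # assumed hosted model name
    messages=[{"role": "user", "content": "The tests under ./tests are failing. Investigate."}],
    tools=tools,
)

# If the model chooses to act, it returns structured tool calls instead of prose.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, json.loads(call.function.arguments))
```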

Comprehensive Developer Ecosystem Layout

To enhance the developer experience, Alibaba has simultaneously open-sourced an accompanying command-line tool, Qwen Code. Adapted from Google's Gemini CLI, it enhances parser and tool-calling support for the Qwen3-Coder series models, allowing the model's potential in agentic programming to be fully exploited.

Qwen3-Coder's API can also be used with mainstream development tools such as Claude Code and Cline. The model weights are open-sourced on the ModelScope community and Hugging Face, and the model will subsequently be integrated into Alibaba's AI programming product, Tongyi Lingma.
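
Because the weights are published on Hugging Face, a minimal sketch of fetching them with huggingface_hub follows. The repository ID is assumed to match the model name used in this article, and the full checkpoint is hundreds of gigabytes, so confirm the repo and available disk space first.

```python
# Hedged sketch: downloading the open weights from Hugging Face.
# The repo ID is assumed to match the model name cited in this article.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="Qwen/Qwen3-Coder-480B-A35B-Instruct",
    local_dir="./qwen3-coder-480b",  # point this at ample storage
)
print("Weights available at:", local_path)
```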

Global Impact and Market Outlook

Industry experts believe that the release of Qwen3-Coder will have a profound impact on the global AI programming landscape. Lian Jye Su, Chief Analyst at Omdia, stated: "Western tech leaders may find open-source programming models like Qwen3-Coder attractive due to their performance in various benchmarks."

Abhishek Sengupta, Practice Director at Everest Group, pointed out: "Advances in China's AI tech stack, including foundational models and GPU hardware, could lead to a reduction in overall AI costs. As the U.S. adopts a more restrictive approach to AI ecosystem sharing, this could open up global markets for Chinese alternatives."

Open-Source Strategy Leading the Future

Qwen3-Coder adopts the Apache 2.0 open-source license, ensuring global developers and researchers can freely use and improve these models. This open strategy reflects Alibaba's commitment to promoting global AI technology development and injects new vitality into the open-source AI community.

As AI programming tools continue to evolve, the release of Qwen3-Coder marks a new milestone for open-source large models in the fields of code generation and agentic programming. This not only provides global developers with a powerful programming assistant but also paves a new path for the democratization and popularization of AI technology.