Alibaba's Qwen Open-sources Qwen3-Coder-Next
Alibaba's Qwen team has announced the open-source release of Qwen3-Coder-Next, an efficient mixture-of-experts (MoE) model designed for programming agents and local development. The model has 80 billion total parameters, of which only 3 billion are activated per inference. Both the Qwen3-Coder-Next (Base) and Qwen3-Coder-Next (Instruct) versions have been officially open-sourced, with full support for research, evaluation, and commercial use.
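For readers who want to try the Instruct version, here is a minimal sketch of loading it with Hugging Face transformers. The repository id below is an assumption for illustration and is not confirmed by the announcement.

```python
# Minimal sketch: loading the Instruct checkpoint with Hugging Face transformers.
# The repo id is hypothetical; check the official Qwen release for the actual name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen3-Coder-Next-Instruct"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # spread layers across available GPUs / CPU
)

messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```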