One Year of DeepSeek: 113,000 Qwen Derivatives, 4x More Than Llama

One Year of the DeepSeek Moment: 3 Changes Proven by the Numbers

  • Over 113,000 Qwen derivative models — 4x more than Meta Llama (27,000)
  • DeepSeek ranks #1 in Hugging Face followers, Qwen at #4
  • Chinese AI organizations shift direction: “Open source is strategy”

What Happened?

Hugging Face released its ‘DeepSeek Moment’ one-year analysis report.[Hugging Face] This is the final part of a three-part series summarizing data on how China’s open source AI ecosystem has grown since DeepSeek’s emergence in January 2025.

Let’s start with the key metrics. Qwen (Alibaba) derivative models exceeded 113,000 as of mid-2025. Including repositories tagged with Qwen, the number surpasses 200,000.[Hugging Face] This is an overwhelming figure compared to Meta’s Llama (27,000) or DeepSeek (6,000).
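The headline multiples follow directly from those counts. A minimal sketch of the arithmetic, using the figures as quoted in the report:

```python
# Derivative-model counts cited in the article (mid-2025 snapshot).
counts = {"Qwen": 113_000, "Llama": 27_000, "DeepSeek": 6_000}

# Qwen's lead over each of the others, expressed as a multiple.
multiples = {
    name: round(counts["Qwen"] / n, 1)
    for name, n in counts.items()
    if name != "Qwen"
}
print(multiples)  # → {'Llama': 4.2, 'DeepSeek': 18.8}
```

So the oft-quoted “4x more than Llama” is actually closer to 4.2x, and the gap to DeepSeek’s own derivative count is nearly 19x.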

Why Is It Important?

Frankly, just a year ago many people dismissed Chinese AI as ‘copycat’ work. Not anymore.

ByteDance, DeepSeek, Tencent, and Qwen all rank near the top of Hugging Face’s popular-papers rankings. By follower count, DeepSeek is #1 and Qwen is #4. And taking Alibaba as a whole, its derivative model count matches Google and Meta combined.[Hugging Face]

What I personally find notable is Alibaba’s strategy. Qwen is structured as a ‘family,’ not a single flagship model. It supports various sizes, tasks, and modalities. Simply put, it means: “Use our models as general-purpose AI infrastructure.”

What Will Happen Next?

Hugging Face analyzed that “open source is a short-term dominance strategy for Chinese AI organizations.” The interpretation is that they aim for large-scale integration and deployment by sharing not only models but also papers and deployment infrastructure.

Within just one year, the numbers confirmed that the DeepSeek moment was not a one-time event. The center of gravity in the global AI open source ecosystem is shifting.

Frequently Asked Questions (FAQ)

Q: Are there more Qwen derivatives than Llama? Why?

A: Alibaba released Qwen in many sizes and modalities, which widened its range of applications, and Chinese developers frequently use it for local deployment. The strategy of continuously updating the model lineup on Hugging Face has also been effective.

Q: Is DeepSeek still important?

A: Yes. DeepSeek has the most followers on Hugging Face. However, it trails Qwen in derivative model count. DeepSeek has strengths in papers and research contributions, while Qwen focuses on ecosystem expansion.

Q: What does this mean for developers?

A: Qwen-based models are strengthening multilingual support. Because the weights are openly released, local deployment and fine-tuning are possible without licensing fees, making it an attractive environment for low-cost experimentation. However, license terms vary by model, so check before use.


If this article was useful, subscribe to AI Digester.
