DeepSeek’s Year: 113,000 Qwen Derivative Models, 4 Times More Than Llama

One Year of the ‘DeepSeek Moment’: 3 Changes Proven by the Numbers

  • Over 113,000 Qwen-derived models, roughly 4x Meta’s Llama (27,000)
  • DeepSeek ranks #1 in Hugging Face followers; Qwen ranks #4
  • Chinese AI organizations have shifted course: open source is now a strategy

What Happened?

Hugging Face published an analysis report for the first anniversary of the ‘DeepSeek Moment’.[Hugging Face] It is the final installment of a three-part series that uses data to chart how China’s open-source AI ecosystem has grown since DeepSeek emerged in January 2025.

Let’s start with the key figures. The number of derivative models based on Alibaba’s Qwen exceeded 113,000 as of mid-2025; counting every repository tagged with Qwen, the figure tops 200,000.[Hugging Face] That dwarfs Meta’s Llama (27,000) and DeepSeek (6,000).
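For the curious, counts like these can be approximated from the Hub’s own metadata. Here is a minimal sketch using the huggingface_hub client; the repo ids are illustrative examples, and the base_model tag convention it leans on is an assumption on my part (the report’s own methodology may aggregate differently).

```python
from huggingface_hub import HfApi

api = HfApi()

def count_derivatives(base_repo: str) -> int:
    """Count Hub repos that declare `base_repo` as their base model."""
    # Fine-tunes, adapters, and quantizations are indexed under tags like
    # "base_model:<repo_id>" (assumed convention; may not match the report).
    # This paginates through the full listing, so large families take a while.
    return sum(1 for _ in api.list_models(filter=f"base_model:{base_repo}"))

# Hypothetical single-repo examples; the 113,000 figure spans the whole family.
for repo in ("Qwen/Qwen2.5-7B-Instruct", "meta-llama/Llama-3.1-8B-Instruct"):
    print(repo, count_derivatives(repo))
```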

Why is it important?

Frankly, just a year ago many people dismissed Chinese AI as ‘copycat’ work. Not anymore.

ByteDance, DeepSeek, Tencent, and Qwen sit at the top of Hugging Face’s popular-paper rankings. By follower count, DeepSeek ranks first and Qwen fourth. Taking Alibaba as a whole, its number of derived models is comparable to Google and Meta combined.[Hugging Face]

Personally, I’m paying attention to Alibaba’s strategy. Qwen is not a single flagship model but a ‘family’ that spans many sizes, tasks, and modalities. Simply put, the message is: “Use our models as general-purpose AI infrastructure.”
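To make the ‘family’ point concrete, here is a minimal sketch of what that looks like in practice with the transformers library: the same few lines load any member of the line-up, and the repo id is the only thing you swap. The repo name below is just one example size, not a recommendation.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Any family member loads the same way; swap the repo id for other
# sizes or variants. Example repo id only.
repo = "Qwen/Qwen2.5-7B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")  # device_map needs `accelerate`

prompt = "In one sentence: what does 'open source as strategy' mean?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```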

What will happen in the future?

Hugging Face’s analysis is that “open source is a short-term dominance strategy for Chinese AI organizations.” The interpretation: by sharing not only models but also papers and distribution infrastructure, they aim for large-scale integration and deployment.

In just one year, the numbers have confirmed that the DeepSeek Moment was not a one-off event. The center of gravity of the global open-source AI ecosystem is shifting.

Frequently Asked Questions (FAQ)

Q: Why are there more Qwen-derived models than Llama ones?

A: Alibaba releases Qwen in many sizes and modalities, which widens its coverage. Chinese developers in particular favor it for local deployment. The strategy of continuously shipping updates to Hugging Face across the whole model line-up has also paid off.

Q: Is DeepSeek still important?

A: Yes. DeepSeek is the organization with the most followers on Hugging Face. However, it lags behind Qwen in the number of derived models. DeepSeek has strengths in papers and research contributions, while Qwen is focused on ecosystem expansion.

Q: What does this mean for Korean developers?

A: Qwen-based models are strengthening their Korean language support. Because they are open source, you are free to deploy them locally and fine-tune them, which makes for a low-cost environment to experiment in. Just note that license terms vary by model, so check them first; a quick way to do that is sketched below.
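As a sketch of that last caveat: the license can usually be read straight from the Hub metadata before you deploy anything. The repo id below is illustrative, and I’m assuming the common “license:*” tag convention.

```python
from huggingface_hub import model_info

# Example repo id; most models expose their license as a "license:*" tag.
info = model_info("Qwen/Qwen2.5-7B-Instruct")
licenses = [t.split(":", 1)[1] for t in info.tags if t.startswith("license:")]
print(licenses or "No license tag found; read the model card before deploying.")
```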


If this article was helpful, please subscribe to AI Digester.
