One Year After the DeepSeek Moment: 113,000 Qwen-Derived Models, 4x Llama

One Year After the DeepSeek Moment: Three Changes the Numbers Prove

  • Over 113,000 Qwen-derived models – 4 times that of Meta Llama (27,000)
  • DeepSeek ranks 1st and Qwen 4th by follower count on Hugging Face
  • Chinese AI organizations shift direction: “Open source is the strategy”

What happened?

Hugging Face released an analysis report marking the first anniversary of the ‘DeepSeek Moment’.[Hugging Face] It is the final installment of a three-part series tracing, with data, how the Chinese open-source AI ecosystem has grown since DeepSeek emerged in January 2025.

Let’s look at the key figures. The number of derivative models based on Qwen (Alibaba) exceeded 113,000 as of mid-2025; counting repositories tagged with Qwen, the figure tops 200,000.[Hugging Face] That dwarfs Meta’s Llama (27,000) and DeepSeek (6,000).
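If you want to sanity-check such figures yourself, the same metadata is queryable through the `huggingface_hub` library. Below is a minimal sketch, assuming (as Hugging Face’s own counts do) that derivatives declare their origin via `base_model:` tag metadata; the base-model id is just one example from the Qwen family, not the whole count:

```python
from huggingface_hub import HfApi

api = HfApi()

# Models fine-tuned, quantized, or merged from a Qwen checkpoint carry
# "base_model:<repo_id>" tag metadata, and list_models can filter on tags.
derivatives = api.list_models(
    filter="base_model:Qwen/Qwen2.5-7B-Instruct",  # one base model, as an example
    limit=20,  # small sample; remove to iterate over everything
)

for m in derivatives:
    print(m.id)
```

Summing queries like this across every Qwen base checkpoint is roughly how a six-digit derivative count is reached.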

Why is it important?

Frankly, even a year ago, Chinese AI was widely dismissed as ‘copycat’ work. Not anymore.

ByteDance, DeepSeek, Tencent, and Qwen all rank high among Hugging Face’s trending papers. By follower count, DeepSeek is 1st and Qwen 4th. Taking Alibaba as a whole, its derivative-model count is comparable to Google’s and Meta’s combined.[Hugging Face]

Personally, I’m watching Alibaba’s strategy. It structured Qwen not as a single flagship model but as a ‘family’ spanning various sizes, tasks, and modalities. In plain terms, the message is: “use our models as general-purpose AI infrastructure.”

What will happen in the future?

Hugging Face’s analysis is that “open source is a short-term dominance strategy for Chinese AI organizations.” The reading: by sharing not only models but also papers and deployment infrastructure, they are aiming for large-scale integration and deployment.

A year on, the numbers confirm that the DeepSeek Moment was not a one-off event. The center of gravity of the global open-source AI ecosystem is shifting.

Frequently Asked Questions (FAQ)

Q: Why are there more Qwen-derived models than Llama?

A: Alibaba released Qwen in a wide range of sizes and modalities, broadening where it can be applied. Chinese developers in particular use it heavily for local deployment. The strategy of continuously shipping updates to both Hugging Face and ModelScope also paid off.

Q: Is DeepSeek still important?

A: Yes. DeepSeek is still the organization with the most followers on Hugging Face, though it trails Qwen in derivative-model count. DeepSeek’s strength is papers and research contributions; Qwen’s focus is ecosystem expansion.

Q: What does it mean for Korean developers?

A: Qwen-based models are steadily improving their Korean-language support. Because they are open source, local deployment and fine-tuning are unrestricted, which makes low-cost experimentation easy. Note, however, that license terms vary by model, so check them before use.
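As an illustration, running a small Qwen model locally takes only a few lines with the `transformers` library. A minimal sketch, assuming a machine that can hold a 0.5B-parameter checkpoint; the model id and prompt are examples only:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# A small member of the Qwen family, chosen to fit on modest hardware.
model_id = "Qwen/Qwen2.5-0.5B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # use the dtype the checkpoint was saved in
    device_map="auto",   # GPU if available, otherwise CPU
)

# Instruct-tuned Qwen models expect the chat template shipped with the tokenizer.
messages = [{"role": "user", "content": "Explain the DeepSeek Moment in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[1]:], skip_special_tokens=True))
```

The same checkpoint can then be fine-tuned with standard tooling, subject to the license of the specific model.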


If you found this article helpful, please subscribe to AI Digester.
