OpenAI reveals Sora feed philosophy: “We do not allow doomscrolling”
- Core principle: creation first, consumption minimized
- A new type of recommendation system that can be adjusted with natural language
- Safety enforced at the creation stage, the opposite of TikTok’s strategy
What happened?
OpenAI has officially announced the design philosophy behind the recommendation feed of Sora, its AI video creation app.[OpenAI] The core message is clear: “This is a platform for creation, not doomscrolling.”
While TikTok has faced controversy for optimizing watch time, OpenAI chose the opposite direction. Instead of maximizing feed dwell time, they prioritize showing content most likely to inspire users to create their own videos.[TechCrunch]
Why is it important?
Honestly, this is quite an important experiment in social media history. Existing social platforms maximize dwell time to generate ad revenue. The longer users stay, the more money they make. This has resulted in addictive algorithms and mental health issues.
OpenAI already generates revenue through subscription models (ChatGPT Plus). Since they don’t rely on ads, they don’t need to “keep users hooked.” Simply put, because the business model is different, the feed design can be different too.
Personally, I’m curious whether this will actually work. Can a feed that “encourages creation” really keep users engaged? Or will it eventually revert to dwell time optimization?
Four Principles of the Sora Feed
- Creative optimization: The feed encourages participation, not consumption. The goal is active creation, not passive scrolling.[Digital Watch]
- User control: The algorithm can be adjusted with natural language; instructions like “Show me only comedy today” work directly (see the sketch after this list).
- Connection priority: Content from people you follow and people you know is shown before viral global content.
- Safety-freedom balance: Since all content is generated within Sora, harmful content is blocked at the creation stage.
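OpenAI has not published how the natural-language control works internally, but conceptually it amounts to turning a user’s instruction into a re-ranking signal over candidate videos. Below is a minimal Python sketch under that assumption; the Video class, the keyword matcher standing in for the language-model step, and the boost weight are hypothetical illustrations, not Sora’s implementation.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    tags: set[str]      # e.g. {"comedy", "sci-fi"}
    base_score: float   # ranking score from engagement/creation signals

def parse_instruction(instruction: str) -> set[str]:
    """Toy stand-in for the language-model step: map a natural-language
    request onto known content tags. A real system would ask an LLM."""
    known_tags = {"comedy", "sci-fi", "animation", "sports"}
    text = instruction.lower()
    return {tag for tag in known_tags if tag in text}

def rerank(feed: list[Video], instruction: str) -> list[Video]:
    """Boost videos that match the requested tags; everything else keeps
    its base score, so the instruction steers the feed without hard-filtering."""
    wanted = parse_instruction(instruction)
    def score(v: Video) -> float:
        return v.base_score + (2.0 if v.tags & wanted else 0.0)
    return sorted(feed, key=score, reverse=True)

feed = [
    Video("City timelapse", {"animation"}, 0.80),
    Video("Moon landing remix", {"sci-fi"}, 0.71),
    Video("Cat stand-up set", {"comedy"}, 0.64),
]
for v in rerank(feed, "Show me only comedy today"):
    print(v.title)  # the comedy video surfaces first
```

With the comedy instruction, the lowest-scored video jumps to the top; with no matching instruction, the feed falls back to its base ranking.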
How is it different technically?
Unlike conventional recommendation systems, Sora’s feed is built on OpenAI’s own language models. The key differentiator is “natural language instructions”: users can tell the algorithm directly, in words, what kind of content they want.[TechCrunch]
Sora’s personalization signals include activity in the app (likes, comments, remixes), IP-based location, ChatGPT usage history (which can be turned off), and creator follower count. Safety signals are also factored in to suppress exposure to harmful content.
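As a rough illustration of how such signals might be combined, here is a hedged Python sketch: the signal names follow the list above, but the weights, the harm threshold, and the gating logic are assumptions for illustration only, not OpenAI’s published formula.

```python
from dataclasses import dataclass

@dataclass
class Signals:
    # Personalization signals named above
    likes: int
    comments: int
    remixes: int
    follows_creator: bool
    location_match: bool      # IP-based locale match
    chatgpt_affinity: float   # 0..1, from optional ChatGPT history
    # Safety signal, assumed to come from creation-time moderation
    harm_score: float         # 0..1, higher = more likely harmful

# Hypothetical weights; OpenAI has not published these.
WEIGHTS = {"likes": 1.0, "comments": 2.0, "remixes": 3.0,
           "follow": 4.0, "location": 0.5, "chatgpt": 2.0}
HARM_THRESHOLD = 0.8

def feed_score(s: Signals) -> float:
    """Combine personalization signals, then gate on the safety signal."""
    if s.harm_score >= HARM_THRESHOLD:
        return 0.0  # suppress exposure entirely
    engagement = (WEIGHTS["likes"] * s.likes
                  + WEIGHTS["comments"] * s.comments
                  + WEIGHTS["remixes"] * s.remixes
                  + WEIGHTS["follow"] * s.follows_creator
                  + WEIGHTS["location"] * s.location_match
                  + WEIGHTS["chatgpt"] * s.chatgpt_affinity)
    # Down-weight borderline content rather than hard-blocking it.
    return engagement * (1.0 - s.harm_score)
```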
What will happen in the future?
Within 48 hours of launch, the Sora app reached #1 on the App Store: 56,000 downloads on day one, and roughly three times that on day two.[TechCrunch] The initial response was enthusiastic.
But the question is sustainability. As OpenAI acknowledges, this feed is a “living system” that will keep changing based on user feedback. What happens when the creation-first philosophy collides with actual user behavior? We’ll have to wait and see.
Frequently Asked Questions (FAQ)
Q: How is Sora Feed different from TikTok?
A: TikTok’s goal is to maximize watch time and keep users in the app. Sora does the opposite: it prioritizes content most likely to inspire users to create their own videos. The design focuses on creation rather than consumption.
Q: What does it mean to adjust the algorithm with natural language?
A: Existing apps recommend based only on behavioral data such as likes and watch time. Sora lets users type instructions like “Show me only sci-fi videos today,” and the algorithm adjusts accordingly.
Q: Are there parental protection features?
A: Yes. Using ChatGPT parental control features, you can turn off feed personalization or limit continuous scrolling. Teen accounts have a default daily limit on videos they can create, and the Cameo feature (videos featuring other people) also has stricter permissions.
If you found this article useful, subscribe to AI Digester.
Reference Resources
- The Sora feed philosophy – OpenAI (2026-02-03)
- How OpenAI designs Sora recommendation feed – Digital Watch Observatory (2026-02-03)
- OpenAI is launching the Sora app – TechCrunch (2025-09-30)