From chatbot to AI operating system, OpenAI is redefining how people interact with AI through a comprehensive ecosystem transformation.
On October 7, 2025, Beijing time, OpenAI held its third annual developer conference, DevDay 2025, at San Francisco's Yerba Buena Center for the Arts. CEO Sam Altman opened the event with the themes of "Driving Future Development" and "Making It Easier to Build with AI," presenting a blueprint for AI's evolution from tool to platform.
The conference took place against a backdrop of staggering statistics: ChatGPT's weekly active users have reached 800 million, the developer base has doubled from 2 million to 4 million, and API throughput has soared from 300 million to 60 billion tokens per minute.
These figures not only set a historical high for AI applications but also mark a turning point where AI technology is moving from the laboratory to the masses.
OpenAI's announcements at this developer conference can be summarized into four core areas, covering a full-chain layout from application ecosystem to development tools.
The opening of the ChatGPT application ecosystem is the primary highlight. Developers can now use the new Apps SDK to build real applications inside ChatGPT. These applications are interactive, adaptive, and personalized, letting users access full application functionality without leaving the conversation.
In live demonstrations, applications from Coursera, Canva, and Zillow were seamlessly integrated into the conversation. Users can directly play course videos and ask questions in real-time within ChatGPT, or generate business posters based on the conversation content.
The introduction of AgentKit lowers the barrier to developing AI agents. This complete set of building blocks integrates everything needed to build, deploy, and optimize agent workflows. An OpenAI engineer built and deployed an information agent named "Ask Froggie" for DevDay in less than 8 minutes onsite, demonstrating a qualitative leap in development efficiency.
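AgentKit itself is a proprietary product, but the core pattern it packages, a loop in which a model decides when to call tools and composes the result into a reply, can be sketched in plain Python. Everything below is illustrative: the tool, its routing logic, and the "Ask Froggie" behavior are invented for this sketch and are not part of the real AgentKit API.

```python
# Toy sketch of the tool-calling loop an agent framework like AgentKit
# automates. Tool names and keyword routing are illustrative only; a
# real agent lets the model pick the tool and write the final answer.

def get_schedule(query: str) -> str:
    """Stand-in 'tool': a canned DevDay schedule lookup."""
    return "Keynote at 10:00, Apps SDK session at 13:00."

TOOLS = {"get_schedule": get_schedule}

def ask_froggie(question: str) -> str:
    """Minimal agent step: route by keyword, call the tool, wrap the result."""
    if "schedule" in question.lower():
        result = TOOLS["get_schedule"](question)
        return f"Here is what I found: {result}"
    return "I can only answer schedule questions in this sketch."

print(ask_froggie("What is the DevDay schedule?"))
```

What AgentKit adds on top of a loop like this, per the keynote, is the visual design, deployment, and optimization tooling around it, which is why an agent could be stood up live in under 8 minutes.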
The Codex programming agent has officially graduated from research preview to general availability. Running on the GPT-5 Codex model, Codex now writes almost all new code internally at OpenAI, and engineers have seen a 70% increase in completed pull requests per week.
Model and API updates are equally noteworthy. GPT-5 Pro, billed as OpenAI's "most intelligent model" to date, is now available to all developers. Meanwhile, the GPT Realtime Mini voice model, roughly 70% cheaper, and a preview of the Sora 2 API give developers more choices.
OpenAI demonstrated not just a technical upgrade but a clear strategic transformation—from a model provider to an AI operating system.
The Apps in ChatGPT feature replaces the earlier plugin model outright, letting third-party applications render full interactive interfaces directly inside ChatGPT. The model resembles Apple's App Store, except that users complete every service within the conversation, with no installation or app switching.
Altman emphasized in his speech that AI has evolved from a "system you can ask anything" to a "system you can ask to do anything for you." The core of this shift is that ChatGPT is evolving from a chatbot into a super gateway for the AI era.
The Model Context Protocol (MCP) promoted by OpenAI provides the technical foundation for this ecosystem, while the upcoming Agentic Commerce Protocol payment protocol offers a commercialization path for developers, forming a complete closed loop of "development-review-distribution-monetization."
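MCP messages are JSON-RPC 2.0 under the hood. The snippet below constructs a minimal `tools/call` request of the kind a ChatGPT app host might send to a connected MCP server; the tool name `search_listings` and its arguments are hypothetical, chosen to echo the Zillow-style demo, and are not a real tool.

```python
# Build and serialize a minimal MCP "tools/call" request (JSON-RPC 2.0).
# The tool name and arguments are hypothetical examples.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_listings",  # hypothetical tool exposed by a server
        "arguments": {"city": "Seattle", "max_price": 750000},
    },
}

wire = json.dumps(request)
print(wire)
```

Because the protocol is this plain, any server that speaks JSON-RPC over the MCP transports can plug into the ecosystem, which is what makes a standardized development-review-distribution-monetization loop feasible.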
To build a thriving ecosystem, OpenAI has significantly lowered the barrier to developing AI applications.
The Agent Builder within AgentKit is likened to "the Canva for building agents," allowing developers to design workflows via a drag-and-drop visual canvas. Real-world cases prove its value: fintech company Ramp compressed development work that originally took months down to a few hours using Agent Builder.
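What a drag-and-drop canvas like Agent Builder ultimately produces is a directed graph of steps with routing between them. The toy runner below executes such a graph; the node names, the refund-triage scenario, and the routing rules are all invented for illustration and do not reflect Agent Builder's actual output format.

```python
# Toy workflow-graph runner, echoing the node-and-edge pipelines a
# visual builder produces. All node names and logic are invented.

def classify(state):
    state["intent"] = "refund" if "refund" in state["input"].lower() else "other"
    return state

def handle_refund(state):
    state["reply"] = "Routing you to the refunds agent."
    return state

def handle_other(state):
    state["reply"] = "Routing you to general support."
    return state

# Each node maps to (step function, router picking the next node or None).
WORKFLOW = {
    "classify": (classify, lambda s: "refund" if s["intent"] == "refund" else "other"),
    "refund": (handle_refund, lambda s: None),
    "other": (handle_other, lambda s: None),
}

def run(workflow, start, state):
    node = start
    while node is not None:
        fn, router = workflow[node]
        state = fn(state)
        node = router(state)
    return state

out = run(WORKFLOW, "classify", {"input": "I want a refund"})
print(out["reply"])  # → Routing you to the refunds agent.
```

The compression Ramp reported, months of work down to hours, comes from not hand-writing this plumbing at all: the canvas generates the graph, and the platform handles deployment and evaluation around it.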
For enterprise customers, OpenAI provides the Connector Registry, which manages all data connections across ChatGPT and the API in one place. This standardized approach addresses long-standing enterprise-integration pain points, paving the way for large-scale AI deployment in business scenarios.
DevDay 2025 not only shocked the tech world but also sent ripples through Wall Street. During the conference, OpenAI unexpectedly became a "kingmaker" in the U.S. stock market.
When OpenAI announced a cooperation agreement with AMD worth tens of billions of dollars, under which it could acquire roughly a 10% equity stake in AMD, AMD's stock surged more than 34% at one point, adding nearly $80 billion in market value in a single day.
As one of the first application partners, Figma's stock price rose sharply by 7.4%, while Expedia, Booking.com, and Zillow also saw significant short-term stock price increases.
This market reaction confirms the immense influence OpenAI now wields—its strategic decisions can instantly create or evaporate tens of billions of dollars in market value.
Facing rivals such as Anthropic, Google, and Meta, OpenAI's comprehensive platform strategy has raised the industry's barriers to entry. The large-model landscape is being reshaped as the contest shifts from algorithms to ecosystems.
All the blueprints outlined by OpenAI are built on an extreme thirst for computing power. Company executives publicly acknowledged that the entire industry is facing "absolute computing power scarcity."
To meet this demand, OpenAI has launched a multi-pronged chip strategy: a hundred-billion-dollar alliance with NVIDIA, the cooperation with AMD, and a partnership with Broadcom to invest $10 billion in self-developed AI chips. Together, these moves signal that OpenAI has transformed from an asset-light tech company into an asset-heavy giant making industrial-scale investments.
Counting the "Stargate" supercomputing project with cloud vendors such as Oracle, OpenAI's total infrastructure commitments now run into the trillions of dollars, completely reshaping the threshold for competing in AI.
Meanwhile, industry experts point out that the main battlefield of AI over the next decade will shift from training compute to inference compute. Training is like spending years at a top university accumulating knowledge; inference is like graduating and going to work to solve practical problems. Scenarios such as ChatGPT's hundreds of millions of daily user queries, factory quality inspection, and medical image analysis all depend on efficient, low-cost inference compute, and that demand may grow 10,000-fold in the future.
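A back-of-the-envelope calculation shows why inference economics dominate at this scale. Every figure below is a hypothetical assumption chosen for illustration, not an OpenAI number.

```python
# Hypothetical inference-cost arithmetic; all inputs are assumptions.
queries_per_day = 500_000_000    # assumed daily user queries
tokens_per_query = 1_000         # assumed prompt + completion tokens
cost_per_million_tokens = 0.50   # assumed blended $ per 1M tokens

daily_tokens = queries_per_day * tokens_per_query
daily_cost = daily_tokens / 1_000_000 * cost_per_million_tokens
print(f"{daily_tokens:,} tokens/day -> ${daily_cost:,.0f}/day")
# → 500,000,000,000 tokens/day -> $250,000/day
```

Even under these modest assumed unit prices, inference spend recurs every day and scales linearly with usage, which is why a 10,000-fold rise in demand would make per-token efficiency the decisive cost lever.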
Facing such enormous computing demands, cost-effective computing solutions will be key to bringing AI into industries of every kind.
OpenAI has shifted the axis of competition from algorithmic models to entire ecosystems. As development barriers fall, AI applications will grow explosively, but startups will face a harsher environment: they must find differentiated positioning within the ecosystem OpenAI has built.
Against the backdrop of computing power scarcity, the initiative for AI innovation is gradually concentrating in the hands of a few players capable of building trillion-dollar infrastructure.