The chipmaker bets big on infrastructure to scale next-generation intelligence globally.
San Francisco, September 2025. Nvidia has entered a landmark agreement to invest 100 billion dollars (roughly 85 billion euros) in OpenAI, channeling capital toward the deployment of at least 10 gigawatts of AI data-center capacity beginning in 2026. The partnership aims to expand computing power dramatically, aligning Nvidia’s hardware leadership with OpenAI’s software and model ambitions.
Under the agreement, the first gigawatt of Nvidia systems is expected to be installed in the latter half of 2026. The two organizations will finalize deployment details over the coming months, including site selection, energy provision, cooling architecture, and integration strategies. The announcement marked a deepening of their collaboration, extending beyond previous joint projects with partners such as Microsoft, Oracle, SoftBank, and others in the AI infrastructure sphere.
As part of the deal, Nvidia also committed an additional 5 billion dollars to an investment in Intel, whose chip division has struggled to keep pace with AI demand. Nvidia frames the dual investment as reinforcing its long-term dominance of the AI hardware stack and supporting broader ecosystem stability.
For OpenAI, the influx of resources facilitates scaling with fewer bottlenecks. Its leadership highlighted that without a massive increase in infrastructure capacity, future model training and deployment would remain constrained. With access to vast dedicated compute, OpenAI anticipates shorter training cycles, expansion of model size, and an ability to serve more users with lower latency.
The scale of the investment elevates the Nvidia–OpenAI alliance from a vendor-customer relationship into a structural pact in the AI domain. By anchoring compute infrastructure to software leadership, the collaboration reduces dependency on external cloud providers and integrates intelligence with specialized hardware in a more controlled environment.
Still, several technical and operational challenges lie ahead. Data centers at this scale demand vast energy resources, sustainable cooling solutions, grid resilience, and land with favorable access to renewable power. Managing e-waste, resource procurement and logistics, regulatory approval, and security also pose significant risks. Even with deep pockets, deploying ten gigawatts of capacity will test the limits of supply chains and capital deployment.
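For a sense of the scale involved, a back-of-the-envelope calculation can translate the announced 10 gigawatts of capacity into annual electricity demand. The utilization figure below is a hypothetical assumption for illustration, not a number from the announcement; only the 10 GW target comes from the deal itself.

```python
# Rough sketch of the energy implied by the announced build-out.
# CAPACITY_GW is the announced target; UTILIZATION is an assumed
# average load factor chosen purely for illustration.

CAPACITY_GW = 10.0        # announced deployment target
UTILIZATION = 0.7         # assumed average load factor (hypothetical)
HOURS_PER_YEAR = 8760

def annual_energy_twh(capacity_gw: float, utilization: float) -> float:
    """Annual electricity draw in terawatt-hours at a given load factor."""
    return capacity_gw * utilization * HOURS_PER_YEAR / 1000

energy = annual_energy_twh(CAPACITY_GW, UTILIZATION)
print(f"~{energy:.0f} TWh per year at {UTILIZATION:.0%} utilization")
```

At that assumed load factor the fleet would draw on the order of 60 TWh per year, comparable to the annual electricity consumption of a mid-sized country, which underscores why siting, grid resilience, and access to renewable power dominate the planning discussion.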
The strategic implications for AI markets are substantial. With this backing, OpenAI is positioned to outpace rivals that rely on third-party infrastructure. Yet it also commits Nvidia to maintaining leading performance, efficiency, and reliability under scrutiny from competitors and regulators alike. The partnership cements a tightly integrated hardware-software model that could define future competition in generative AI.
Globally, the move is being watched as an inflection point: tech capitals, power sectors, and governments will likely vie to host these megacenters, offering incentives to attract AI investments. Countries with clean energy capacity or favorable regulatory regimes may gain economic advantage, turning data infrastructure into geopolitical leverage.
Critics point out that concentrating AI infrastructure in the hands of a few powerful actors risks reinforcing centralization, increasing barriers to entry, and potentially undermining open competition. Advocates respond that scale is necessary to push AI further and that responsible deployment and regulation can mitigate concentration risks.
From a capital markets perspective, the investment may reshape valuations across semiconductors, cloud, and AI specialist firms. Nvidia’s bet also signals confidence in sustained growth in demand for AI compute, pushing more capital toward data infrastructure, chip startups, power innovations, and adjacent sectors like semiconductor materials.
Regardless of challenges, Nvidia’s investment heralds a structural acceleration in the design, deployment, and governance of AI infrastructure. The real test will come when these megacenters go live and reveal which architectures, energy models, and software integrations dominate at scale.