OpenAI has struck one of the largest computing deals in recent AI history, agreeing to purchase up to 750 megawatts of computing capacity from chipmaker Cerebras over the next three years. The agreement, valued at more than $10 billion over its duration, reflects the escalating demand for high-performance infrastructure to power advanced AI models.

The ChatGPT creator will use Cerebras’ specialized systems to enhance the speed and responsiveness of its popular chatbot, part of a broader trend in which AI firms are investing heavily in cutting-edge hardware to maintain a competitive edge. Cerebras, a startup founded in 2015 and known for its wafer-scale engines, will provide cloud services built on its chips, focusing on inference and reasoning workloads that require substantial computational throughput.

According to Cerebras CEO Andrew Feldman, talks began last August after the company demonstrated that OpenAI’s open-source models ran more efficiently on Cerebras chips than on conventional GPUs. After several months of negotiation, the partnership was finalized, with capacity scheduled to come online in multiple phases through 2028. Under the deal, Cerebras will build or lease data centers equipped with its chips, while OpenAI will pay to access the cloud-based services.

“Integrating Cerebras into our mix of compute solutions is all about making our AI respond much faster,” OpenAI said in a post on its website.

The move highlights the growing appetite for AI computing resources as companies race to develop more capable reasoning models and applications. Inference—the process by which models generate responses to queries—has become a critical bottleneck for scaling AI services, making high-performance chips increasingly essential.

For Cerebras, the partnership also represents a strategic step toward its public offering, helping diversify its revenue streams beyond its relationship with UAE-based tech firm G42, a key investor and client. The company, which previously attempted to go public in 2024 before postponing the offering, is reportedly preparing for another IPO in the second quarter of this year.

OpenAI itself is reportedly laying the groundwork for a potential initial public offering that could value the company at as much as $1 trillion. CEO Sam Altman has previously outlined plans to invest $1.4 trillion to develop 30 gigawatts of computing resources—enough to power roughly 25 million U.S. homes.
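The homes comparison is a simple power-to-households conversion. A minimal back-of-envelope sketch, assuming an average U.S. household consumption of roughly 10,500 kWh per year (an assumed figure in line with published averages, not stated in the article):

```python
# Back-of-envelope check of the "30 GW ≈ 25 million U.S. homes" figure.
# The household consumption number below is an assumption, not from the article.

ANNUAL_KWH_PER_HOME = 10_500          # assumed average U.S. household usage per year
HOURS_PER_YEAR = 8_760                # 365 days * 24 hours

avg_home_power_kw = ANNUAL_KWH_PER_HOME / HOURS_PER_YEAR   # ~1.2 kW continuous draw
total_capacity_kw = 30e6                                   # 30 GW expressed in kW

homes_powered = total_capacity_kw / avg_home_power_kw
print(f"~{homes_powered / 1e6:.1f} million homes")         # prints "~25.0 million homes"
```

Under that assumption, 30 gigawatts works out to about 25 million homes, consistent with the figure cited above.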

While investments in AI infrastructure continue to soar, some analysts have expressed concern that the sector may be inflating a bubble reminiscent of the dotcom era, with valuations and commitments reaching unprecedented levels.