UII UPDATE 447 | DECEMBER 2025
Each January, Uptime Intelligence publishes five predictions for the coming year. The aim is not to engage in soothsaying but to highlight some not-so-obvious trends and likely developments and, where possible, to help interested parties anticipate or navigate some of the issues. The predictions were featured in a well-attended Uptime Institute webinar (see Five data center predictions for 2025) and are used to drive conversations for the first half of the year. Reports supporting some of the predictions are also available.
The predictions have no expiry date — they usually play out over several years, and some are still “active.” While some prove prescient, not all do. Here, we briefly look back at the five predictions for 2025 and assess the current situation (for the full report, see Five data center predictions for 2025).
This prediction had two key components: first, that data center developers would face more organized, detailed and sophisticated opposition to their plans from the public; and second, that the value of data centers would increasingly be discussed and questioned by regulators and governments, partly in response to public pressure.
With hindsight, this may seem like an easy prediction — there has been a surge in data center expansion since 2022, accompanied by some growing disquiet. Yet, aside from a few highly publicized protests, there was little sign in 2024 that opposition to data centers — long regarded by some as a potentially polluting and power-hungry industry — was either important or effective.
In 2025, this has changed. Data center power use, in particular, has become a mainstream political and media topic — along with the local impact of large new developments. These concerns are not universal, but they have surfaced in several locations. In the US, for example, projects have been stopped or delayed in Northern Virginia and Georgia; in Canada (Wonder Valley), Norway (Hamar), and Ireland (Dublin), protestors have become organized, media-savvy and vocal. That said, governments mostly remain in favor of data center developments — if anything, even more strongly than before the emergence of generative AI, which is viewed as economically and politically important.
But predictions usually play out over several years: concerns over resource use, power availability, power pricing, power stability, carbon and particulate emissions, accountability, and low job creation will continue to arouse opposition to data centers. As our prediction stated, more transparency, collaboration and reporting will likely be required.
Prediction status: ongoing
A year is a long time in technology. In 2024, analysts from firms such as Gartner predicted that the next big wave in AI power consumption would come from inference, primarily at the edge. It was widely expected that most enterprises would upgrade their IT and data centers (both on-site and colocation) to support much higher densities.
Some of this is true. But in 2024, Uptime was already noting a reluctance among enterprises to build (or buy) speculatively — even as they embraced the concepts of AI and invested in software development. We predicted that most investments in large-scale AI training infrastructure would come from hyperscalers and cloud providers, while enterprises would rely on public cloud services and pre-trained foundation models, fine-tuning and supplementing them to reduce computational overhead.
This has largely proven to be the case. Throughout 2025, hyperscalers have dominated AI infrastructure investment, collectively committing hundreds of billions of dollars to GPUs, power and new data centers — often partnering with neoclouds and colocation providers to scale more quickly than their own facilities would allow.
Enterprises, in turn, have leveraged public cloud services to fine-tune pre-trained models for specific use cases, avoiding the cost and complexity of training from scratch.
The future shape and balance of enterprise-level investment in AI capacity is still far from certain, but it is clear that the hybrid (cloud) model already prevalent in corporate IT will apply to AI as well. Enterprises will increase their own investments in AI and AI infrastructure — especially for inference — but in most cases they will continue to rely heavily on external, cloud-based foundation models and, most likely, a wide variety of agents.
Prediction status: ongoing
Some of our predictions are based on deduction rather than market feedback or intelligence. From 2022 to 2024, it was becoming apparent that new power demand worldwide — especially in mature economies and in certain hotspots — would soon far exceed supply. At the same time, data centers were becoming, or planned to become, not only major consumers of power but also large-scale generators of power. It was clear that the traditional hierarchical, one-way model, with power companies supplying passive consumer data centers, would have to change. New and expanded data centers will increasingly be expected to provide or store power while, at times, shedding loads to support the grid.
This prediction is already playing out. Long wait times to connect data centers to the grid have given transmission companies and utilities greater bargaining power, enabling them to set rules for providing power back to the grid in exchange for connections. Some regulators, such as those in Texas (see State governments act to control power demand), now require participation in electricity demand management programs. This follows the implementation of earlier, though less strict, laws in Ireland and Singapore.
Momentum is building. An influential Duke University paper (Rethinking load growth: integrating large flexible loads in US power systems) argues that load shedding by data centers could release large amounts of power capacity to the grid. The International Energy Agency, which is widely followed by energy companies worldwide, makes a similar argument. The DCFlex initiative, now underway, brings together data center operators (including hyperscalers), energy companies and equipment makers to develop load-shedding and demand-response schemes — a report is expected in 2026.
Prediction status: ongoing
Before 2024, and for more than three decades, data center designs did not change much. Most data centers used standard racks, some variation of air cooling, and established topologies and voltages for power distribution and storage. Value engineering, scale and replication were the critical factors in driving innovation. But even before generative AI, it was clear that rising semiconductor densities would require a step change; Nvidia’s short-term roadmap — showing steeply climbing GPU power needs, since clarified and now even more extreme — confirmed that the time had come.
Uptime Intelligence’s prediction for 2025 — that a radical overhaul of data center electrification was coming — proved only partly right, however. During the first half of 2025, system makers (Nvidia, SuperMicro, HPE, Dell) and their infrastructure partners (Vertiv, Schneider Electric, Eaton and others), along with their largest AI customers, announced projects to build very high-density racks (above 500 kW per rack) using specialist 800V direct-current pods. New UPS systems, transformers, building design, busways, networking and cooling systems would all be needed. These are now in development, and much progress will no doubt be unveiled in 2026.
The predicted radical overhaul, however, will apply to only a few. Despite the near-universal buy-in to the Nvidia architecture (see the next prediction), Uptime has encountered only a small number of companies that expect to build to these density levels in the next three to five years. For the vast majority, dealing with a rise in density from less than 20 kW to somewhere between 50 kW and 180 kW — along with the associated cooling and design challenges — is a large enough lift, and one that will meet most needs.
Prediction status: under review (will be revisited in Five predictions for 2026, published in January 2026)
At the tail end of 2024, Nvidia reigned supreme in AI systems infrastructure. It had a 95%-plus share of the AI processor market, a huge backlog of orders, an ambitious roadmap, and was influencing — if not dictating — the technical direction of the world’s largest data center companies, whether builders, operators, or suppliers. Nvidia confidently predicted that a range of its GPUs would spread beyond AI training into inferencing, agentic AI, and even general IT in the enterprise. All of this was rewarded with a stock market valuation of $3.5 trillion.
Uptime Intelligence’s prediction at that time was somewhat against the tide. We did not foresee a reversal of fortunes for Nvidia or any of its major partners, but rather a fragmentation of innovation and adoption, leading to a growing spectrum of choices over time. This view was supported by feedback from data center operators, mainly large enterprises. They made it clear that while there was a strong appetite for AI and the power of Nvidia’s GPUs, it was not at any cost.
This prediction is playing out, but slowly. The AI hardware market is becoming more diverse in 2025. AMD — whose processors are generally less power-hungry — is scaling up production of enterprise GPUs, with revenues growing rapidly and a major partnership with OpenAI now in place.
Hyperscalers AWS and Google Cloud, meanwhile, continue to deploy their own proprietary AI accelerators, while Microsoft’s redesigned silicon for AI is expected to appear in 2026. In November, Google surprised analysts with the launch of Gemini 3, a model trained on its own hardware that outperformed OpenAI’s models across a wide range of benchmarks.
Elsewhere, air-cooled server systems designed specifically for inference have moved closer to production, with several expected to appear on the market in 2026. In China, the rapid development of domestic alternatives to GPUs is set to continue, even if some Nvidia chips are once again permitted for import into the country (for more on this, see Hardware for AI: options and directions).
This increased competitive pressure has not slowed Nvidia’s stratospheric growth, nor has it dampened enthusiasm for its stock, which gained more than $1 trillion in value over 2025. Its products remain expensive and in short supply. Over time, however, we expect a more diverse and stratified AI market to emerge, with many competing approaches — a development that most in the industry will see as healthy.
Prediction status: ongoing
All five predictions made at the beginning of 2025 still hold largely or partly true, and all have yet to fully play out. The new predictions, to be unveiled in January 2026, will build on them further.