23 hours ago
Advances in fine-tuning reduce total volume of AI training

When deciding whether to develop a new LLM from scratch or fine-tune an existing one, fine-tuning often makes more sense: it can be more cost-effective and requires fewer IT and facility resources.

1 day ago
Emerging tech: carbon capture at source

By integrating new natural gas electricity generation with carbon capture, operators can safeguard net-zero targets threatened by a reliance on fossil power — but initial adoption will be costly and limited to specific geographic locations.

As AI models improve, availability lags behind

AI applications are becoming critical to enterprise operations, but service availability still varies sharply across providers. Inference services should be evaluated not only on model capability, but on operational maturity.

30 Apr 2026
IT-OT telemetry failings are hindering real-time applications

IT-OT equipment telemetry offers huge potential to improve visibility into live facility operations, but data exchange often fails because systems and protocols are incompatible.

10 Apr 2026
How AI training choices affect infrastructure costs

Choosing whether to train a model from scratch or fine-tune an existing one comes down to the use case and cost — with hardware utilization remaining an important cost factor.

8 Apr 2026
Vendors gearing up for 800V DC adoption

The topic of direct current (DC) power distribution in data centers is back, and this time it may be different.

27 Mar 2026
CoolIT sale signals strong pipeline for DLC orders

Water treatment and chemicals giant Ecolab has agreed to pay $4.75 billion in cash for Canadian DLC specialist CoolIT. It is the latest sign of unabated demand for AI compute.

16 Mar 2026
Project approval will hinge on local benefit guarantees

Growing public opposition to data center development has pushed many operators to invest in media and promotional campaigns. But as opposition intensifies, this type of outreach is unlikely to sway public opinion.

12 Mar 2026
Where to deploy AI inference: a guide to the economics

Although cloud platforms often offer the lowest cost for AI inference, on-premises deployment may be preferable due to application architecture, data locality and control requirements.

11 Mar 2026
Interactive AI inference costing tool

The cost of AI inference varies widely depending on deployment model, utilization and hardware. This costing tool compares on-premises, colocation and managed AI platforms on a like-for-like basis.
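The like-for-like comparison such a tool performs reduces to amortized cost per unit of work. As an illustration only (the figures and formula below are hypothetical assumptions, not taken from the Uptime Institute tool), a per-million-token cost can be sketched as:

```python
def cost_per_million_tokens(capex, lifetime_years, annual_opex,
                            tokens_per_second, utilization):
    """Amortized cost in dollars per million inference tokens.

    capex is spread evenly over the hardware lifetime; annual_opex
    covers power, space and support; utilization is the average
    fraction of peak throughput actually delivered.
    """
    seconds_per_year = 365 * 24 * 3600
    annual_cost = capex / lifetime_years + annual_opex
    annual_tokens = tokens_per_second * utilization * seconds_per_year
    return annual_cost / (annual_tokens / 1e6)

# Hypothetical on-premises GPU server: $250k capex over 5 years,
# $40k/yr operating cost, 5,000 tokens/s at 30% average utilization.
on_prem = cost_per_million_tokens(250_000, 5, 40_000, 5_000, 0.30)
print(f"on-prem: ${on_prem:.2f} per million tokens")
```

Under these assumed figures, low utilization dominates the result: halving utilization roughly doubles the per-token cost, which is why deployment model and achievable utilization matter as much as hardware price.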

Feb. 2026 AI Infrastructure Survey [Results and Crosstab files]

Results from Uptime Institute's 2026 AI Infrastructure Survey (n=1,141) focus on the data center infrastructure currently used, or planned for use, to host AI training and AI inference, as well as future industry outlooks on AI usage. The…

25 Feb 2026
Regulations for behind-the-meter power are emerging

Operators are proposing behind-the-meter power systems to accelerate the buildout of new AI data center infrastructure. Executing this strategy requires regulatory changes in many jurisdictions and new data center design approaches.

12 Feb 2026
Why inference will become a ubiquitous IT workload

As AI adoption spreads, most data centers will not host large training clusters — but many will need to operate specialized systems to run inferencing close to applications.

6 Feb 2026
Rising cost of traditional IT: temporary spike or long-term shift?

The shortage of DRAM and NAND chips caused by the demands of AI data centers is likely to last into 2027, making every server more expensive.

3 Feb 2026
AI automation moves from pilots to early production

AI in data center operations is shifting from experimentation to early production use. Adoption remains cautious and bounded, focused on practical automation that supports operators rather than replacing them.
