
Data centers: A view from the inside


The rapid expansion of data centers, fueled by the artificial intelligence boom, has sparked growing curiosity about what actually goes on inside these massive facilities. AFP recently gained rare access to one, offering a closer look at the physical backbone of the digital world.

There are an estimated 12,000 data centers operating globally, about half of them in the United States, according to Cloudscene. At their core, data centers are concrete warehouses packed with thousands of computer servers operating in unison. Older sites are typically low, sprawling structures divided into cavernous rooms, while newer ones are rising higher to maximize space.

Each facility can serve a single company or multiple clients. The computers are mounted in standardized 19-inch racks — metal frames lined up in long rows — and a large center may hold tens of thousands of servers working around the clock.

These machines generate immense heat and consume large amounts of electricity, both for processing power and for cooling. High-speed networking equipment, including switches, routers and fiber optic cables, moves enormous volumes of data at blistering speeds.

Proximity to users matters. The closer a center is to populated areas, the faster data can be delivered — a critical advantage for applications such as financial trading and online gaming.
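The proximity advantage comes down to simple physics: signals in optical fiber travel at roughly 200,000 kilometers per second, about two-thirds the speed of light in a vacuum. A minimal back-of-the-envelope sketch (the speed figure and example distances are illustrative assumptions, not measurements from this article):

```python
# Light in optical fiber covers roughly 200,000 km per second,
# so distance sets a hard floor on network latency.
FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds over a fiber path of the given length."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# A data center 50 km from its users vs. one 2,000 km away:
print(round_trip_ms(50))    # 0.5 ms at best
print(round_trip_ms(2000))  # 20 ms at best
```

Real-world latency is higher once routing, switching and server processing are added, but the floor set by distance is why trading firms and game platforms pay a premium to be close.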

Ashburn, Virginia, home to the highest concentration of data centers worldwide and about 30 miles from Washington, D.C., has become a hub for this reason. But building near cities drives up land and construction costs and often faces local opposition.

As a result, many operators build large sites in rural areas, placing high-performance tasks — such as AI model training — far from cities, while keeping latency-sensitive systems closer to users.

Heat management is one of the biggest engineering challenges. A single server rack produces as much heat as several household ovens running nonstop, with cooling consuming roughly 40 percent of a center’s electricity.
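To put that 40 percent figure in concrete terms, here is a minimal sketch of how a facility's power budget divides; the 100-megawatt facility size is a hypothetical example, not a figure from the article:

```python
# Illustrative split of a data center's electricity, assuming the
# article's rough figure that cooling consumes ~40 percent of the total.
COOLING_SHARE = 0.40

def power_split_mw(total_mw: float) -> dict:
    """Divide a facility's total draw into cooling and everything else."""
    cooling = total_mw * COOLING_SHARE
    return {"cooling_mw": cooling, "other_mw": total_mw - cooling}

# A hypothetical 100 MW facility spends 40 MW just keeping servers cool:
print(power_split_mw(100.0))
```

On these assumptions, nearly half the electricity a large site draws never touches a computation, which is why cooling efficiency dominates so much of data-center engineering.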

Modern GPUs used for AI can run at temperatures exceeding 90 degrees Celsius and are heavier than earlier chips, putting added strain on racks and cooling infrastructure.

Traditional cooling relies on air conditioning, but this is inadequate for today’s high-performance chips. Operators now deploy water-based methods: liquid systems that pump coolant directly to components or evaporative cooling that mimics perspiration.

Water use has surged — US data centers consumed 21.2 billion liters in 2014 and 66 billion liters in 2023, federal data shows.
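The article's two data points imply a steep compound growth rate. A short sketch of that arithmetic, using only the figures quoted above:

```python
# Implied average annual growth in US data-center water use, from the
# article's figures: 21.2 billion liters in 2014 to 66 billion in 2023.
def annual_growth_rate(start: float, end: float, years: int) -> float:
    """Compound annual growth rate as a fraction (0.10 means 10% per year)."""
    return (end / start) ** (1 / years) - 1

rate = annual_growth_rate(21.2, 66.0, 2023 - 2014)
print(f"{rate:.1%}")  # roughly 13% per year
```

Sustained growth at that pace triples consumption in about nine years, which matches the threefold rise the federal figures show.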

Daily Tribune
tribune.net.ph