A planned campus and data center dedicated to artificial intelligence in the UAE is set to become one of the world’s largest computing infrastructure projects.
Abu Dhabi-based AI company G42 has partnered with global technology leaders OpenAI, Oracle, Nvidia, and Japan’s SoftBank Group to launch Stargate UAE. This monumental project will be the largest AI data center globally and will be part of a network of OpenAI-linked centers around the world. It marks a significant vote of confidence in the UAE’s ability to host large-scale, tech-driven infrastructure critical to economic, societal, and business development.
“Great to work with the UAE on our first international Stargate! Appreciate the governments working together to make this happen. Sheikh Tahnoon has been a great supporter of OpenAI, a true believer in AGI, and a dear personal friend,” OpenAI CEO Sam Altman said in a post on X.
What Is Data Center Capacity?
Data center capacity refers to the physical space and power needed to store and process data, commonly measured in kilowatts (kW) or megawatts (MW). These facilities are typically categorized by size:
- Small: up to 1,000 square feet
- Medium: 10,000 to 50,000 square feet
- Large: more than 50,000 square feet
Monthly energy consumption for these centers can reach roughly 36,000kWh for small facilities and around 2,000MWh for medium ones, while a large facility can draw 10MW or more of continuous power. Beyond power, capacity also includes cooling systems, servers, and especially graphics processing units (GPUs), the backbone of AI data processing. The industry is increasingly focusing on sustainability and eco-efficiency.
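A quick way to keep those units straight: kilowatts and megawatts measure power, a rate of draw, while kilowatt-hours measure the energy that accumulates over time. The short Python sketch below converts between the two using the figures above, treated here as rough illustrations rather than measured values.

```python
# Back-of-envelope conversion between power draw (kW) and monthly
# energy use (kWh). The per-class figures are illustrative only.

HOURS_PER_MONTH = 24 * 30  # ~720 hours

def monthly_energy_kwh(avg_power_kw: float) -> float:
    """Energy consumed in one month at a given average power draw."""
    return avg_power_kw * HOURS_PER_MONTH

# A small facility using ~36,000 kWh/month averages about 50 kW of draw:
small_avg_kw = 36_000 / HOURS_PER_MONTH       # ≈ 50 kW

# A large facility drawing 10 MW continuously consumes:
large_kwh = monthly_energy_kwh(10_000)        # 7,200,000 kWh ≈ 7,200 MWh

print(f"Small facility average draw: {small_avg_kw:.0f} kW")
print(f"Large facility monthly use:  {large_kwh:,.0f} kWh")
```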
What Can 1MW or 1GW of Power Support?
A single megawatt (MW) can support about 1,000 Nvidia Blackwell GPUs for AI training or tens of millions of ChatGPT-style queries daily in inference mode.
“Think of 1MW as the backbone for a mid-sized national-language model serving an entire country,” said Mohammed Soliman, director of strategic technologies and cybersecurity at the Middle East Institute in Washington.
By contrast, 1 gigawatt (GW) of continuous power could run roughly one million high-end Nvidia GPUs—about the same electricity usage as San Francisco or Washington, D.C., annually.
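The arithmetic behind those figures is a simple division of the power budget by per-GPU draw. Here is a minimal sketch, assuming roughly 1kW of all-in draw per high-end GPU, an illustrative number consistent with the article's rule of thumb rather than a vendor specification:

```python
# Rough estimate of how many GPUs a given power budget supports.
# KW_PER_GPU is an assumption (~1 kW all-in per high-end GPU); real
# deployments also spend power on cooling and networking.

KW_PER_GPU = 1.0

def gpus_supported(power_mw: float, kw_per_gpu: float = KW_PER_GPU) -> int:
    """Number of GPUs a power budget (in MW) can run at the assumed draw."""
    return int(power_mw * 1_000 / kw_per_gpu)

print(gpus_supported(1))      # 1,000 GPUs per MW
print(gpus_supported(1_000))  # ~1,000,000 GPUs per GW
```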
How Much Does It Cost to Build a Data Center?
Construction costs range from millions to billions of dollars, depending on size and scope. Key factors include:
- Land and construction
- Equipment (especially servers and GPUs)
- Cooling and power infrastructure
- Security (both physical and digital)
- Skilled personnel
Operating costs may rise over time, and location plays a critical role. Tokyo, Singapore, and Zurich are among the most expensive places to build such facilities, according to Turner & Townsend.
For comparison:
- China Telecom’s data center, once the world’s largest, has 150MW capacity and cost $3 billion.
- Stargate UAE, with 1GW capacity in its initial phase (and a planned expansion to 5GW), is expected to cost around $20 billion, per OpenAI.
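Normalizing by capacity makes that comparison easier to read. The sketch below computes cost per megawatt from the figures above; the metric is a crude yardstick that ignores phasing, land, and equipment mix, but both projects land near $20 million per MW.

```python
# Cost-per-megawatt comparison using the two figures cited above.

projects = {
    "China Telecom": {"capacity_mw": 150, "cost_usd": 3e9},
    "Stargate UAE (phase 1)": {"capacity_mw": 1_000, "cost_usd": 20e9},
}

for name, p in projects.items():
    cost_per_mw = p["cost_usd"] / p["capacity_mw"]
    print(f"{name}: ${cost_per_mw / 1e6:.0f}M per MW")

# China Telecom: $20M per MW
# Stargate UAE (phase 1): $20M per MW
```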
How Stargate UAE Compares Globally
The scale of Stargate UAE is unprecedented. It surpasses the data centers of major tech firms:
- Google: 100MW, $5.5 billion
- Microsoft: 50MW, $3 billion
- Apple (Arizona): 50MW, $2 billion
Once fully operational, the 5GW Stargate campus could house up to 2.5 million GPUs, consuming as much electricity as multiple mid-sized U.S. cities.
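That 2.5-million figure squares with the earlier 1,000-GPUs-per-MW rule of thumb once facility overhead is included. A quick cross-check, assuming roughly 2kW of all-in draw per GPU (an assumed overhead factor covering cooling, networking, and power conversion, not a published OpenAI or G42 figure):

```python
# Cross-check of the "2.5 million GPUs at 5 GW" figure above.
# all_in_kw_per_gpu is an assumption: ~1 kW per GPU plus an equal
# share of facility overhead.

campus_gw = 5
all_in_kw_per_gpu = 2.0

gpus = campus_gw * 1_000_000 / all_in_kw_per_gpu  # 1 GW = 1,000,000 kW
print(f"{gpus:,.0f} GPUs")  # 2,500,000
```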
Currently, the UAE operates at least 17 data centers, according to DataCentres.com, and plans to significantly expand that number in the coming years.
With Stargate UAE, the nation positions itself as a global hub for advanced AI infrastructure, reinforcing its strategic ambitions in the tech world.