In Part 1 we saw how artificial intelligence is built from the ground up: chips, networks, datacenters and software, the four layers that sustain the digital mind of the world.
Today we look at who dominates those layers.
The real AI competition doesn’t happen between models, but between the companies that control the physical and energy foundations those models depend on.
⚙️ Layer 1 — The Muscle: The Silicon War
Modern AI lives in silicon.
And in this terrain there’s an absolute leader: NVIDIA.
Its H100 chips and the new Blackwell B200 dominate large-model training. NVIDIA controls more than 80% of the AI GPU market, and its market value already exceeds $2.3 trillion, more than Amazon’s or Alphabet’s.
🔸 The Performance Battle
NVIDIA offers raw power plus an ecosystem (CUDA, NVLink, DGX) that makes it almost impossible to replace in the short term.
AMD, with its MI300X series, seeks to break that monopoly with more efficient chips and lower prices. In 2025 it signed an agreement with OpenAI to diversify the supply chain.
Intel, which has lost ground, is trying to rebound with its Gaudi 3 line and the goal of becoming a Western foundry: manufacturing chips for third parties and reducing global dependence on Taiwan’s TSMC.
TSMC and Samsung are the true “silicon lungs”: they manufacture almost all advanced chips on the planet. Without them, neither NVIDIA nor AMD could produce their processors.
🔸 Impressive Figures
- A server with 8 H100 GPUs exceeds $400,000
- Training GPT-5 is estimated to cost more than $1 billion in hardware and energy
- In 2024 alone, global AI infrastructure spending exceeded $200 billion, double the previous year
- NVIDIA’s data center revenue grew 409% year-over-year in Q4 2024
- The global AI chip market is projected to reach $1.2 trillion by 2030
💬 In this layer, power is measured in teraflops and gigawatts.
🌐 Layer 2 — The Veins: The Network That Transports Intelligence
Behind every model there’s a network connecting thousands of servers. This layer —connectivity— is what allows AI to think at planetary scale.
🔸 Inside Data Centers
Cisco and Arista Networks lead the AI switch market. Arista launched its 7800R3A series, capable of handling up to 800 Gbps per port.
Broadcom, less visible, manufactures the silicon that enables those speeds with its Tomahawk 5 chip.
Each training cluster can have more than 100,000 simultaneous connections between GPUs, where a few microseconds of latency can mean millions in cost.
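A rough, illustrative calculation shows how microseconds can add up to millions. Apart from the 100,000-GPU cluster size from the text, every figure below (GPU-hour price, run length, step time, per-step network stall) is an assumption chosen only to show the mechanics:

```python
# Back-of-envelope sketch of the "microseconds cost millions" claim.
# Only the 100,000-GPU cluster size comes from the article; the rest
# are illustrative assumptions, not measured values.

GPUS = 100_000              # GPUs in the training cluster (per the text)
GPU_COST_PER_HOUR = 3.0     # assumed price per GPU-hour, USD
RUN_HOURS = 90 * 24         # assumed 90-day training run
STEP_TIME_S = 1.0           # assumed time per training step
NETWORK_OVERHEAD_S = 0.005  # assumed 5 ms of network stall per step
                            # (many microsecond-scale hops per all-reduce)

# Fraction of every step spent waiting on the network
overhead_fraction = NETWORK_OVERHEAD_S / (STEP_TIME_S + NETWORK_OVERHEAD_S)

# Idle GPU-hours accumulated across the whole run, and their cost
wasted_gpu_hours = GPUS * RUN_HOURS * overhead_fraction
wasted_dollars = wasted_gpu_hours * GPU_COST_PER_HOUR

print(f"overhead fraction: {overhead_fraction:.3%}")
print(f"wasted GPU-hours:  {wasted_gpu_hours:,.0f}")
print(f"wasted cost:       ${wasted_dollars:,.0f}")
```

Even a half-percent network overhead, multiplied across 100,000 GPUs and a 90-day run, lands in the millions of dollars.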
🔸 Global Networks
Huawei, despite sanctions, develops its own AI stack (Ascend chips + 800G optical network). It’s expanding datacenters in Asia, Africa and Latin America, offering technological independence to countries outside the US-Europe axis.
Nokia and Ericsson are bringing AI to the edge (Edge AI), integrating processing into 5G base stations and future 6G nodes.
- Nokia AirScale AI allows industrial data to be processed in milliseconds
- Ericsson works with Intel and Vodafone on “AI-native” architectures for mobile networks
🔸 Operators Move
Telcos like Vodafone, Telefónica, AT&T and Orange are transforming their networks to offer AI as a service (AIaaN: AI-as-a-Network). McKinsey estimates they could capture $100 billion in new value by 2030 if they manage to integrate computing, connectivity and real-time analytics.
💬 Networks no longer just transport data: they transport thought.
🏢 Layer 3 — The Lungs: The New Datacenter Race
Datacenters are the meeting point between energy, hardware and software. And today they’re experiencing unprecedented expansion.
🔸 The Three Giants of Global Computing
Microsoft Azure: OpenAI’s strategic partner, it has invested more than $13 billion in AI-dedicated datacenters. It uses liquid cooling, solar power and cold-climate locations (Ireland, Finland).
Google Cloud: Controls its entire stack (TPU chips, Gemini software and its own datacenters) with record efficiency. A single TPU v5e pod reaches 400 petaflops of performance.
Amazon AWS: Remains the silent giant. Its Trainium and Inferentia chips cut inference costs by up to 50%, and its new Spain region is optimized for sustainable AI workloads.
🔸 New Players
CoreWeave: A startup providing dedicated infrastructure for OpenAI, Anthropic and Stability AI, with custom GPU clusters deployable in hours.
Oracle: Building ultra-dense datacenters backed by clean-energy agreements.
🔸 The Great Bottleneck
Datacenters consume so much energy that some countries are limiting their expansion. An AI cluster can demand 50-100 MW, the same as a small city. By 2030, AI could use 3% of global energy, triple that of 2023.
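A quick scale check makes the "small city" comparison concrete. Only the 50-100 MW cluster figure comes from the text; the average household consumption is an assumed round number:

```python
# Rough scale check of the datacenter power figures above.
# Only the 100 MW cluster demand comes from the article; the
# household figure is an assumed round number for illustration.

CLUSTER_MW = 100                 # upper bound of the 50-100 MW range
HOURS_PER_YEAR = 8760
HOUSEHOLD_KWH_PER_YEAR = 10_000  # assumed average annual household use

# Annual consumption of one cluster, and its household equivalent
cluster_mwh_per_year = CLUSTER_MW * HOURS_PER_YEAR
equivalent_households = cluster_mwh_per_year * 1_000 / HOUSEHOLD_KWH_PER_YEAR

print(f"cluster use: {cluster_mwh_per_year:,} MWh/year")
print(f"~{equivalent_households:,.0f} households")
```

Under these assumptions a single 100 MW cluster draws as much electricity in a year as tens of thousands of homes, which is indeed small-city territory.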
💬 The cloud is no longer ethereal: it’s a network of thought factories.
🧠 Layer 4 — The Brain: Models and Software at the Top
This is the visible part of artificial intelligence, but also the part most dependent on all the layers before it.
🔸 Digital Brain Leaders
OpenAI (Microsoft): Pioneer with GPT-4 and GPT-5, seeks smaller and more efficient models.
Anthropic (Amazon + Google): Drives ethical and explainable AI with its Claude series.
DeepMind (Google): Combines scientific research with applied engineering (Gemini, AlphaFold).
Meta: Bets on openness with LLaMA 3, democratizing access to knowledge.
xAI (Elon Musk): Integrates AI software with Tesla hardware and the X social network.
🔸 The Efficiency Race
The real challenge is no longer who has the biggest model, but who makes it run faster and with less energy. Techniques like quantization, distillation or mixed parallelism reduce consumption by up to 40%. This explains why alliances between hardware and software (like NVIDIA + Microsoft or AMD + OpenAI) are increasingly strategic.
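Quantization, one of the techniques named above, can be sketched in a few lines. This is a minimal symmetric per-tensor int8 scheme in NumPy, illustrative rather than production-grade (real toolchains add per-channel scales, calibration and quantized kernels):

```python
import numpy as np

# Minimal sketch of post-training int8 quantization: store weights as
# int8 plus one float32 scale, cutting memory 4x versus float32.

def quantize_int8(w: np.ndarray):
    """Map float32 weights to int8 with a symmetric per-tensor scale."""
    scale = np.abs(w).max() / 127.0   # largest weight maps to +/-127
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from int8 values."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
weights = rng.normal(size=(1024, 1024)).astype(np.float32)

q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

memory_saving = 1 - q.nbytes / weights.nbytes   # int8 is 1/4 of float32
max_error = np.abs(weights - restored).max()    # bounded by scale / 2

print(f"memory saved: {memory_saving:.0%}")     # 75%
print(f"max round-trip error: {max_error:.4f}")
```

The round-trip error is bounded by half the scale factor, which is why quantization trades a small, controlled loss of precision for large savings in memory and bandwidth.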
💬 In this layer, intelligence is no longer just trained: it’s optimized.
🔋 Conclusion — The Future of Digital Muscle
Artificial intelligence is the new infrastructure of the planet. Its growth no longer depends on ideas, but on resources: silicon, energy, talent and technological sovereignty.
Three forces will mark the next decade:
Energy: AI will demand new sources (solar, modular nuclear, geothermal).
Decentralization: Computing will migrate to the edge (Edge AI), reducing pressure on central datacenters.
Sovereignty: Each region will try to control its own chip and data chain. Europe drives the Chips Act (€43 billion), China invests more than $50 billion in semiconductors, and the US reinforces its alliance with Intel, NVIDIA and TSMC.
💬 The future of AI won’t be decided by the best algorithm alone, but by who can keep it powered.
📊 The AI Infrastructure War Map
| Layer | Dominant Players | Market Size | Key Battles |
|---|---|---|---|
| Hardware | NVIDIA (80%), AMD (15%), Intel (5%) | $200B+ | GPU supremacy, foundry control |
| Networks | Cisco, Arista, Huawei | $50B+ | Speed, latency, sovereignty |
| Datacenters | AWS (32%), Azure (23%), GCP (10%) | $300B+ | Energy efficiency, sustainability |
| Software | OpenAI, Anthropic, Google, Meta | $100B+ | Model efficiency, ecosystem |
📚 Recommended Readings
The cost of compute: A $7 trillion race to scale data centers (McKinsey & Company)
Analysis of the massive infrastructure investments required to support AI growth and the $7 trillion data center scaling challenge.
👉 https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/the-cost-of-compute-a-7-trillion-dollar-race-to-scale-data-centers
AI Infrastructure: Key Components & 6 Factors Driving Success (Cloudian)
Comprehensive guide to AI infrastructure components and the critical factors that drive successful AI deployments.
👉 https://cloudian.com/guides/ai-infrastructure/ai-infrastructure-key-components-and-6-factors-driving-success/
Future-ready AI infrastructure (Deloitte Insights)
How organizations can build AI infrastructure that adapts to future technological changes and business needs.
👉 https://www.deloitte.com/us/en/insights/topics/digital-transformation/future-ready-ai-infrastructure.html
✍️ Claudio from ViaMind
Dare to imagine, create and transform.