Microsoft plans to migrate data centers to its own chips.

Microsoft CTO Kevin Scott confirmed the company's plans to primarily use its own processors in its data centers going forward. This move will reduce dependence on Nvidia and AMD chips and optimize systems for specific AI tasks.

Semiconductors and servers used in data centers are the fundamental basis for developing AI models and applications. Until now, Nvidia has dominated this market thanks to its GPUs, while AMD's share has been less significant.

However, major cloud computing players, including Microsoft, are also developing their own chips specifically optimized for data center applications. Microsoft's CTO, Kevin Scott, outlined the company's strategy in this area during a discussion at Italian Tech Week.

Scott noted that Microsoft has primarily used Nvidia and AMD chips because they have long offered the best price-performance ratio. However, the company remains open to any option that secures sufficient computing power to meet growing demand.

In 2023, Microsoft unveiled its own AI accelerator, Azure Maia, and the Cobalt processor. The company is also reportedly working on next-generation semiconductor products and recently unveiled a new microfluidic cooling technology to address chip overheating.

When asked about long-term plans, Scott unequivocally confirmed Microsoft's intention to use primarily its own chips in its data centers, adding that the company is already actively deploying them. This focus on proprietary silicon is part of a broader strategy to design complete data center systems, including networks, cooling, and compute optimized for specific workloads.

Microsoft, along with competitors Google and Amazon, is developing its own chips not only to reduce dependence on Nvidia and AMD, but also to improve efficiency by tailoring hardware to its specific workload requirements.
