
In a significant strategic pivot, Microsoft plans to rely increasingly on its own chips in its data centers, the company's chief technology officer said in a recent discussion. The move could reduce the tech giant's dependence on industry leaders Nvidia and AMD for semiconductor technology.

Semiconductors are essential to developing artificial intelligence models and their applications. Nvidia currently dominates the market with its graphics processing units (GPUs), while AMD holds a smaller share. Major cloud providers, including Microsoft, have also been designing custom chips tailored to their data center needs.

During a fireside chat at Italian Tech Week, moderated by CNBC, Kevin Scott, Microsoft's CTO, outlined the company's plans for AI chip development. Microsoft has traditionally sourced chips from Nvidia and AMD, selecting whichever silicon offers the best price-performance ratio. "We're not religious about what the chips are," Scott said, noting that Nvidia has been the go-to choice for many years because of its competitive performance.

Still, Microsoft has begun deploying some of its own chips. In 2023, the company introduced the Azure Maia AI Accelerator, designed specifically for AI workloads, alongside the Cobalt CPU, and it is actively developing a new generation of semiconductor products. Microsoft also recently unveiled cooling technology that uses microfluids to address the persistent problem of chip overheating.

Asked whether the long-term vision is to use mainly its own chips in Microsoft data centers, Scott said, "Absolutely," indicating that a significant portion of the company's silicon is now Microsoft-designed. The chip push is part of a broader strategy to design complete systems optimized for data center operations.
"It's about the entire system design," Scott said, stressing the need to integrate networking, cooling solutions, and chip technology to maximize computational efficiency. Microsoft, like competitors Google and Amazon, is developing its own chips not only to reduce dependence on Nvidia and AMD but also to tune its hardware to specific requirements.

The demand for computing power remains staggering: tech giants Meta, Amazon, Alphabet, and Microsoft have committed more than $300 billion in capital expenditures this year, much of it aimed at AI. Scott acknowledged the difficulty of keeping up, saying, "[A] massive crunch [in compute] is probably an understatement." Since the launch of ChatGPT, he noted, building sufficient computing capacity has been daunting. Despite Microsoft's extensive data center investments over the past year and plans for further expansion, Scott cautioned that even the company's most ambitious forecasts often fall short of the growing demand.