In the rapidly evolving landscape of artificial intelligence, the role of data storage is often underestimated. Scott Shadley, Director of Leadership Narrative and Technology Evangelist at Solidigm, argues that storage has long been the "underappreciated child" of computing architectures. The surge in data volume and velocity driven by AI is reshaping that perception: just five years ago, organizations captured vast amounts of data but retained only a fraction of it; today, there is a growing push to preserve every piece of information.

Traditionally, storage decisions came down to cost per gigabyte. Nearly 90% of data center storage still relies on hard disk drives (HDDs), which remain cheaper than their faster counterparts, solid-state drives (SSDs). But as AI workflows demand more from storage, HDDs are struggling to keep pace, forcing a reevaluation of data storage strategies. Shadley points out that while HDDs have long been prized for their cost-effectiveness (currently around $0.011 per GB), their limitations are becoming increasingly evident. In high-demand environments such as Los Alamos National Laboratory, where researchers simulate seismic activity from underground nuclear tests, the need for rapid data capture and analysis outstrips what HDDs can deliver; their read and write latency simply cannot meet the speed requirements of modern AI applications.

As SSD technology improves, it is becoming clear that these drives, though more expensive up front, offer a lower total cost of ownership (TCO) over longer periods. Solidigm's recent white paper shows that for storing one exabyte (one million terabytes) of data, SSDs outperform HDDs in space efficiency, energy consumption, and reliability.
Shadley notes that while HDDs are approaching a capacity ceiling, with the largest drives at around 30 TB today and projections of 100 TB by 2030, SSDs are already pushing past those limits: Solidigm is delivering 122 TB SSDs that are more compact and enable higher-density storage. Innovative designs, such as liquid-cooled SSDs developed in collaboration with NVIDIA, are setting new standards for enterprise storage, cutting power consumption and floor space, both of which are crucial for maximizing GPU performance in AI computing.

Ultimately, to fully unleash the capabilities of GPUs, the entire data infrastructure must evolve. Shadley stresses that the vast volumes of stored data form the foundation of the AI pipeline, and as demand for rapid data processing grows, organizations must prioritize advanced storage solutions to keep pace with technological advancement.
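The exabyte comparison above can be sketched as back-of-the-envelope arithmetic. The 30 TB HDD and 122 TB SSD capacities come from the article; the per-drive wattages below are illustrative assumptions, not Solidigm's published figures, and redundancy is ignored. The sketch only counts drives and steady-state power, which is where the density and energy gap shows up:

```python
import math

EXABYTE_TB = 1_000_000  # 1 EB = one million TB

def footprint(capacity_tb: float, watts_per_drive: float) -> tuple[int, float]:
    """Drives required to hold 1 EB (no redundancy) and their total draw in kW."""
    drives = math.ceil(EXABYTE_TB / capacity_tb)
    return drives, drives * watts_per_drive / 1000

hdd_drives, hdd_kw = footprint(30, 8.0)    # ~8 W per HDD: assumed, not sourced
ssd_drives, ssd_kw = footprint(122, 5.0)   # ~5 W per SSD: assumed, not sourced

print(f"HDD: {hdd_drives:,} drives, ~{hdd_kw:,.0f} kW")
print(f"SSD: {ssd_drives:,} drives, ~{ssd_kw:,.0f} kW")
```

Even under these rough assumptions, an exabyte needs roughly four times as many 30 TB HDDs as 122 TB SSDs, which is the space and energy argument the white paper makes; a full TCO model would also fold in acquisition price, cooling, rack space, and replacement rates.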