New Data Center Challenges Demand New Strategies



Brian Schwarz serves as a Vice President of Product Management at Pure Storage focusing on the FlashBlade product line. Prior to joining Pure in 2014, he spent six years at Cisco Systems building the Unified Computing System offering.

The deluge of data from sensors and devices, combined with AI and other applications that consume big data, is putting extreme pressure on data center infrastructure. IDG recently sat down with Brian Schwarz, VP of Product Management at Pure Storage®, to discuss today’s key data challenges and strategies.

What are the main problems associated with the growing volume of data?

People are collecting large amounts of information, but they’re doing it in too many data silos. As you get to the hundreds or thousands of applications that exist in large-scale enterprises, even a small amount of variance and optimization in each silo can create a massive amount of complexity and overhead, and also make securing the data more difficult.

Beyond the silo problem, storage is generally more complex and confusing than other tiers of the data center, including the server and network tiers. There has been more standardization in the other tiers, while the storage tier has more diversity – diversity in vendors, in performance and capacity tradeoffs, in price points, and in features and capabilities.

How are IoT and AI affecting the data infrastructure?

The Internet of Things (IoT), which is fundamentally about machines generating data (versus humans generating data), is a big factor. Artificial intelligence (AI) is a net new phenomenon that consumes and analyzes massive amounts of data. One thing AI does well is process unstructured data such as video and audio files. Rich media, however, is a very heavy form of data, in that it consumes a lot of storage. With AI’s growth, data storage and delivery demands are problems that will get worse, not better.

How can organizations transition to a more modern data infrastructure?

To start, you have to stand back from the day-to-day projects and assess where you want to be in five years. If you know that AI is part of your five-year strategy, for instance, you need to start making incremental decisions that get you closer to your endpoint.

You also can’t just optimize your storage infrastructure for each and every project. You want to optimize for the broader set of your needs. You might optimize a particular application’s storage to 80-90% rather than 100% if it will help you achieve your long-term goal and its larger benefits.

As you invest in storage platforms, you also need to consider your future needs. For example, you may want to buy devices that “speak” both the common file protocol for your existing applications as well as the emerging object protocol for AI and other emerging applications.
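The file-versus-object distinction comes down to addressing: file protocols (such as NFS or SMB) locate data by hierarchical path, while object protocols (such as S3) locate it by a flat key within a bucket. A minimal sketch in Python illustrates the two access models side by side; the `ToyObjectStore` class is a hypothetical in-memory stand-in, not a real object-store API:

```python
import pathlib
import tempfile

# File protocol: data addressed by hierarchical path (NFS/SMB-style semantics).
tmpdir = tempfile.mkdtemp()
file_path = pathlib.Path(tmpdir) / "logs" / "2024" / "app.log"
file_path.parent.mkdir(parents=True)
file_path.write_bytes(b"hello from a file")
assert file_path.read_bytes() == b"hello from a file"


class ToyObjectStore:
    """Hypothetical stand-in: real object stores expose PUT/GET over HTTP."""

    def __init__(self):
        self._buckets = {}

    def put(self, bucket, key, data):
        # Keys are flat strings; any "/" inside a key is just a naming
        # convention, not a real directory hierarchy.
        self._buckets.setdefault(bucket, {})[key] = data

    def get(self, bucket, key):
        return self._buckets[bucket][key]


# Object protocol: the same bytes addressed by bucket + flat key.
store = ToyObjectStore()
store.put("logs", "2024/app.log", b"hello from an object")
print(store.get("logs", "2024/app.log"))
```

A device that "speaks" both protocols lets existing applications keep their path-based access while newer AI pipelines read and write the same data by key.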

What role can flash storage play in this transition?

Solid state flash storage systems can provide a big first step toward meeting both today’s and tomorrow’s data demands. Flash storage is about 1,000 times faster and 10 times more energy efficient than rotating hard disk drives.

Flash is such a superior medium, such an inflection point, that it gives you the opportunity to do things differently than you’ve done them for the past 15+ years. For example, flash lets existing data centers support more storage and more applications without having to upgrade the power that goes into the facilities.

One caution: Hard drives are the main bottleneck in the vast majority of data centers today, and flash can remove that storage bottleneck. The bottleneck may then shift to another data center tier, such as the network tier, which may need to upgrade from, in essence, a country road to a superhighway to handle the data volumes and speeds that flash systems can deliver.

For more information on Pure Storage, visit www.purestorage.com/whypure.html.



