
Right-size your flash: making flash storage work harder, more efficiently

Opinion | July 2014




By Rajesh Nair, Chief Technology Officer at Tegile.

In recent years, storage has become really flashy… pun intended! A lot has been written and said about flash-hybrid, all-flash storage and flash as a disruptive media. However, the debate about whether to use flash or not is now moot – instead, it is time for companies to start asking smart questions to ensure their flash storage works harder.

All storage administrators will be familiar with the fact that enterprise workloads are just as varied as enterprise applications. Some workloads, such as online transaction processing (OLTP) and virtual desktops, are extremely latency sensitive, whereas others, such as file shares and backup, are throughput sensitive. In addition, there is virtualisation I/O, which sits somewhere in the middle and can make workloads harder to distinguish due to the ‘I/O Blender’ effect. As a result, storage administrators need to be cognisant of what storage functionality is required and what best suits the workloads they are operating. So, taking all of this into consideration, how are organisations expected to choose?
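
As a rough illustration of these distinctions, the short Python sketch below simply restates the article's examples as a lookup table of which storage metric each workload type is most sensitive to; the categorisation is the article's own, not additional data.

```python
# Toy illustration: which storage metric each workload type is most
# sensitive to, restating the examples given in the text above.
WORKLOAD_SENSITIVITY = {
    "OLTP database": "latency",
    "virtual desktops (VDI)": "latency",
    "file shares": "throughput",
    "backup": "throughput",
    "general virtualisation (I/O blender)": "mixed / hard to distinguish",
}

for workload, sensitivity in WORKLOAD_SENSITIVITY.items():
    print(f"{workload:40s} -> most sensitive to {sensitivity}")
```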

It’s really very simple. By reviewing the classic buying criteria, organisations can determine which storage solution will best suit their needs, whether that is flash, hybrid or legacy storage.

A key factor when selecting a storage solution is performance, measured in IOPS. Traditionally, performance has been achieved by scaling the number of disks (spindles) used. However, the introduction of hybrid architectures that leverage flash to accelerate performance means organisations can achieve the same performance with significantly fewer disks, spinning or otherwise. As a result, we expect 2014 to be the year that many customers start to break away from their incumbent storage system vendors, having realised that new storage architectures can be more effective.
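
To make the spindle arithmetic concrete, here is a minimal Python sketch of the trade-off. The per-disk and per-SSD IOPS figures and the flash hit rate are assumed ballpark values for illustration only, not vendor specifications.

```python
# Rough, illustrative comparison of scaling IOPS with spindles alone
# versus letting a flash tier absorb most of the I/O (hybrid).
import math

def spindles_for_iops(target_iops: int, iops_per_disk: int = 180) -> int:
    """Disks needed if performance is scaled purely by adding spindles."""
    return math.ceil(target_iops / iops_per_disk)

def hybrid_disks_for_iops(target_iops: int,
                          flash_hit_rate: float = 0.9,
                          iops_per_ssd: int = 50_000,
                          iops_per_disk: int = 180) -> tuple[int, int]:
    """Disks and SSDs needed if a flash tier serves most of the I/O."""
    flash_iops = target_iops * flash_hit_rate
    disk_iops = target_iops - flash_iops
    return (math.ceil(disk_iops / iops_per_disk),
            math.ceil(flash_iops / iops_per_ssd))

target = 50_000  # example workload
print("all-disk spindles:", spindles_for_iops(target))         # ~278 disks
print("hybrid (disks, SSDs):", hybrid_disks_for_iops(target))  # ~28 disks + 1 SSD
```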

Another aspect to consider is latency. Whether it is an end-user application (VDI) or a server application (databases or mail servers), lower storage latency improves both perceived and real performance. As a result, it is important for an organisation to review which applications are most latency sensitive and which do not require the same attention.
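
A back-of-envelope way to see why latency matters: for a serialised workload, where each I/O must complete before the next is issued, throughput is bounded by the inverse of latency. The latency figures in the sketch below are assumed example values.

```python
# Upper bound on operations per second when I/Os are issued one at a time,
# so each must complete before the next begins.
def max_serial_ops_per_sec(latency_ms: float) -> float:
    return 1000.0 / latency_ms

for media, latency_ms in [("spinning disk", 8.0),
                          ("hybrid (flash hit)", 1.0),
                          ("all-flash", 0.5)]:
    print(f"{media:20s} {latency_ms:4.1f} ms -> "
          f"up to {max_serial_ops_per_sec(latency_ms):,.0f} serial ops/sec")
```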

Reviewing storage capacity levels should be a focus for all organisations, as enterprise data is ever increasing, with IDC stating that “annual sales of storage capacity will grow by more than 30 percent every year between 2013 and 2017”. Traditional storage incumbents do a very good job in this area, as do hybrid architectures. However, all-flash storage solutions can struggle, as most are not able to provide the high capacities required for datacentres, and if they can, they fail the risk mitigation test because components can usually only be sourced from one vendor.
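
For a sense of what that growth rate implies, the sketch below compounds a 30 percent annual increase over 2013-2017. The 100 TB starting point is an assumed example figure, not a number from the article or from IDC.

```python
# Compound the quoted >30% annual capacity growth over 2013-2017.
def project_capacity(start_tb: float, annual_growth: float, years: int) -> list[float]:
    """Return projected capacity for each year, compounding annually."""
    capacities = [start_tb]
    for _ in range(years):
        capacities.append(capacities[-1] * (1 + annual_growth))
    return capacities

for year, tb in zip(range(2013, 2018), project_capacity(100.0, 0.30, 4)):
    print(f"{year}: {tb:,.0f} TB")
# 100 TB in 2013 grows to roughly 286 TB by 2017 at 30% a year
```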

Another aspect to consider is rich storage functionality, which is largely a software play. This covers several important aspects, such as access methodologies (not getting boxed into a single protocol), data protection (RAID, local snapshots, remote replication and DR support), high availability, and integration with hypervisors and host operating systems. It is important to note here that all the performance and capacity in the world is useless unless an organisation can ensure its databases and other applications are well protected locally and remotely, and can withstand disasters.

Something else to consider is cost. There are a number of budget influences: everything from acquisition cost to total cost of ownership will have an impact, not to mention the budget limitations of individual organisations. An ideal formula will consider cost per IOPS, cost per TB and ongoing management/support costs. Hybrid architectures often offer the best value here; however, for organisations running all-flash systems, data reduction technologies such as deduplication and compression can significantly reduce the cost.
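
As an illustration of that formula, the sketch below computes cost per TB and cost per IOPS for two hypothetical systems. All prices, capacities, IOPS figures and the 4:1 data reduction ratio are assumed example values, not figures from the article.

```python
# Cost per TB (after data reduction) and cost per IOPS for two example systems.
def cost_metrics(price: float, usable_tb: float, iops: int,
                 data_reduction_ratio: float = 1.0) -> dict:
    effective_tb = usable_tb * data_reduction_ratio
    return {"cost_per_tb": price / effective_tb,
            "cost_per_iops": price / iops}

hybrid    = cost_metrics(price=150_000, usable_tb=100, iops=100_000)
all_flash = cost_metrics(price=300_000, usable_tb=50, iops=300_000,
                         data_reduction_ratio=4.0)  # dedupe + compression

print("hybrid:   ", hybrid)     # e.g. $1,500/TB and $1.50/IOPS
print("all-flash:", all_flash)  # data reduction brings $/TB in line; $/IOPS is lower
```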

Finally, one often ignored but highly critical aspect is the quality of the flash used within all-flash or hybrid storage systems. Multi-Level Cell (MLC) NAND, the most popular flash media used in all-flash and hybrid storage solutions, predominantly comes in two flavours: enterprise-grade (eMLC) and consumer-grade (cMLC). The major difference between the two is the longevity of the flash memory cells, defined by the number of times they can be erased and re-written (programme-erase, or PE, cycles). While cMLC is rated at about 3,000-5,000 PE cycles, the higher-quality eMLC flash is rated at approximately 30,000-35,000 PE cycles. It is important to be aware of the quality of flash in a storage array, as it can greatly impact the lifespan of the array and the reliability of data.
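
Those PE-cycle ratings translate directly into drive lifespan under a given write load. The sketch below is a rough estimate only; the drive capacity, daily write volume and write amplification factor are assumed example values.

```python
# Rough flash endurance estimate from the PE-cycle ratings quoted above.
def flash_lifespan_years(capacity_tb: float,
                         pe_cycles: int,
                         daily_writes_tb: float,
                         write_amplification: float = 2.0) -> float:
    """Estimate years until the flash's PE-cycle budget is exhausted."""
    total_writes_tb = capacity_tb * pe_cycles            # lifetime write budget
    effective_daily_tb = daily_writes_tb * write_amplification
    return total_writes_tb / effective_daily_tb / 365

drive_tb, daily_tb = 1.0, 2.0  # 1 TB of flash, 2 TB written per day
print(f"cMLC (~3,000 PE):  {flash_lifespan_years(drive_tb, 3_000, daily_tb):.1f} years")
print(f"eMLC (~30,000 PE): {flash_lifespan_years(drive_tb, 30_000, daily_tb):.1f} years")
# roughly 2 years for cMLC versus about 20 years for eMLC under the same write load
```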

The criteria discussed should not be new to any organisation, and we anticipate they will remain the same for the foreseeable future. Flash is a revolutionary medium, but before organisations get swept up in the tide it is important to take a balanced approach, keeping these criteria in mind. In doing so, organisations should end up with a storage solution that best addresses their individual business needs.

Ideally, organisations will end up with a seamless architecture that leverages flash both as a standalone medium (all-flash) and as an accelerator for other media (hybrid), resulting in the best combination of cost per IOPS, cost per TB and latency characteristics. Organisations can choose to ignore one or more of these criteria for a narrow application; however, for storage across the entire datacentre, it is essential to choose a platform that gives you the best of all worlds.

There will always be faster and larger media, and equally voracious applications that will consume them at higher speeds and in bigger capacities. Looking forward, the number of storage options will only increase as costs fall and technology advances. Therefore, a large part of storage evaluation should be looking for a solution that can balance the tension between capacity and performance – the concept of right-sizing flash solutions has never been more relevant. It is essential that organisations review their storage selection criteria and choose the most relevant solution, one that not only suits the business now but also scales easily up and down as the organisation evolves.
