Do you copy?
Technology Focus | 01-2016
Key Topics: Storage, Virtualisation, Infrastructure, Big Data, Backup, Disaster Recovery | Key Companies: Actifio
Copy data virtualisation can help to free an organisation's data from its legacy physical infrastructure, suggests Ash Ashutosh, CEO, Actifio.

When the words 'Big Data' are used, there is much discussion about how to use, manage and store data as a strategic advantage for companies. What is often forgotten is that most organisations do not need the specialised Big Data applications promoted under this hype. What is useful, and in many cases a prerequisite for the efficient use and analysis of any company's data, is virtualising that data across the enterprise. The idea rests on the same concept as server and network virtualisation, both of which have already contributed significantly to business efficiency. By taking the essential step of virtualising their data, businesses are well equipped to handle the petabyte-scale loads that Big Data can be expected to bring.
UNDERSTANDING BIG DATA BETTER

Big Data's popularity is such that many see it as the Holy Grail for businesses today: it promises to help organisations understand what their customers want and target them to drive profitable sales and growth. The trend has the potential to revolutionise the IT industry by offering businesses insight into previously ignored and underused data.
GOING GLOBAL

This trend has stimulated an intense debate about how Big Data can help organisations improve customer targeting and drive revenue. Amid the excitement, Big Data is often over-hyped and discussed in a context that overlooks the fact that data is meaningless without intelligent insight. The challenge for users is to work toward a successful outcome without falling for the hype. Insight is important: even though organisations now have access to vast amounts of information, they still need to understand and draw conclusions from complex and unwieldy data. Many fall into the trap of believing that a correlation between data sets is all that is needed, when correlation alone does not establish cause and effect.
CAUSE AND EFFECT

Above all else, Big Data is about storing, processing and analysing data that was previously discarded because it was too expensive to handle with traditional database technologies. That includes existing sources such as web, network and server log data, as well as new sources such as sensor and other machine-generated data, and social media data. For IT professionals, the opportunity to lead the way in helping organisations store and manage data is key. IDC has estimated that 60% of what is stored in data centres is actually copy data: multiple copies of the same thing, or outdated versions. The vast majority of stored data consists of redundant copies of production data, created by disparate data protection and management tools for backup, disaster recovery, development, testing and analytics. While many IT experts focus on how to deal with the mountains of data produced by this intentional and unintentional copying, far fewer address the root cause of exponential copy data growth. In the same way that prevention is better than cure, curbing this weed-like data proliferation at the source should be a priority for businesses.
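The core idea behind copy data virtualisation can be sketched in a few lines of code: keep one physical "golden" instance of each unique payload, and let every logical copy (a backup, a dev refresh, an analytics extract) be a cheap reference to it. The sketch below is purely illustrative, assuming a content-addressed in-memory store; the class and method names are hypothetical and do not describe Actifio's actual implementation.

```python
import hashlib

class CopyDataStore:
    """Content-addressed store: identical payloads are physically stored once."""

    def __init__(self):
        self._blocks = {}   # sha256 digest -> payload (stored exactly once)
        self._copies = {}   # logical copy name -> digest (cheap reference)

    def ingest(self, name, payload):
        digest = hashlib.sha256(payload).hexdigest()
        self._blocks.setdefault(digest, payload)  # physical write only if new
        self._copies[name] = digest               # logical copy is a pointer

    def logical_bytes(self):
        """Bytes the copies would occupy if each were stored in full."""
        return sum(len(self._blocks[d]) for d in self._copies.values())

    def physical_bytes(self):
        """Bytes actually stored after deduplicating identical copies."""
        return sum(len(p) for p in self._blocks.values())


store = CopyDataStore()
production = b"customer database, 2016-01 snapshot" * 1000
store.ingest("production", production)
store.ingest("backup-daily", production)   # redundant copy: no new storage
store.ingest("dev-refresh", production)    # redundant copy: no new storage
store.ingest("analytics", production)      # redundant copy: no new storage

saved = 1 - store.physical_bytes() / store.logical_bytes()
print(f"Physical footprint is {saved:.0%} smaller than the logical one")
# → Physical footprint is 75% smaller than the logical one
```

Four logical copies of the same production snapshot cost only one copy's worth of physical storage, which is the effect the article describes: the redundancy created by backup, DR, dev/test and analytics tooling is removed at the source rather than cleaned up afterwards.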