Tuesday, 15 March 2016

Hadoop, Greenplum & GPFS Big Data Environments Enabled by Commvault Software


Commvault announced technology enhancements to its market-leading Commvault integrated solutions portfolio designed to help enterprises better support and manage "big data" initiatives. Rolled out as part of the newest wave of innovation in the company's eleventh software release, Commvault's new technology helps bring best-practice policy and data management to projects built on big data environments such as Hadoop, Greenplum and IBM's General Parallel File System (GPFS).

According to a Gartner survey published in late 2015, more than three-quarters of companies are investing, or plan to invest, in big data initiatives in the next two years. Given the need for actionable, intelligent insight into data sets and file systems, organizations must scale and store information at unprecedented levels. Big data initiatives leverage new approaches and technologies to store, index and analyze huge data sets while minimizing storage requirements and driving faster outcomes. However, as companies begin these initiatives, they often forgo applying data protection and disaster recovery routines to large data sets that sit outside their traditional systems and infrastructure, owing to complexity, performance and cost concerns.

The new innovations in Commvault Software and the Commvault Data Platform directly address these emerging customer requirements for managing big data environments. Specifically, enhanced visibility into leading big data tools, including Hadoop, Greenplum and GPFS, helps customers map their big data implementations and architectures, providing insight into how these environments can be protected and recovered, either in whole or across selected nodes, components and data sets. Using Commvault Software to manage these environments, customers can now better understand the exact environment layout to drive performance, reduce complexity and control costs.
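Commvault has not published the implementation details behind this node- and data-set-selective protection. Purely as an illustrative sketch of the underlying idea, the Python fragment below uses Hadoop's standard HDFS snapshot commands (hdfs dfsadmin -allowSnapshot and hdfs dfs -createSnapshot) to capture point-in-time copies of selected directories. The path list and naming scheme are hypothetical; this is a generic HDFS example, not Commvault's mechanism.

    import subprocess
    from datetime import datetime, timezone

    # Hypothetical set of HDFS directories selected for protection; in a real
    # deployment this would come from a backup policy rather than a constant.
    PROTECTED_PATHS = ["/warehouse/events", "/warehouse/users"]

    def snapshot_selected_paths(paths):
        """Capture a point-in-time HDFS snapshot of each selected directory."""
        stamp = datetime.now(timezone.utc).strftime("backup-%Y%m%dT%H%M%SZ")
        for path in paths:
            # Enable snapshots on the directory; harmless if already enabled,
            # so an error here is not treated as fatal.
            subprocess.run(["hdfs", "dfsadmin", "-allowSnapshot", path],
                           check=False)
            # Create the named snapshot; a failure here aborts the run.
            subprocess.run(["hdfs", "dfs", "-createSnapshot", path, stamp],
                           check=True)
            print(f"snapshotted {path} as {stamp}")

    if __name__ == "__main__":
        snapshot_selected_paths(PROTECTED_PATHS)

Snapshots of this kind capture a consistent view of each selected directory without copying the data, which is what makes protecting only chosen nodes or data sets practical at big data scale.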

Commvault Software gives companies an intelligent approach to protecting the complex infrastructure behind big data initiatives, with the ability to automate disaster recovery for these multi-node systems. The Commvault Data Platform further enhances the value of big data initiatives by extending seamless data portability to and from one of the industry's broadest ranges of infrastructure options: cloud, on-premises, virtualized, traditional and converged.
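As a hedged illustration of what such portability can look like at the file-system level (again a generic Hadoop example, not Commvault's own API), DistCp, Hadoop's built-in distributed copy tool, can replicate a snapshot off-cluster to S3-compatible cloud storage through the s3a:// connector. The bucket name and snapshot path below are hypothetical, and the cluster is assumed to have the connector configured with credentials.

    import subprocess

    # Hypothetical source snapshot and destination bucket; the s3a:// connector
    # ships with standard Hadoop but must be configured with credentials.
    SOURCE = "hdfs:///warehouse/events/.snapshot/backup-20160315T000000Z"
    DEST = "s3a://example-dr-bucket/events"

    def copy_to_cloud(source, dest):
        """Replicate a dataset off-cluster with Hadoop's DistCp.

        DistCp runs as a distributed MapReduce job, so large data sets are
        copied in parallel across worker nodes rather than through one host.
        """
        # -update copies only files that are missing or changed at the target.
        subprocess.run(["hadoop", "distcp", "-update", source, dest],
                       check=True)

    if __name__ == "__main__":
        copy_to_cloud(SOURCE, DEST)

Copying from a snapshot rather than the live directory keeps the transfer consistent even while the cluster continues ingesting data.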

"Companies of all sizes in all industries are quickly ramping up 'big data' initiatives, investing significantly to gain business insight from exploding data volumes, yet they are often moving forward without applying sound data management and disaster recovery disciplines for such strategic projects," said Don Foster, Senior Director of Solution Management at Commvault.  "In many cases the exponential growth of these big data infrastructures outpaces the ability for these solutions to self-manage and self-heal as they were designed. Now, for the first time, customers can leverage the full power of the Commvault Software portfolio to apply best-practices for data management to Hadoop, Greenplum and GPFS environments. These innovations provide important new benefits to Commvault customers, and open up significant opportunities for Commvault as the big data market continues to grow."

 "Over the past several years, we've been watching the growth of big data and seeing how organizations are adopting new technologies to manage the influx of information," said Phil Goodwin, Research Director at IDC. "The newest release of Commvault's open data platform leverages the company's history of data management capabilities to help provide customers with data intelligence and insight designed to make cloud deployments as seamless and cost effective as possible."

These latest innovations in the Commvault integrated solutions portfolio are available immediately.