The Big Data Approach to Business Continuity

November 9

Just a few years ago, big data was one of those nauseating buzzwords no one could stop talking about. The initial buzz has fizzled in favor of other gimmicky trends, but the phenomenon itself continues to grow at an incredible rate.

Gartner projects that worldwide revenue for business intelligence and analytics, the software foundation of big data, is on pace to reach $18.3 billion in 2017 and climb to $22.8 billion by the end of 2020. The firm highlights a number of likely growth triggers, but the biggest is organizations' growing need to tame the massive amounts of information they manage.

But there's much more to big data than meets the eye. Step back and it becomes clear that, with the right tools, the right information can be transformed into powerful, actionable insights. That well of largely untapped potential has government, healthcare, and a host of other industries investing heavily in an effort to put their data to good use. Yet while big data presents a vast opportunity, it also brings significant challenges, particularly where disaster recovery and business continuity are concerned.

Limited Big Data Protection

Platforms like Hadoop support big data applications by leveraging the same distributed computing model that powers the cloud. Instead of relying on a single server to store, process, and manage large data sets, the load is spread across a cluster of many machines. These platforms are designed to make data management as fast and efficient as possible. What role does data protection play in the process? One that is limited at best.
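
To make that distribution concrete, here is a minimal Java sketch using the HDFS client API to list which DataNodes hold each block of a file. It assumes a reachable cluster configured via core-site.xml, and the file path is hypothetical:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class BlockLocations {
    public static void main(String[] args) throws Exception {
        // The cluster address normally comes from core-site.xml (fs.defaultFS).
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path file = new Path("/data/events/part-00000"); // hypothetical file
        FileStatus status = fs.getFileStatus(file);

        // Each block of the file lives on several DataNodes, not on one server.
        for (BlockLocation block : fs.getFileBlockLocations(status, 0, status.getLen())) {
            System.out.printf("offset %d, length %d, hosts: %s%n",
                    block.getOffset(), block.getLength(),
                    String.join(", ", block.getHosts()));
        }
    }
}
```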

Hadoop, in particular, includes a number of features that aim to safeguard business data. It offers encryption to protect data at rest and in transit across channels such as the Data Transfer Protocol (DTP) and Remote Procedure Call (RPC). Another notable component is the NameNode, which maintains the metadata for every file in the Hadoop Distributed File System (HDFS) and tracks the state of the system. There is also built-in replication, with HDFS storing multiple copies of each data block (three by default) to protect against data loss and corruption. While incredibly useful, these features can only contribute so much to business continuity.
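
Replication and wire encryption are ordinary cluster settings, so a few lines of Java can inspect or adjust them. In this sketch the fallback values mirror Hadoop's documented defaults, and the file path is a made-up example:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ProtectionSettings {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Block replication and in-transit encryption are hdfs-site.xml settings.
        System.out.println("dfs.replication = " + conf.get("dfs.replication", "3"));
        System.out.println("dfs.encrypt.data.transfer = "
                + conf.get("dfs.encrypt.data.transfer", "false"));
        System.out.println("hadoop.rpc.protection = "
                + conf.get("hadoop.rpc.protection", "authentication"));

        // A critical dataset can be replicated more aggressively than the default.
        FileSystem fs = FileSystem.get(conf);
        fs.setReplication(new Path("/data/critical/ledger.csv"), (short) 5); // hypothetical path
    }
}
```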

For maximum protection, all Hadoop data should be backed up daily to a secure offsite location. That goes for business data as well as the NameNode metadata, which may be outdated or fail outright during a recovery attempt. By coupling the onboard protection features of your big data platform with a reliable third-party backup solution, you gain the peace of mind that comes from knowing you can recover from severe weather, cyberattacks, or other unforeseen disasters.
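
What might such a routine look like? The sketch below shells out to two standard Hadoop commands: distcp to copy business data to a second cluster, and dfsadmin -fetchImage to capture the NameNode's latest metadata image. The cluster URIs and backup paths are assumptions; a real deployment would schedule this nightly and add monitoring:

```java
import java.io.IOException;

public class NightlyHadoopBackup {
    // Run a command and fail loudly if it does not exit cleanly.
    private static void run(String... cmd) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        if (p.waitFor() != 0) {
            throw new IOException("command failed: " + String.join(" ", cmd));
        }
    }

    public static void main(String[] args) throws Exception {
        // Copy business data to an offsite cluster (hypothetical URIs).
        run("hadoop", "distcp", "-update",
                "hdfs://prod-cluster/data", "hdfs://dr-cluster/backups/data");

        // Pull the latest NameNode fsimage so metadata can be restored as well.
        run("hdfs", "dfsadmin", "-fetchImage", "/backups/namenode");
    }
}
```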

The Need for Bigger and Better Backups

The sheer size of a big data deployment can create major headaches for IT managers. Point a conventional backup tool at this environment and you will likely get one of two outcomes: inconsistent, unreliable performance or outright failure. Traditional backup solutions are typically built for scale-up databases such as MySQL and other specific applications, so they struggle when introduced to distributed databases that routinely handle petabytes of data. If all you knew were SQL queries and you were suddenly dropped into the NoSQL jungle without the proper tools or training, you would be just as lost.
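
To see why the skills and tooling don't transfer, compare the same lookup in both worlds. Everything here, the shop database, credentials, and connection strings, is hypothetical, and the snippet assumes the MySQL JDBC driver and MongoDB sync driver are on the classpath:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import org.bson.Document;
import static com.mongodb.client.model.Filters.eq;

public class SameQuestionTwoWorlds {
    public static void main(String[] args) throws Exception {
        // Scale-up world: one server, one connection, SQL.
        try (Connection db = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/shop", "user", "secret");
             PreparedStatement stmt = db.prepareStatement(
                     "SELECT total FROM orders WHERE status = ?")) {
            stmt.setString(1, "open");
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) System.out.println(rs.getLong("total"));
            }
        }

        // Distributed world: same question, different tools, data spread across nodes.
        try (MongoClient mongo = MongoClients.create("mongodb://localhost:27017")) {
            for (Document order : mongo.getDatabase("shop")
                    .getCollection("orders").find(eq("status", "open"))) {
                System.out.println(order.get("total"));
            }
        }
    }
}
```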

Once you accept that you have a big data problem on your hands, it's time to come up with a bigger, better backup strategy to protect it. This kind of project calls for looking beyond price and features: you need a solution that scales with rapidly growing data storage needs. Vendors of traditional solutions are steadily tweaking their technology to better support the big data revolution. It may not happen overnight, but if the backup market's eventual adaptation to virtualization and cloud computing is any indication, the stars will align soon enough.

Power to the People

When it comes to big data, solving the backup challenge is just one critical element of ensuring business continuity. People are equally important to the cause, given the number of requirements that must be met. The objectives of IT, BC managers, and security experts must continually align for the betterment of data management, storage, and protection. Whether you handle everything in-house or outsource to managed services, having the right people in the right places is essential to developing a business continuity strategy that makes the most of big data.