How to Ensure Your Big Data Initiatives Pay Off

10/17/2013
Big data has gone mainstream. Everywhere you look, people are talking about how to extract value from the massive and ever-growing troves of information residing within and beyond traditional data repositories. When fully leveraged, that information has the potential to vastly improve decision-making and transform businesses.
 
Not too long ago, big data was relegated to discrete projects that IT spun up on an as-needed basis. But today, if you’re not at least considering applying big data practices to your daily operations — and developing long-term plans to reliably support those initiatives — you’re at a profound disadvantage.
 
Why are big data projects so essential? Because they can impact nearly every aspect of your business, from operational efficiencies and product development to customer interactions and management processes.
 
For instance, imagine if you could identify patterns that enable you to predict, and ultimately avoid, manufacturing line problems. Or if you could increase sales through a recommendation engine for online purchases that factors in everything from purchase history to demographic patterns in real time, and instantly presents tailored product options and pricing. Or if, as a gaming retailer, you could make real-time decisions on what happens next by continuously profiling players and discerning trends, and as a result deliver more compelling adventures.
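To make the recommendation-engine example concrete, here is a minimal sketch in Python of how such scoring might work. Every name in it, the fields, the weights and the affinity signal, is an illustrative assumption rather than a description of any particular product.

# Illustrative sketch only: rank catalog items for one shopper by blending
# purchase history with a demographic affinity signal. All field names and
# weights are assumptions made for this example.
from collections import Counter

def score_products(catalog, purchase_history, demographic_affinity,
                   history_weight=0.7, demo_weight=0.3):
    # How often has this shopper bought each product category?
    category_counts = Counter(p["category"] for p in purchase_history)
    total = sum(category_counts.values()) or 1
    scored = []
    for item in catalog:
        history_score = category_counts[item["category"]] / total
        demo_score = demographic_affinity.get(item["category"], 0.0)
        scored.append((history_weight * history_score + demo_weight * demo_score, item))
    return [item for _, item in sorted(scored, key=lambda pair: pair[0], reverse=True)]

# A shopper who mostly buys games, in a segment that skews toward consoles:
catalog = [{"sku": "A1", "category": "games"},
           {"sku": "B2", "category": "consoles"},
           {"sku": "C3", "category": "apparel"}]
history = [{"category": "games"}, {"category": "games"}, {"category": "apparel"}]
affinity = {"consoles": 0.8, "games": 0.5}
print([item["sku"] for item in score_products(catalog, history, affinity)])

In a production system the same idea would run against streaming purchase events and far richer features, but the core step, scoring each candidate item against what is known about the shopper, stays the same.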
 
All of this is eminently doable now. But in order to make it happen, you need to treat big data initiatives as mission-critical. This means ensuring any big data solution you deploy is enterprise-grade, just like your other core business systems.
 
Industry experts define enterprise-grade as requiring a fast network, the ability to scale easily, built-in high availability and multi-level security:
  • Network Connectivity
Big data includes huge quantities of structured (e.g., relational databases), semi-structured (e.g., XML files) and unstructured (e.g., basic text files) data from both internal and external sources. To handle it all effectively and enable ultra-fast decision-making, you need high-bandwidth network connectivity.
  • Scalability
As your business experiences success with early big data initiatives, you will undoubtedly identify more data types and sources to funnel into the mix and more use cases for leveraging it. That data will grow at a surprisingly fast rate.
 
Consider the global universe of big data. IDC predicts it will be 44 times larger in 2020 than it was in 2009, expanding from roughly 0.8 zettabytes to 35 zettabytes a year.1 Even if your data expands at a fraction of this rate, it will quickly push the limits of your infrastructure unless you have a solution that can affordably and reliably scale at the speed of business (see the quick calculation after this list).
  • High Availability
When you depend on big data for strategic operations, you need to make sure it is always available. Otherwise, you risk losing revenue, customers and competitive advantage. Think about the catastrophic consequences of a retailer’s data-driven recommendation engine going down the week before Christmas, and you can begin to see that the more you ingrain big data practices, the more critical it is to ensure the highest possible levels of availability. 
  • Security
With so much of your business depending on big data, putting the right security measures in place is essential. The implications of ineffective security are far-reaching, from regulatory and compliance exposure to privacy concerns. To ensure all data at rest is encrypted and cannot be accessed by unauthorized users, you must implement strict security at the data center, network and application levels.
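To put the scalability figures above in perspective, here is a quick back-of-the-envelope calculation, in Python and purely illustrative, of the annual growth rate the IDC projection implies:

# Rough sketch: implied compound annual growth rate of the digital universe,
# using the IDC figures cited above (roughly 0.8 ZB in 2009 to 35 ZB in 2020).
start_zb, end_zb, years = 0.8, 35.0, 11
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied growth: about {cagr:.0%} per year")  # roughly 41% per year

Even at a quarter of that rate, a data set roughly doubles every seven years, which is why capacity planning for big data has to assume growth rather than treat it as an exception.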
 
Once you’ve realized that harnessing data is critical to your business, and that an enterprise-grade solution is key to successful big data practices, the real challenge becomes how to deploy and manage such a solution. As your big data operations scale, the infrastructure and overhead needed to support them can be substantial. And, as many businesses are starting to realize, in addition to finding capital budget, finding big data experts to run an in-house solution is far from easy. According to McKinsey & Co., demand for big data and analytics professionals in the U.S. alone will exceed the available supply by 140,000 to 190,000 positions by 2018.2 Even if you’re lucky enough to locate those experts, you may not be able to afford to hire them.
 
The reality is that most businesses don’t have the infrastructure, expertise, budget or desire to implement, effectively scale and manage an enterprise-grade big data environment. Instead, they find that partnering with companies like Savvis, which offer fully hosted and managed big data in a services model, is the best way to go. Such a partnership provides a highly reliable, scalable infrastructure running leading big data software, along with the expertise to guide you in developing big data strategies, and it frees you to focus on applying the outcomes of your big data initiatives to the business so you can realize their significant benefits.

 
1 The Digital Universe in 2020: Big Data, Bigger Digital Shadows, and Biggest Growth in the Far East, IDC, December 2012
2 Big Data Analysts in Big Demand, McKinsey & Co.

 

ABOUT THE AUTHOR
Milan Vaclavik is Sr. Director and Solution Lead for Savvis’ big data practice. For more than 20 years, he has been bringing innovative software solutions to market in a variety of industries, including enterprise messaging and collaboration, digital rights management, document automation, supply chain management and physical security. He has held senior product management, marketing and business development positions with startup software firms, as well as larger organizations such as Lotus Development/IBM, GE and LexisNexis. Milan holds a bachelor’s degree in Regional Science from the University of Pennsylvania and an MBA in Finance and Management of the Organization from Columbia Business School.