Big Data is mainstream. Over the past few years, the Big Data buzz has dissipated as companies shift from awareness to action. At O’Reilly’s Strata+Hadoop conference last September, I was excited to hear the triumphant claim that Big Data had gone mainstream, and ever since I have seen supporting evidence through our clients, partners, and publicized Big Data cases. While terms like ‘data lake’, ‘data exhaust’, and ‘dark data’ still muddy the waters, most companies are now beyond the hype and acting on the Big Data trend.
The purpose of Business Intelligence (BI) is to take often fast-moving operational data, clean it, enrich it, and model it for fast, logical, and performant analysis. When implemented correctly, BI can help decision makers get to the right data at the right time. Too often, though, we see complex systems with convoluted support processes. Days, weeks, or even months can pass for seemingly small changes, costing the organization time and money. Meanwhile, complexity increases as companies look to introduce more data, additional domains, and external data sources. In this post, I explain what the emergence of Big Data means for traditional BI and highlight predictions for continued adoption.
Emergence of Big Data – Pointing out the Problem
At the same time as BI was beginning to gain traction in the enterprise world, Big Data technology was being explored by internet companies to operate at scale. Due to the unprecedented growth of customer interaction data, companies like Google and Yahoo were faced with key data challenges that forced adaptation. By scaling data out across commodity hardware instead of scaling hardware up, these companies were able to work through these storage challenges:
- Storing Data Costs Too Much – In a scale-up model, hardware and licensing costs grow steeply as more data is introduced. The typical response to these costs is to collect less data, or data at a different grain, but this can in turn limit competitive advantage.
- Data is Dynamic – Instead of the slowly changing data sources of the past, companies now have to work with dynamic sources, often external to the organization. Unstructured and semi-structured data types like video and documents are also commonplace, whereas databases have historically been built to store basic data types.
- Data Needs Change – Data users want to ask questions, change their assumptions, and adapt their hypotheses. Needs change, and traditional models are slow to adjust. Waiting for data to be cleansed, mapped, and modeled costs valuable time. Often, the data coming out of this process is less trusted due to the black box of business rules and transformations.
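The scale-out idea described above can be illustrated in miniature: instead of buying an ever-larger server, records are hash-partitioned across many commodity nodes, and capacity grows by adding nodes. This is a simplified sketch, not any particular vendor's implementation; the node names and record keys are hypothetical.

```python
import hashlib

NODES = ["node-a", "node-b", "node-c"]  # hypothetical commodity nodes

def assign_node(record_key: str, nodes=NODES) -> str:
    """Hash-partition a record key across nodes so data is spread
    evenly; scaling out means appending to the node list rather
    than upgrading a single machine."""
    digest = hashlib.md5(record_key.encode("utf-8")).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]

# Route a few hypothetical customer-interaction events to nodes.
events = ["user-1001", "user-1002", "user-1003", "user-1004"]
placement = {event: assign_node(event) for event in events}
```

Real systems layer replication and rebalancing (e.g., consistent hashing) on top of this basic idea, but the core trade-off is the same: commodity nodes in exchange for coordination.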
The emergence of Big Data is the result of innovators challenging these key problems. These changes are beginning to make their way from operations to analytics as companies look to leverage Big Data technologies to understand data across their organizational domains.
Predictions for Big Data Analytics
Business Intelligence is plagued by the same problems that forced Big Data adoption in operations. Today, BI is too big and clunky to meet the modern organization’s needs. While the concepts behind BI and data modeling for analysis will not disappear, BI systems must evolve to become nimble and enable Lean Analytics. The following predictions highlight the direction of the Big Data Analytics industry:
- The Big Data buzz will shift from ‘Operational’ to ‘Analytical’. The three V’s (Volume, Velocity, Variety) helped explain how Big Data technologies could open doors to operationalizing more, and new types of, data. The three V’s fall short for analytics, and more focus will be placed on how organizations can effectively analyze data across the many domains they are capturing. Being able to combine analysis across many sources and quickly absorb newly acquired sources is a huge opportunity in the space.
- Lean Analytics will become the standard. Data exploration tools are changing the market by including more people in the data-driven decision-making process. This shift will force nimbleness in the underlying data model, both in the acquisition of new sources and in the adjustment of the performant data structures that support analysis. The demand for agility will cause nimble architectures to emerge as best practice for enabling Lean Analytics. Governance around data use will become increasingly important to ensure accuracy and maintain consistency.
- Hybrid Big Data & BI solutions will enable exploration. The marketplace is filled with niche tools that must be pieced together to deliver a full solution. While some organizations will wait for emerging technologies to be sold as a packaged product with a recognizable name, others will look to leverage Big Data technology in an isolated area of their architecture. Lean Analytics provides a platform for organizations to ask more, and different types of, questions.
- Big Data Analytics will help transform digital offerings. There is unprecedented competitive advantage in transforming a business through the capture and use of data, and we are seeing companies leverage data in creative ways to do exactly that. More companies and offerings will emerge delivering Data as a Service, Analytics as a Service, and dashboarding. Data will increasingly be seen as an asset as ROI becomes apparent through these direct revenue channels.
Our Approach to Lean Big Data Analytics
West Monroe has been working to develop a methodology that solves for these problems and has labeled the methodology and resulting platform “RAP” – the “Rapid Analytics Platform”. RAP is named for its ability to rapidly adjust to changing business requirements and accommodate new sources. By minimizing the technical complexity of the data architecture, the RAP methodology enables Lean and Agile analytical data processing to meet ever-changing business needs. The RAP methodology utilizes Big Data technology to help our clients overcome common data problems:
- Combine tried-and-true architectural standards with cutting-edge Big Data technologies to reduce licensing costs and deliver the performance expected of traditional data modeling solutions. Enable large volumes through scale-out capabilities.
- Enable schema-on-read concepts to enable nimbleness in data ingestion.
- Shift data modeling to less support intensive ‘configuration’ rather than ‘customization’.
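The schema-on-read concept above can be sketched simply: raw records are ingested as-is, and a schema is applied only at query time, so new or missing fields do not require upfront remodeling. This is an illustrative sketch only; the field names and sample events are hypothetical, not part of RAP.

```python
import json

# Raw events are stored as-is at ingestion time (no upfront schema).
raw_events = [
    '{"user": "a1", "amount": "19.99", "channel": "web"}',
    '{"user": "b2", "amount": "5.00"}',  # missing fields are fine
]

def read_with_schema(lines, schema):
    """Apply a schema at read time: select fields, cast types, and
    default missing values instead of rejecting the record."""
    for line in lines:
        record = json.loads(line)
        yield {field: cast(record.get(field, default))
               for field, (cast, default) in schema.items()}

# The analyst's schema of the moment; change it without re-ingesting.
schema = {"user": (str, ""), "amount": (float, 0.0)}
rows = list(read_with_schema(raw_events, schema))
```

Contrast this with schema-on-write, where the second event would have been rejected or forced into a rigid table at load time; here the model adjusts as questions change.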
RAP pulls from data warehousing methodology and sprinkles in principles and technologies from the Big Data revolution that enable this Lean approach to data analytics. For more information, check out our whitepaper, RAP – Building a Lean Analytics Platform. While there are many other use cases for Big Data technologies, we have found this methodology leverages select areas to directly impact our clients’ day-to-day operations.
Does your organization utilize trending Big Data technologies or principles? We are interested in hearing how organizations are blending traditional analytical architectures with trending Big Data architectures – comment below.