Big Data Requires Integration Technology

November 12, 2014

The market for big data continues to grow as organizations try to extract business value from their own masses of data and other sources. Earlier this year I outlined the dynamics of the business opportunity for big data and information optimization.

We continue to see advances as big data and associated information technologies deliver more value, but the range of innovation has also fragmented existing systems, including databases managed on-premises or in cloud computing environments. In this changing environment organizations face new challenges, not only in adapting to technology that automates data processing more efficiently but also in integrating it into their enterprise architecture. I've already explained how big data can be ineffective without integration, and we conducted more in-depth research into the market, resulting in our benchmark research on big data integration, which reveals how organizations are adopting this technology in their processes.

The research shows that use of big data techniques has become widespread: almost half (48%) of all organizations participating in this research, and two-thirds of the very large ones, use it for storage, and 45 percent intend to adopt big data in the next year or sometime in the future. This is a significant change, given that most organizations have long used relational database management systems (RDBMSs) for nearly everything. We find that RDBMSs (76%) are still the most widely used big data technology, followed by flat files (61%) and data warehouse appliances (46%). But this is not the direction many companies plan to take: in-memory databases (46%), Hadoop (44%), specialized databases (43%) and NoSQL (42%) are the tools most often planned for use by 2016 or being evaluated. Clearly there is a revolution in approaches to storing and using data, and it introduces both opportunities and challenges.

Establishing a big data environment requires integrating data through proper preparation and potentially continuous updates, whether in real time or in batch processing. A further complication is that many organizations will have not one but several big data environments to fold into the overall enterprise architecture, which requires both data and systems integration. Our research finds that some organizations are aware of this issue: automating big data integration is very important to 45 percent and important to more than one-third. Automation can not only bring efficiency to big data but also remove many risks of errors and of inaccurate or inconsistent data.
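To make the idea of automated batch integration concrete, here is a minimal sketch in Python. All names (`extract`, `transform`, `load`, the in-memory sources) are hypothetical stand-ins for real connectors and stores; the point is the shape of an automated extract-transform-load pass, not any particular product's API.

```python
# A minimal sketch of one automated batch-integration pass.
# Sources are illustrative in-memory dicts standing in for real systems.

def extract(source):
    """Pull raw records from a source (here, just a list of dicts)."""
    return source["records"]

def transform(records):
    """Normalize fields and drop records missing the key, rather than load bad data."""
    cleaned = []
    for r in records:
        if r.get("id") is None:
            continue  # inconsistent record: skip it
        cleaned.append({"id": r["id"], "amount": float(r.get("amount", 0))})
    return cleaned

def load(target, records):
    """Append transformed records to the target store."""
    target.extend(records)

def run_batch(sources, target):
    """One automated pass over every source: extract, transform, load."""
    for source in sources:
        load(target, transform(extract(source)))
    return len(target)

SOURCES = [
    {"name": "crm", "records": [{"id": 1, "amount": "19.99"}, {"id": None}]},
    {"name": "web", "records": [{"id": 2, "amount": "5"}]},
]
warehouse = []
run_batch(SOURCES, warehouse)  # loads the 2 valid records, skips the 1 bad one
```

In a real deployment each function would be scheduled and monitored; automating exactly this loop is what removes the manual errors the research highlights.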

Data integration technologies have evolved over the past decade, but advances to support big data are more recent. Our research shows a disparity in how well organizations handle big data integration tasks. The tasks handled mostly or completely adequately are accessing (for 63%), loading (60%), extracting (59%), archiving (55%) and copying (52%) data, while the areas most in need of improvement are virtualizing (39%), profiling (37%), blending (34%), master data management (33%) and masking for privacy (33%). At the system level, the research finds that conventional enterprise capabilities are most often needed: load balancing (cited by 51%), cross-platform support (47%), a development and testing environment (42%), systems management (40%) and scalable execution of tasks (39%). To test the range of big data integration capabilities before they are applied to production projects, the "sandbox" has become the standard approach: for their development and testing environment, the largest percentage (36%) said they will use an internal sandbox with specialized big data technology. This group of findings reveals that big data integration has enterprise-level requirements that go beyond just loading data to build on advances in data integration.
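Two of the weakest areas the research identifies, profiling and masking for privacy, can be illustrated with a short sketch. The rows and field names below are hypothetical; the masking uses a salted one-way hash, one common approach among several.

```python
import hashlib
from collections import Counter

# Hypothetical rows standing in for a real source table.
rows = [
    {"email": "a@example.com", "country": "US", "age": 34},
    {"email": "b@example.com", "country": "US", "age": None},
    {"email": "c@example.com", "country": "DE", "age": 29},
]

def profile(rows, field):
    """Basic profile of one field: null count and distinct-value frequencies."""
    values = [r[field] for r in rows]
    return {
        "nulls": sum(v is None for v in values),
        "distinct": Counter(v for v in values if v is not None),
    }

def mask(value, salt="demo-salt"):
    """One-way mask for a sensitive field: irreversible but deterministic,
    so masked values still join consistently across data sets."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

country_profile = profile(rows, "country")
masked = [dict(r, email=mask(r["email"])) for r in rows]
```

Profiling like this tells you what you are integrating before you load it; deterministic masking protects privacy without breaking downstream joins.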

Big data must not be a separate store of data but part of the overall enterprise and data architecture; only then can organizations ensure full integration and use of the data. Organizations that see data integration as critical to big data are embarking on sophisticated efforts to achieve it. The data integration capabilities most critical to their big data efforts are developing and managing metadata that can be shared across BI systems (cited by 58%), joining disparate data sources during transformation (56%) and establishing rules for processing and routing data (56%).
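Joining disparate sources during transformation, together with a routing rule for records that don't match, can be sketched as follows. The sources and field names are hypothetical.

```python
# Left-join order records to customer attributes on a shared key,
# with a processing rule that routes unmatched rows aside for review.

orders = [
    {"customer_id": 1, "total": 40.0},
    {"customer_id": 2, "total": 15.5},
    {"customer_id": 1, "total": 9.0},
    {"customer_id": 3, "total": 7.0},   # no matching customer record
]
customers = {1: {"name": "Acme"}, 2: {"name": "Globex"}}

def join_on_customer(orders, customers):
    """Enrich each order with customer attributes; orphans go to a review queue."""
    joined, unmatched = [], []
    for o in orders:
        c = customers.get(o["customer_id"])
        if c is None:
            unmatched.append(o)       # routing rule: don't load orphan rows
        else:
            joined.append({**o, **c})
    return joined, unmatched

joined, unmatched = join_on_customer(orders, customers)
```

The explicit `unmatched` queue is the kind of processing-and-routing rule the research respondents rank alongside the join itself.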

Other organizations are still examining how to automate integration tasks. The most common barriers to improving big data integration are the cost of software or licenses (for 44%), lack of resources to devote to improvement (37%) and the sense that big data technologies are too complicated to integrate (35%). These findings demonstrate that many organizations need to better understand the efficiency and cost savings that purpose-built technology can deliver compared with manual approaches using tools not designed for big data. Along with identifying solid business benefits, establishing savings of time and money is an essential piece of a convincing rationale for investment in big data integration technology. The most time spent on big data integration today goes to basic tasks: reviewing data for quality and consistency (52%), preparing data for integration (46%) and connecting to data sources for integration (39%). The first two concern ensuring that data is ready to load into big data environments. Data preparation is a key part of big data and overall information optimization, and more vendors are developing dedicated technology to help with it.
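Reviewing data for quality and consistency, the single most time-consuming task reported, amounts to applying rules before loading. A minimal sketch, with illustrative rules and field names:

```python
# Quality-and-consistency review before load: each field has a rule,
# rows failing any rule are rejected with the reasons recorded.

RULES = {
    "id":     lambda v: isinstance(v, int) and v > 0,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def check_row(row):
    """Return the list of fields that are missing or fail their rule."""
    return [f for f, ok in RULES.items() if f not in row or not ok(row[f])]

def review(rows):
    """Split rows into loadable records and rejects with failure reasons."""
    loadable, rejected = [], []
    for row in rows:
        failures = check_row(row)
        if failures:
            rejected.append((row, failures))
        else:
            loadable.append(row)
    return loadable, rejected

data = [{"id": 1, "amount": 10.0}, {"id": -3, "amount": 2.5}, {"id": 2}]
loadable, rejected = review(data)  # 1 loadable row, 2 rejected with reasons
```

Purpose-built tools encode exactly this kind of rule set declaratively; the savings come from not re-implementing and re-running it by hand for every source.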

For a process as complex as big data integration, choosing the right technology tool can be difficult. More than half (55%) of organizations are planning to change the way they assess and select such technology. Evaluations of big data integration tools should consider how they will be deployed and what sort of vendors can provide them. Almost half (46%) of organizations prefer to integrate big data on-premises, while 28 percent opt for cloud-based software as a service and 17 percent have no preference. Half of organizations plan to use cloud computing for managing big data; another one-third (32%) don't know whether they will. The research shows that the most important criteria for evaluating big data integration technology and vendors are usability (very important for 53%), reliability (52%) and functionality (49%), followed by manageability, TCO/ROI, adaptability and validation of vendors. Organizations most want technology that is easy to use and can scale to meet their needs.

Big data cannot be used effectively without integration; we observe that the big data industry has not paid as much attention to information management as it should, since that is what enables automating the flow of data. Organizations trying to use big data without a focus on information management will have difficulty optimizing the use of their data assets for business needs. Our research into big data integration finds that the proper technology is critical to meeting these needs. We also learned from our benchmark research into big data analytics that data preparation is the largest and most time-consuming set of tasks and must be streamlined for best use of the analytics that reveal actionable insights. Organizations that are initiating or expanding their big data deployments, whether on-premises or in cloud computing environments, should put integration at the top of their priority list to ensure they do not create silos of data that they cannot fully exploit.