The process of modernizing a company's IT applications and infrastructure and moving them to the cloud, known as "cloud transformation," continues to accelerate and will have a profound impact on businesses of every size and industry over the coming five to ten years. The trend is driven by the compelling benefits of shifting the burden of hosting and managing data and software applications to a third-party cloud service provider such as AWS, Microsoft, or Google. At a high level, these benefits include significant cost and operational savings; improved performance, reliability, scalability, and security; and a stronger foundation for becoming data-driven for the benefit of customers and operations.
For years, a comprehensive, holistic view of customers and business operations has been the "unicorn" of business data integration. Fortune 2000 companies have spent billions of dollars pursuing that magical state of frictionless data access and complete visibility into day-to-day operations, customer activities, impending problems, and employee productivity. Given the myriad software applications in place (none of which are designed to play nicely with each other), it is surprising that many enterprises have achieved at least a measure of success in pulling those feeds together.
Competition in business is stronger than ever. In pursuit of high levels of customer service and operational efficiency, companies are driven to purchase more and more software applications to support their efforts. While each new application usually makes sense in isolation (each solves a specific need), the approach runs into serious trouble when considered in the context of the company's overall business systems environment. As a company's applications proliferate, its application environment grows more complex and its data becomes more fragmented and dispersed.
Since the widespread adoption of computers in business more than fifty years ago, companies have needed to integrate data from different systems for reporting, analytics, and application development. As software systems advanced and proliferated, companies found themselves supporting a wide range of dissimilar hardware, software, and applications. Faced with demands from management, users, and regulatory authorities to report on or access this disparate data in an integrated way, IT departments often took the expeditious approach of writing custom code to stitch the required data sources together for each request. When a particular report or application proved popular, requests to add more data often followed, meaning still more time-consuming hand coding and an excessive drain on IT resources. Overwhelmed by these demands, IT departments often found it necessary to delay responding, creating a backlog and an ever more frustrated stakeholder base.
Integrating businesses during an acquisition or merger is hard; so hard, in fact, that many fail in the attempt. While there are many reasons for this (cultural, financial, political), many acquisitions fall short of both parties' expectations because the companies fail to properly plan for and execute the integration of their dissimilar technologies. Indeed, a McKinsey study found that while "more than half the synergies in a merger are strongly related to IT…fewer than 40% of companies realize these benefits due to a lack of planning during the due diligence phase of the process." One approach that can substantially help ensure planned IT benefits are realized is the introduction and use of Data Virtualization.
When a developer or team of developers builds a software application, an essential issue they grapple with is how the application will integrate the various data sources (databases or web services) it needs to operate. Data virtualization is an elegant technological approach that offers a powerful way to address this issue.
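The core idea can be sketched in miniature. The toy example below (all names and data are invented for illustration; a real data virtualization product operates over live databases and services, not in-memory stand-ins) shows a "virtual view" that joins two dissimilar sources behind one query interface, computed on demand rather than copied into a new store:

```python
# Illustrative sketch only: a toy "virtual view" federating two hypothetical
# sources, mimicking what a data virtualization layer does at scale.

# Source 1: a stand-in for a CRM database table of customers.
CRM_CUSTOMERS = {
    101: {"name": "Acme Corp", "region": "East"},
    102: {"name": "Globex", "region": "West"},
}

# Source 2: a stand-in for an order feed returned by a web service.
ORDER_FEED = [
    {"customer_id": 101, "total": 2500.0},
    {"customer_id": 101, "total": 1200.0},
    {"customer_id": 102, "total": 800.0},
]

def customer_order_view():
    """A virtual, joined view: assembled at query time, nothing staged or copied."""
    totals = {}
    for order in ORDER_FEED:
        cid = order["customer_id"]
        totals[cid] = totals.get(cid, 0.0) + order["total"]
    return [
        {"name": c["name"], "region": c["region"], "order_total": totals.get(cid, 0.0)}
        for cid, c in CRM_CUSTOMERS.items()
    ]

for row in customer_order_view():
    print(row)
```

The application queries `customer_order_view()` as if it were one source; where the underlying customer and order data actually live, and in what format, is hidden behind the view.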