Something that doesn’t get talked about enough in the service-orientation world is data virtualization: pulling data from various sources into an abstracted service layer rather than having services or applications tap live production databases directly. This helps cut down the need for physical storage and provides a common interface for all applications using the data, especially BI, analytics, and transaction systems.
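To make the abstraction concrete, here is a minimal sketch of the pattern in Python, with in-memory SQLite databases standing in for the production sources. The source names, tables, and fields (“crm”, “billing”, and so on) are hypothetical, invented purely for illustration; a real deployment would use a data virtualization platform rather than hand-rolled code.

```python
# Minimal sketch of a data virtualization layer: applications query one
# abstracted interface, which federates the request across source systems
# instead of hitting production databases directly.
# The source names ("crm", "billing") and their schemas are hypothetical.
import sqlite3
from typing import Iterator


class VirtualCustomerView:
    """Presents a single canonical 'customer' view over several sources."""

    def __init__(self, sources: dict[str, sqlite3.Connection]):
        self.sources = sources  # source name -> connection (or replica/cache)

    def customers(self) -> Iterator[dict]:
        # Each source keeps its own schema; the view maps both
        # into one canonical shape for every consuming application.
        for row in self.sources["crm"].execute(
            "SELECT cust_id, full_name FROM crm_customers"
        ):
            yield {"id": row[0], "name": row[1], "origin": "crm"}
        for row in self.sources["billing"].execute(
            "SELECT account_no, holder FROM accounts"
        ):
            yield {"id": row[0], "name": row[1], "origin": "billing"}


# Demo with in-memory stand-ins for the real source systems.
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE crm_customers (cust_id TEXT, full_name TEXT)")
crm.execute("INSERT INTO crm_customers VALUES ('C1', 'Ada Lovelace')")

billing = sqlite3.connect(":memory:")
billing.execute("CREATE TABLE accounts (account_no TEXT, holder TEXT)")
billing.execute("INSERT INTO accounts VALUES ('B7', 'Grace Hopper')")

view = VirtualCustomerView({"crm": crm, "billing": billing})
for customer in view.customers():
    print(customer)
```

The point is that consumers only ever see the shape the view exposes, never the individual source schemas.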
The whys and hows of data virtualization are explored by Judith Davis and Robert Eve in a new book, Going Beyond Traditional Data Integration to Achieve Business Agility. As with any service technology engagement, data virtualization involves a lot of players across the enterprise, so challenges tend to be more organizational and cultural than technical.
Davis and Eve outline six key best practices that anyone undertaking a data virtualization effort needs to consider:
1) Centralize responsibility for data virtualization. “The key benefit here is the ability to advance the effort quickly and to take on bigger concepts, such as defining common canonicals and implementing an intelligent storage component,” the authors say.
2) Agree on and implement a common data model. “This will ensure consistent, high-quality data, make business users more confident in the data, and make IT staff more agile and productive.” (A rough sketch of what such a canonical model might look like in code follows this list.)
3) Establish a governance approach. “This needs to include how to manage the data virtualization environment. Key issues are who is responsible for the shared infrastructure and for shared services.”
4) Educate the business side on the benefits of data virtualization. “Allocate time to consult with business users and make sure they understand the data,” Davis and Eve advise. “Establish an ongoing effort to make data virtualization acceptable to other areas of the organization.”
5) Pay attention to performance tuning and scalability. “Tune performance and test solution scalability early in the development process. Consider bringing in massively parallel processing capability to handle query performance on high-volume data. Accommodate the fact that users’ ad hoc analysis and reporting demands are unpredictable.”
6) Take a phased approach to implementing data virtualization. “First abstract the data sources, then layer the BI applications on top and gradually implement the more advanced federation capabilities of data virtualization.”
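On the second practice, a common data model is easiest to picture in code. The sketch below shows a hypothetical canonical customer record with mapping functions from two made-up source schemas; the field names and types are assumptions for illustration, not the authors’ model.

```python
# Minimal sketch of a shared canonical model, assuming hypothetical source
# schemas: each source system keeps its own field names, and a thin mapping
# layer translates them into the agreed-upon canonical record that every
# consuming application (BI, analytics, transactional) reads.
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class CanonicalCustomer:
    """The record shape all teams have agreed to consume."""
    customer_id: str
    name: str
    since: date


def from_crm(record: dict) -> CanonicalCustomer:
    # The (hypothetical) CRM stores 'cust_id', 'full_name', 'created' (ISO string).
    return CanonicalCustomer(
        customer_id=record["cust_id"],
        name=record["full_name"],
        since=date.fromisoformat(record["created"]),
    )


def from_billing(record: dict) -> CanonicalCustomer:
    # The (hypothetical) billing system stores 'account_no', 'holder', 'opened_on'.
    return CanonicalCustomer(
        customer_id=record["account_no"],
        name=record["holder"],
        since=record["opened_on"],
    )


# Both sources now yield identical, comparable records.
print(from_crm({"cust_id": "C1", "full_name": "Ada Lovelace", "created": "2011-04-01"}))
print(from_billing({"account_no": "B7", "holder": "Grace Hopper", "opened_on": date(2012, 6, 15)}))
```

Once every source is translated through a mapping like this, BI, analytics, and transaction systems all read the same agreed-upon shape, which is what gives business users confidence in the data.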