
Friday, April 30, 2010

Data Integration in a Nutshell: Four Essential Guidelines


Dashboard Insight published a post by Philip Russom, senior manager of TDWI Research, called Data Integration in a Nutshell: Four Essential Guidelines, in which he compiled a list of four points that keep coming up in conversations, interviews, and consulting about data integration (DI). He thinks of these points as guidelines in a nutshell that can shape how people fundamentally think of DI, as well as how they measure the quality, modernity, and maintainability of DI solutions. He hopes these nutshell guidelines can help DI specialists, and the people who work with them, see a more future-facing vision of what DI can and should be.

Guideline #1: Data integration is a family of techniques and best practices

The unfortunate knee-jerk reaction of many data warehouse professionals is that the term data integration is synonymous with ETL (extract, transform, and load) simply because ETL is the most common form of data integration found in data warehousing. However, there are other techniques (and best practices to go with them), including data federation, database replication, and data synchronization. Different techniques have different capabilities and prominent use cases, so it behooves a data integration specialist to know and apply them all.
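
To make the distinction concrete, here is a rough sketch in Python (my own illustration, not from the article; the connections follow a DB-API style and the table and column names are hypothetical) contrasting a batch ETL step with a federated query that leaves the data in its sources:

def etl_load_orders(source_db, warehouse_db):
    # ETL: extract from the source, transform in flight, load into the warehouse.
    rows = source_db.execute("SELECT order_id, amount, currency FROM orders")  # extract
    transformed = [(oid, amt, cur.strip().upper()) for oid, amt, cur in rows]  # transform
    warehouse_db.executemany(
        "INSERT INTO fact_orders (order_id, amount, currency) VALUES (?, ?, ?)",
        transformed)  # load

def federated_order_totals(crm_db, erp_db):
    # Data federation: query the sources on demand and merge the results
    # without copying the data into a separate store.
    names = dict(crm_db.execute("SELECT customer_id, name FROM customers"))
    totals = erp_db.execute("SELECT customer_id, SUM(total) FROM invoices GROUP BY customer_id")
    return [(names.get(cust, "unknown"), total) for cust, total in totals]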

Guideline #2: Data integration practices reach across both analytics and operations

In analytic DI, one or more DI techniques are applied in the context of business intelligence (BI) or data warehousing (DW). Operational DI applies DI techniques outside BI/DW, typically to migrate or consolidate operational databases, to synchronize operational databases, or to exchange data in a business-to-business context. Analytic DI and operational DI are both growing practice areas, and both are increasingly staffed from a common competency center or similar organization.

Guideline #3: Data integration is an autonomous data management practice

In some old-fashioned organizations, DI is considered a mere subset of DW. It can be that, but it can also be independent. For example, the existence of operational DI proves DI’s independence from DW. Furthermore, hundreds of DI competency centers have sprung up in the last ten years or so as shared-service organizations for staffing all DI work -- not just DI for DW.

Guideline #4: A data integration solution should have architecture

After all, other types of IT solutions have architecture. DI architecture helps you with DI development standards, the reuse of DI objects, and the maintenance of solutions. The preferred architecture among integration technologies -- whether for data or application integration -- is the hub-and-spoke. For this reason, most DI tools today lend themselves to hub-and-spoke. However, there are many variations of it, so you need to actively design an architecture for your DI solutions.
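
As a thought experiment (not any vendor's actual design), a hub-and-spoke flow can be sketched in a few lines of Python: every source maps its records into one canonical shape at the hub, and every target subscribes to the hub, so no point-to-point links between sources and targets are needed:

# Illustrative hub-and-spoke sketch; the class and names are hypothetical.
class Hub:
    def __init__(self):
        self.targets = []   # spokes that receive canonical records
        self.mappers = {}   # per-source rules into the canonical shape

    def register_source(self, name, mapper):
        self.mappers[name] = mapper

    def register_target(self, deliver):
        self.targets.append(deliver)

    def publish(self, source_name, record):
        canonical = self.mappers[source_name](record)  # map once, at the hub
        for deliver in self.targets:                   # fan out to every spoke
            deliver(canonical)

# Usage: two sources with different field names feed one warehouse loader.
hub = Hub()
hub.register_source("crm", lambda r: {"customer_id": r["id"], "name": r["full_name"]})
hub.register_source("erp", lambda r: {"customer_id": r["cust_no"], "name": r["cust_name"]})
hub.register_target(lambda rec: print("load into warehouse:", rec))
hub.publish("crm", {"id": 1, "full_name": "Acme Corp"})
hub.publish("erp", {"cust_no": 2, "cust_name": "Globex"})

In a structure like this, adding a new source or target means registering one more spoke rather than wiring it to every existing system, which is why the pattern supports reuse and maintenance.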

Friday, August 14, 2009

Collaborative Data Integration


What Works is a publication with interesting content, provided by The Data Warehousing Institute (TDWI). In Volume 27 (August 2009), Philip Russom, senior manager of TDWI Research, published a good article entitled Collaborative Data Integration.

He started the article with TDWI Research's definition of collaborative data integration: a collection of user best practices, software tool functions, and cross-functional project workflows that foster collaboration among the growing number of technical and business people involved in data integration projects and initiatives.

According to the article, several trends are driving up the requirements for collaboration in data integration projects:
- Data integration specialists are growing in number
- Data integration specialists are expanding their work beyond data warehousing
- Data integration work is increasingly dispersed geographically
- Data integration is now better coordinated with other data management disciplines
- More business people are getting their hands on data integration
- Data governance and other forms of oversight touch data integration

Different organizational units provide a structure in which data integration can be collaborative:
- Technology-focused organizational structures
- Business-driven organizational structures
- Hybrid structures

"Corporations and other user organizations have hired more inhouse data integration specialists in response to an increase in the amount of data warehousing work and operational data integration work outside of warehousing", he wrote.

"Although much of the collaboration around data integration consists of verbal communication, software tools for data integration include functions that automate some aspects of collaboration", he also wrote. Some of these features have long existed in other application development tools but were only recently added to data integration tools, like check-out and check-in, versioning, and other source code management features.

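To illustrate what check-out/check-in and versioning mean for a DI artifact such as an ETL mapping, here is a small hypothetical sketch in Python (not the API of any real DI tool): one developer locks an artifact while editing it, and every check-in is kept as a new version:

class ArtifactRepository:
    def __init__(self):
        self.versions = {}     # artifact name -> list of saved versions
        self.checked_out = {}  # artifact name -> user holding the lock

    def check_out(self, name, user):
        if self.checked_out.get(name):
            raise RuntimeError(f"{name} is already checked out by {self.checked_out[name]}")
        self.checked_out[name] = user
        return self.versions.get(name, [None])[-1]   # latest saved version, if any

    def check_in(self, name, user, content, comment=""):
        if self.checked_out.get(name) != user:
            raise RuntimeError(f"{user} does not hold the lock on {name}")
        self.versions.setdefault(name, []).append(
            {"content": content, "comment": comment, "by": user})
        del self.checked_out[name]

# Usage: alice edits a mapping; until she checks it back in, nobody else can.
repo = ArtifactRepository()
repo.check_out("orders_mapping", "alice")
repo.check_in("orders_mapping", "alice", "source: orders -> target: fact_orders", "initial load logic")
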
On data integration tool requirements for business collaboration, he wrote: "a few data integration and data quality tools today support areas within the tools for data stewards or business personnel to use. In such an area, the user may actively do some hands-on work, like select data structures that need quality or integration attention, design a rudimentary data flow (which a technical worker will flesh out later), or annotate development artifacts (e.g., with descriptions of what the data represents to the business)".

He explained that collaboration via a tool depends on a central repository: The views just described are enabled by a repository that accompanies the data integration tool. Depending on the tool brand, the repository may be a dedicated metadata or source code repository that has been extended to manage much more than metadata and development artifacts, or it may be a general database management system.
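
A minimal sketch of such a shared repository, using an in-memory SQLite database as a stand-in for the server-based store (the tables and field names here are my own assumptions, not a specific product's schema), might keep development artifacts and business annotations side by side:

import sqlite3

conn = sqlite3.connect(":memory:")   # stands in for a shared, server-based repository
conn.executescript("""
    CREATE TABLE artifacts   (name TEXT PRIMARY KEY, definition TEXT, owner TEXT);
    CREATE TABLE annotations (artifact TEXT, author TEXT, note TEXT);
""")

# A developer registers a mapping; a data steward annotates what the data means.
conn.execute("INSERT INTO artifacts VALUES (?, ?, ?)",
             ("orders_mapping", "source: ERP.orders -> target: fact_orders", "alice"))
conn.execute("INSERT INTO annotations VALUES (?, ?, ?)",
             ("orders_mapping", "bob", "Gross revenue in EUR, excluding cancelled orders"))

# Anyone with access sees the artifact together with its business context.
for row in conn.execute("""
        SELECT a.name, a.definition, n.author, n.note
        FROM artifacts a LEFT JOIN annotations n ON n.artifact = a.name"""):
    print(row)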

He finished with some recommendations:

- Recognize that data integration has collaborative requirements. The greater the number of data integration specialists and people who work closely with them, the greater the need is for collaboration around data integration.

- Determine an appropriate scope for collaboration. At the low end, bug fixes don’t merit much collaboration; at the top end, business transformation events require the most.

- Support collaboration with organizational structures. These can be technology focused (like data management groups), business driven (data stewardship and governance), or a hybrid of the two (BI teams and competency centers).

- Select data integration tools that support broad collaboration. For technical implementers, this means data integration tools with source code management features (especially for versioning). For business collaboration, it means an area within a data integration tool where the user can select data structures and design rudimentary process flows for data integration.

- Demand a central repository. Both technical and business team members—and their management—benefit from an easily accessed, server-based repository through which everyone can share their thoughts and documents, as well as view project information and semantic data relevant to data integration.

Wednesday, August 27, 2008

TDWI's Technology Poster about Master Data Management


The Data Warehousing Institute (TDWI) published a technology poster about Master Data Management (MDM), designed by Philip Russom, senior manager of TDWI Research. According to TDWI, the Master Data Management poster sorts out the complex layers of the MDM stack, illustrating how people, practices, and software automation are coordinated in a mature MDM implementation.

This poster will help explain:
- The tools, technologies, and techniques that go into the MDM technology stack.
- How the tech stack adjusts to practices like operational MDM and analytic MDM.
- How the MDM technology stack is influenced by pre-existing systems, architectural approaches, growth over time, and build-versus-buy decisions.

This kind of poster is very interesting because it can be used to explain how MDM works in two ways: first as an overview, and then by detailing each layer of the process.


Philip Russom tells how he got the idea for the poster:
"On a break during the TDWI World Conference in Chicago this May, I left the hotel and walked up the street to the Museum of Contemporary Art. As soon as I entered, I saw mobiles by the great American sculptor Alexander Calder. My head was spinning from the technical presentations I’d seen at the conference, and it struck me that Calder mobiles resemble the way we draw technology stacks and system architectures. Both are compartmentalized, yet the parts are connected and interactive. Both move and evolve slowly as the winds of change brush them.

Later, when I needed a metaphor for the many pieces that master data management connects and coordinates, I naturally thought of Calder’s elegant and organic mobiles.

As you look at the poster, try to imagine the pieces in motion like a Calder mobile, with the operational, analytic, and enterprise practices spinning, and the balance shifting from collaboration to implementation and back again.

My thanks to Deirdre Hoffman for translating my ideas into the graphic images of this poster."

You can request a free print copy (US and Canada only) or download a PDF version on the TDWI website (registration required).

The following companies are sponsors of this technology poster: Baseline Consulting, BizGui Inc, EasyAsk, Exeros, MicroStrategy, Syncsort, Talend.