
Wednesday, March 24, 2010

Rethinking the role of BI


Increasingly, companies have questioned whether their BI teams deliver value to the business and how those teams can become more effective and strategic. Wayne Eckerson wrote a great post on the role of BI in his blog at The Data Warehousing Institute (TDWI), entitled Evolving Your BI Team from a Data Provider to a Solutions Provider. He commented on a presentation by Jill Dyché at TDWI’s BI Executive Summit, where she explained that BI teams can serve either as "data providers" or as "solutions providers." Data providers focus on delivering data in the form of data warehouses, data marts, cubes, and semantic layers that BI developers in the business units can use to create reports and analytic applications. Solutions providers, on the other hand, go one step further by working hand in hand with the divisions to develop BI solutions.

Eckerson believes that BI teams must evolve into the role of solutions provider if they want to succeed long term. They must interface directly with the business, serving as a strategic partner that advises the business on how to leverage data and BI capabilities to solve business problems and capitalize on business opportunities.

He wrote that, historically, many BI teams become data providers by default because business units already have reporting and analysis capabilities, which they've developed over the years in the absence of corporate support. These business units are loath to turn over responsibility for BI development to a nascent corporate BI group that doesn't know their business and wants to impose corporate standards for architecture, semantics, and data processing.

However, this separation of powers fails to deliver value, he commented. The business units lose skilled report developers, and they don’t follow systematic procedures for gathering requirements, managing projects, and developing software solutions. They end up deploying multiple tools, embedding logic into reports, and spawning multiple, inconsistent views of information. Most of all, they don’t recognize the data resources available to them, and they lack the knowledge and skills to translate data into robust solutions using new and emerging BI technologies and techniques, such as OLAP cubes, in-memory visualization, agile methods, dashboards, scorecards, and predictive analytics.

A corporate BI team needs to rethink its mission and the way it's organized. It needs to actively engage with the business and take some direct responsibility for delivering business solutions. To provide solutions assistance without adding budget, it must break down intra-organizational walls and cross-train specialists to serve on cross-functional project teams that deliver an entire solution from A to Z. The BI team will become more productive and, before long, eliminate the project backlog.

He commented on some successful cases where companies have a high-performance BI team. In one of them, for example, BI is housed in an Information Management (IM) organization that reports to the CIO and is separate from the IT organization. The IM group consists of three subgroups: 1) the Data Management group, a data integration team that handles ETL work and data warehouse administration; 2) the Information Delivery group, a BI and Performance Management team that purchases, installs, and manages BI and PM tools, and provides training and solutions using reporting, OLAP, and predictive analytics capabilities; and 3) the IM Architecture group, which builds and maintains the IM architecture, consisting of the enterprise data warehouse, data marts, and data governance programs, as well as closed-loop processing and the integration of structured and unstructured data.

Eckerson finished the article with the statement: "The message is clear: if you want to deliver value to your organization and assure yourself a long-term, fulfilling career at your company, then don’t be satisfied with being just a data provider. Make sure you evolve into a solutions provider that is viewed as a strategic partner to the business."

I agree with him: a BI team acting as a data provider can meet an organization's basic needs, but acting as a solutions provider gives BI a more strategic role and makes it easier to deliver value to the business.

Tuesday, November 24, 2009

Master Data Management: Building a Foundation for Success


Evan Levy wrote a nice white paper, provided by Informatica, called Master Data Management: Building a Foundation for Success (free registration required to download the white paper). In it, he discusses the challenges of implementing master data management. In his book Customer Data Integration: Reaching a Single Version of the Truth, co-written with Jill Dyché, his partner and co-founder of Baseline Consulting, MDM is defined as: “The set of disciplines and methods to ensure the currency, meaning, and quality of a company’s reference data that is shared across various systems and organizations.” Below is a summary of the white paper:

Depending on the maturity and size of an IT organization, there may be a sizable collection of infrastructure services associated with managing data and systems. In their experience working with firms on new MDM programs, there are typically five technical functions that are core to MDM processing. Frequently, these capabilities are already part of the IT infrastructure. The five functions are:

1. Data cleansing and correction

Data cleansing is fairly common within most IT organizations, particularly when data is highly prone to data entry errors. Most data quality environments have been set up to focus on the cleansing and correction of well-understood data values. Customer name and address values are frequently the starting point for many companies seeking to clean up their “bad data.”
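As a rough illustration of what this kind of rule-based cleansing and correction can look like, here is a minimal sketch (not from the white paper; the field names, state mappings, and rules are assumptions):

```python
import re

# Hypothetical sketch of rule-based cleansing for customer name/address values.
# Field names, state mappings, and rules are illustrative, not from the white paper.
STATE_CODES = {"california": "CA", "new york": "NY", "texas": "TX"}

def cleanse_customer(record: dict) -> dict:
    """Return a copy of the record with normalized name, state, and ZIP values."""
    cleaned = dict(record)
    # Collapse repeated whitespace and normalize capitalization of the name.
    cleaned["name"] = re.sub(r"\s+", " ", record.get("name", "")).strip().title()
    # Map free-text state names to standard two-letter codes.
    state = record.get("state", "").strip().lower()
    cleaned["state"] = STATE_CODES.get(state, state.upper())
    # Keep only the leading digits of the postal code.
    cleaned["zip"] = re.sub(r"\D", "", record.get("zip", ""))[:5]
    return cleaned

print(cleanse_customer({"name": "  jill   DYCHE ", "state": "California", "zip": "90210-1234"}))
# {'name': 'Jill Dyche', 'state': 'CA', 'zip': '90210'}
```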

2. Metadata

Metadata isn’t limited to identifying the individual data element names and their definitions. It also identifies the data values and the origin of that data (its lineage). But metadata isn’t MDM. Metadata focuses on the descriptive detail about data.

The most visible metadata features that are useful in an MDM environment include:
- Terminology or data element names—for instance, the data element name is “ItemColor”
- Data values—for instance, the acceptable values are “red, green, or blue”
- Value representation—for instance, CC0000, 00CC00, and 0066CC
- Lineage details—for instance, “Created: 01/12/2009. System of origin: Order System”
- Data definitions—for instance, “the color of the item”

The use of existing metadata structures can dramatically simplify other systems’ abilities to adopt and use MDM because they may already recognize and work with the existing metadata.
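To make the idea concrete, here is a minimal sketch of what a metadata entry covering those features might look like (the class and field names are hypothetical; the values echo the “ItemColor” example above):

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical sketch of a metadata entry mirroring the features listed above.
# The class and field names are illustrative; the values echo the "ItemColor" example.
@dataclass
class MetadataEntry:
    element_name: str                # terminology / data element name
    definition: str                  # data definition
    allowed_values: List[str]        # acceptable data values
    representations: Dict[str, str]  # how each value is physically represented
    created: str                     # lineage: creation date
    system_of_origin: str            # lineage: originating system

item_color = MetadataEntry(
    element_name="ItemColor",
    definition="the color of the item",
    allowed_values=["red", "green", "blue"],
    representations={"red": "CC0000", "green": "00CC00", "blue": "0066CC"},
    created="01/12/2009",
    system_of_origin="Order System",
)
```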

3. Security and access services

One of the lesser-known functions within MDM is the need to manage access to individual reference data elements. MDM systems typically manipulate data using CRUD processing—to create, read, update, and delete—so centralizing the permissions to access and change key data is an important part of maintaining master data. MDM can support a very detailed or granular level of security access to the individual reference data elements.

Because of the granular level of data access that MDM affords—CRUD processing against individual reference values—an MDM system should avoid having its own siloed proprietary security solution. Instead it needs to interface to existing security services that an IT department relies on to manage application access.
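A minimal sketch of that idea, assuming a generic enterprise security service that the MDM layer delegates to (the interface and permission names are assumptions, not from the white paper):

```python
from typing import Protocol

# Hypothetical sketch: the MDM service delegates CRUD permission checks to an
# existing enterprise security service instead of keeping its own siloed rules.
class SecurityService(Protocol):
    def is_allowed(self, user: str, action: str, element: str) -> bool: ...

class MasterDataService:
    def __init__(self, security: SecurityService, store: dict):
        self.security = security
        self.store = store

    def update(self, user: str, element: str, value: str) -> None:
        # Granular check: permission is evaluated per reference data element.
        if not self.security.is_allowed(user, "update", element):
            raise PermissionError(f"{user} may not update {element}")
        self.store[element] = value
```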

4. Data migration

Data migration technologies alleviate the need to develop specialized code to extract and transfer data between different systems. Companies have invested heavily in ETL tools and application messaging technologies such as enterprise service bus (ESB) because of the need to move large volumes of data between their various application systems. The challenge in moving data between systems exists because the data is rarely stored in a simple, intuitive structure.

Regardless of the specific tools and technologies used in transporting data into or out of an MDM system, there are two basic ways of moving data: large volume bulk data (loads or extracts) and individual transactions. To load the MDM server initially with data from an application system requires bulk data loading capabilities. ETL tools are very well equipped to handle the application data extraction and MDM system loading.
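As a rough sketch of the two movement styles described above, here is an in-memory stand-in for the MDM hub (the function names and structure are illustrative only):

```python
# Hypothetical sketch contrasting bulk loading (initial population via an ETL
# extract) with per-record transactional updates (e.g., delivered over an ESB).
master_hub = {}  # stand-in for the MDM server's reference data store

def bulk_load(records):
    """Load a large extract of records into the hub in one pass."""
    for record in records:
        master_hub[record["key"]] = record
    return len(records)

def upsert(record):
    """Apply a single transaction, merging it with any existing record."""
    existing = master_hub.get(record["key"], {})
    master_hub[record["key"]] = {**existing, **record}

bulk_load([{"key": "SKU-1", "color": "red"}, {"key": "SKU-2", "color": "blue"}])
upsert({"key": "SKU-1", "color": "green"})  # transactional correction after the bulk load
```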

5. Identity resolution

Most MDM solutions also benefit from the ability to uniquely identify and manage individual reference details. For simple subject areas (like color or size), managing and tracking the various reference values is very straightforward. This simplicity comes from the fact that the values are easy to differentiate; there’s not much logic involved in determining whether a value matches “red”, “green”, or “blue.” Complexity occurs when the subject details being mastered have more variables, and variables that can be vague—such as a person. For complex subject areas, mastering the reference value requires more sophisticated analysis of the numerous attributes associated with the individual reference value, such as name or address.
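As a toy illustration of why a person is harder to match than a color, here is a minimal fuzzy-matching sketch (the threshold, weights, and field names are assumptions; real identity resolution tools are far more sophisticated):

```python
from difflib import SequenceMatcher

# Hypothetical sketch: exact comparison is enough for simple values like colors,
# but vague subjects such as people need fuzzy comparison across several attributes.
def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def same_person(rec_a: dict, rec_b: dict, threshold: float = 0.85) -> bool:
    name_score = similarity(rec_a["name"], rec_b["name"])
    addr_score = similarity(rec_a["address"], rec_b["address"])
    # Weight the name slightly higher than the address; both weights are illustrative.
    return 0.6 * name_score + 0.4 * addr_score >= threshold

print(same_person(
    {"name": "Jon Smith", "address": "12 Main St"},
    {"name": "John Smith", "address": "12 Main Street"},
))  # True for these near-duplicates
```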

The Inventory for an MDM Foundation

Here are some techniques for you to use to assess how to move forward on your own MDM development effort.

1. Data Cleansing and Correction
- Identify data cleansing tools that are already implemented within your company on other projects; pay special attention to applications that are likely to link to the MDM system
- Determine the data cleansing rules for the master data subject area and whether they have been documented
- Establish how you will interface your MDM system to the data cleansing tool (e.g., Web services or APIs)
- Contact the product vendor to determine which MDM product(s) it may already support

2. Metadata
- Review existing data warehouse metadata to discover whether there is metadata content that may apply to the master data subject area in question
- Determine whether there are metadata standards in place that apply to the given master data
- Find out if any of the key application systems use existing metadata standards
- If your company has a data management team, see if there is already a sanctioned process for identifying and developing new metadata content

3. Security and Access Services
- Identify the level of security requirements that need to apply to core master data
- Talk to the application architecture group to determine if there are application-level security standards in place
- Investigate whether your company has already defined a set of security and access Web services that may be leveraged for MDM

4. Data Migration
- Identify the bulk data migration standards that are in place—for example, how is Informatica® technology used in implementations other than data warehouse or BI implementations?
- Determine the current mechanism for application-to-application connectivity: Is it EAI? ESB? Point to point?
- Clarify which application systems can accept reconciled and corrected data
- Determine which legacy systems can only support bulk data extract versus transaction access

5. Identity Resolution
- Understand if your company already has identity resolution software in place to identify the existence of multiple customers, products, or other subject area items; this type of capability is most likely to exist in a customer-facing financial system
- Investigate whether the existing identity resolution system is configurable with new or custom rules
- Determine how the identity resolution technology interfaces to other systems (e.g., embedded code? API? Web services?)

6. MDM Functional Requirements
- Identify a handful of high-profile systems that depend on the timely sharing and synchronization of reference data; if you have a BI environment, we recommend looking at the predominant source systems as well as the most-used data marts
- Decide which reference data is shared and synchronized between multiple systems; the need for operational integration typically reflects higher business value
- Determine the individual system’s functional requirements for its reference data (e.g., create/read/update or data quality/correction)
- Categorize the individual data elements by subject area to identify the specific subject areas requiring mastering. It’s only practical to implement MDM one subject area at a time.

His conclusion: "When it comes to the question of build versus buy, for MDM the answer is as complex as your company’s IT infrastructure, functional requirements, and business needs. The considerations are multifaceted and require an intimate knowledge of what you can do today versus what you’ll need to accomplish tomorrow. When it comes to integrating master data, one thing is clear: The first step to ROI is avoiding unnecessary expenses. Investigate and use the capabilities that already exist within your IT infrastructure, enabling you to make an informed decision on launching MDM the right way."

Evan Levy is an expert in MDM and wrote a nice white paper with a good explanation of how to build an MDM foundation. For those interested in the subject, he and Jill Dyché also wrote a good book called Customer Data Integration: Reaching a Single Version of the Truth.

Monday, October 26, 2009

Teradata Partners User Group Conference 2009


The Teradata Partners User Group Conference 2009 took place last week (October 18-22, 2009, in Washington, D.C.), with some interesting announcements: an Enterprise Analytics Cloud initiative and a planned release of an Extreme Performance Appliance based on solid-state drives (SSDs). A lot of people wrote about the event; here are some links with news, posts, and articles about the conference:

- This Week at the Teradata Partners User Conference - Curt Monash
- Teradata Taps the Cloud, Announces Solid-State Appliance - Doug Henschen
- More on Teradata's SSD Speedster and (Cautious) Public-Cloud Offering - Doug Henschen
- Teradata the Wise - Wayne Eckerson
- Teradata Partners: A Retrospective - Jill Dyché
- Teradata Partners - Richard Hackathorn did a very nice coverage, with several posts
- Stein tells of doom and hope for the future - Evelyn Hoover - Editor-in-Chief Teradata Magazine

Wednesday, October 15, 2008

The Business Value of Master Data Management


Tomorrow, October 16th at 3 PM ET, DM Review will host a live Web broadcast entitled The Business Value of Master Data Management on its DM Radio program, hosted by Eric Kavanagh with Jim Ericson.



According to DM Review: "Achieving the coveted 360-degree view of the customer -- or even of a product, line of business or other entity -- is more possible than ever these days, thanks in large part to the maturation of Master Data Management. With advances in automated data quality and matching software, coupled with ever-faster data delivery mechanisms, this relatively new discipline is taking the information management industry by storm.

Unlike traditional, tightly coupled information systems, MDM solutions use a loose coupling of enterprise applications and a master data hub to deliver near real-time master records about customers, products, locations and other dimensions. These MDM solutions help improve the efficiency of enterprise systems by managing the maintenance of master records. This improves overall data quality, because master data records need not be maintained multiple times throughout the enterprise.

Tune into this episode of DM Radio to learn how MDM solutions are helping organizations align their information systems with business goals and strategies. We'll talk to Jill Dyche of Baseline Consulting, Darren Peirce of Kalido, Judy Ko of Informatica, and Anurag Wadehra of Siperian.

Attendees will learn:
- How MDM can yield significant business value
- The basics of Customer Data Integration
- The fundamentals of Product Information Management
- Why data quality must be “baked in” as opposed to “bolted on”
- Trends in MDM design and deployment."

On the DM Review website, you can register for this live Web broadcast.

You can also check out the DM Radio archives to hear previous programs on a variety of other topics.

Master data management is one of the most important issues for companies today. According to Gartner, achieving a single view across the enterprise is key to running your business. The effective management and governance of master data is both an opportunity and a challenge for many large enterprises. MDM poses unique challenges and requires new relationships between business and IT in areas such as workflow, governance, stewardship, and data integration. To be successful, organizations must understand the role of information governance and the impact MDM has on their applications portfolio and information infrastructure. Organizations use MDM to accelerate enterprise agility, promote operational efficiency, achieve competitive differentiation, support enterprise transparency, make SOA work more effectively, and ensure corporate compliance.

Sunday, September 7, 2008

Comments about TDWI World Conference


The TDWI World Conference, provided by The Data Warehousing Institute (TDWI), took place last month; here are some links with posts and articles commenting on the conference:

- TDWI in San Diego - the New and the Niche - Cindi Howson - BIScorecard
- Open Source Blossoms at TDWI - Mark Madsen - Intelligent Enterprise
- Cross the BI-Web Analytics Divide - Tony Byrne - Intelligent Enterprise
- The Velocity of eBay - Jim Ericson - DM Review
- How to make networking fun - Ericka Wilcher - The Sascom Magazine Blog
- Better than free food at TDWI San Diego - Ted Cuzzillo - Datadoodle
- Makin' Friends at TDWI San Diego - Jill Dyche - B-Eye-Network