
Monday, June 21, 2010

Business Intelligence and Performance Management for the 21st Century

Ventana Research is a benchmark research and business technology advisory firm. They usually publish good studies, research reports and white papers. They also have a good blog, where they publish posts on business, IT, technology and industry issues. Recently, Intelligent Enterprise published a great article about a new study by Ventana Research, entitled Business Intelligence and Performance Management for the 21st Century. According to the article, Ventana Research undertook this benchmark research to assess the current state of maturity, trends and best practices. The goal was to determine how organizations approach BI and performance management and prioritize their key components, and to identify what elements they desire in a comprehensive approach.

The research found strong interest in and growing demand for BI and performance management. However, it paints a picture of a market in an early stage of development and shows that most organizations face considerable obstacles. According to the study, they have only basic BI capabilities such as querying sources for specific data (74%), generating reports from data (74%) and accessing data from a spreadsheet for further analysis (70%). These and other findings lead Ventana to conclude that, in general, organizations are still maturing in their use of BI and performance management. Organizations’ most important goals in deploying BI tools are to provide access to data through a variety of tools (cited by 57% of participants), to make it possible to apply analytics to the data easily (61%), and to communicate and collaborate on the analytics (55%).

Based on the research, Ventana listed 10 recommendations on how to proceed. Below is a summary of Ventana's recommendations:

1. Assess your organization’s maturity in BI and performance management. Applying the Ventana Research Maturity Index methodology, Ventana found that only 15 percent of organizations reach the "Innovative" level in all four functional categories of maturity (People, Process, Information and Technology). Ventana's analysis also reveals that maturity across these categories is uneven. From a technology perspective, organizations still use spreadsheets and e-mail too often to perform BI tasks. Examine your own capabilities in each of the four maturity categories and research how organizations that rank higher are able to do so.

2. Consider the effectiveness of your current tools and applications. Ventana found that most organizations have at least some doubts about the technology they currently use for BI and performance management. Only 12 percent of participants said they are completely confident in their BI technology, and only 9 percent made that assertion for the technology they use to manage performance. Explore users’ feelings about the tools they use and identify tools that should be replaced.

3. Reduce the number of BI tools and the use of spreadsheets. BI systems have been around for years now, yet all but 10 percent of participants said they have at least some degree of difficulty standardizing BI into a consistent and reliable technology. Spreadsheets are an impediment to collaborative processes in enterprises. They are prone to errors and conflicts in data between files that are thought to be the same. Ventana advises taking steps to reduce the number of BI tools in use and standardize them. This research reinforces the many past findings that spreadsheets should not be used for this or any collaborative, enterprise purpose.

4. Compare the BI capabilities you have with those you want. The research shows that most participating organizations have deployed or are deploying basic BI capabilities such as querying sources for specific data (74%), generating reports from data (74%) or accessing data from a spreadsheet for further analysis (70%). However, notably fewer have more advanced capabilities. For example, only 30% can apply analytics to data effectively, 24% can collaborate on data and metrics, and 22% can conduct what-if analysis for planning and forecasting. Decide what BI capabilities you want and examine products that can supply them.

5. Determine whether products currently in use can handle performance management well. Participants in the research said their most important goals in managing performance are to align actions and decisions to goals and strategy (cited by 77%) and to be able to plan effectively for improvement (75%). Business intelligence can be a key tool for helping organizations understand, align and optimize their performance; however, participants expressed mixed feelings about how well their BI tools help them in these efforts. In several aspects of understanding performance, fewer than 10% said their current products are superior, and the largest percentages called them only adequate or worse. In aspects of aligning performance, products fared worse: again, fewer than 10% rated their products superior in any category, and inadequacy outpolled basic adequacy. For optimizing performance, responses followed the same pattern, with no aspect exceeding 10% in superior ratings. These findings suggest that you should take a hard look at the adequacy of the tools and systems you use to manage performance and determine whether other tools would enable more cost-effective performance management.

6. Identify the types of data you need to access and analyze. A majority of participants (57%) said that an important goal in providing BI tools is to provide access to data through a variety of methods. Asked to identify the types of data they consider most critical to access, 71 percent of participants cited not a type but a source: databases residing in data warehouses or operational data stores. The next two most-often-cited data types involve business activities important to a range of job functions: finance data (67%) and customer data (61%). More than half of participants said they need to access spreadsheets (55%) or transactional data (54%) from enterprise systems such as customer relationship management (CRM), enterprise resource planning (ERP) or online transaction processing (OLTP). Indicating the increasing diversity of data types, one-third (34%) said they need to access unstructured content such as text, images, voice or Web data. To put BI and performance management to the best use, Ventana advises identifying the types and sources of data your company or unit needs to access most often and then evaluating tools that can help you do so easily.

7. Consider adopting or expanding metrics for performance management. The ability to measure and track performance is an integral component of performance management. Currently 41% of participating organizations evaluate performance data and 29% are assessing metrics or measures to do so; more than one-third of executives (37%) measure performance. Find which processes and employees in your organization might perform better if metrics were available for evaluating their performance, then deploy systems that can monitor those metrics.

8. Address organizational barriers to improving BI and performance management. While research participants clearly recognize the worth of improvement, they also acknowledge a number of barriers to implementing projects to do that, mostly involving money or institutional support. The barrier cited most often is lack of resources (60%), followed by lack of a budget (43%). The top two people issues are lack of awareness (cited by 36%) and lack of executive support (26%). Determine what barriers exist in your own organization and discuss how to overcome them.

9. Look into alternative means of software deployment. Asked how they deploy this software, half (53%) of participants said they currently install and manage it on their own premises in the established manner; however, that percentage drops dramatically regarding plans for deployment in the next 12 months (13%) or 12 to 24 months (11%). Nearly as many said that in those time frames they will choose hosted software managed off-site (13% and 9%, respectively) or rent software as a service (SaaS) on demand (12% and 10%). These options can help address organizational barriers such as lack of resources to provide BI and performance management.

10. Examine software that can be deployed across roles in the enterprise. The research found that two-thirds (66%) of organizations are planning to evaluate new technologies for BI and performance management. In terms of roles, managers (72%) were most assertive about planning to consider new products. Consider the breadth of implementation you require from software to support BI and performance management and make that a criterion when evaluating products and vendors. Challenge vendors to demonstrate the appropriateness and usability of the products they’re offering, and ask to speak with customers in situations similar to yours.

You can download the complete executive summary version of the Ventana Research report (registration required).

Thursday, February 25, 2010

The Competitive Market of Data Warehousing


Doug Henschen commented on the competitive data warehousing market in an article published in Intelligent Enterprise, entitled Upstart Vendors Keep Data Warehousing Competitive, covering several announcements from upstart data warehousing vendors at The Data Warehousing Institute (TDWI) World Conference, held this week in Las Vegas.

He wrote that taken individually, the headlines aren't earth-shattering, but taken together, the announcements underscore that the data warehousing universe is expanding, with alternative providers getting stronger despite the pressures of a weak global economy. The announcements are:

- Aster Data: Aster Data nCluster 4.5, an upgrade of its core product featuring a combined SQL/MapReduce visual development environment, a suite of prebuilt analytics modules, support for Fusion-io flash-memory drives, and a new management console for optimizing query performance.

- ParAccel: support for solid state drives (SSDs) from Fusion-io (a move also announced today by Aster Data). SSDs replace spinning, mechanical disks with flash memory chips for faster performance. ParAccel is promising 15 times faster query performance with optional flash SSDs installed by the hardware supplier of the customer's choice.

- Kognitio: GroupM, a division of advertising giant WPP, has asked it to build an analytical environment to match and monitor advertising placements on a global scale. GroupM operates in more than 80 countries, and data from these local operations will ultimately be fed into a centralized Kognitio database that will reportedly analyze almost one-third of the world's advertising spend across multiple media outlets.

- Vertica: upgrades including better workload and resource management features aimed at handling mixed query workloads and many more users.

Henschen commented that the data warehousing industry leaders have already embraced flash technology. Oracle put a huge emphasis on flash memory in the Sun-Oracle V2 upgrade of Exadata announced last October. Teradata's SSD-based Extreme Performance Server 4555 was also announced last October, and it's slated for general availability in the first half of this year. IBM has demonstrated an SSD-based test appliance, but it has yet to announce or ship a production SSD-enabled data warehouse appliance. Netezza has used the combination of competitive pricing and fast query performance to win more than 300 customers to date.

Despite the vendor consolidation in several IT categories, the data warehousing market remains very competitive, which is good news, mainly for customers, who can choose the product that best suits them to meet their ever-increasing data analysis needs.

Saturday, January 16, 2010

Predictions and Trends for 2010


Every new year, people make predictions for the future and comment on trends. I've read many posts and articles about 2010 predictions and trends in business intelligence and performance management. Below is a summary of some that I found most interesting:

In early December, Nenshad Bardoliwalla published on Enterprise Irregulars and on his personal blog a very good and well-detailed post entitled The Top 10 Trends for 2010 in Analytics, Business Intelligence, and Performance Management, where he described his trends:

1 - We will witness the emergence of packaged strategy-driven execution applications. As we discussed in Driven to Perform: Risk-Aware Performance Management From Strategy Through Execution (Nenshad Bardoliwalla, Stephanie Buscemi, and Denise Broady, New York, NY, Evolved Technologist Press, 2009), the end state for next-generation business applications is not merely to align the transactional execution processes contained in applications like ERP, CRM, and SCM with the strategic analytics of performance and risk management of the organization, but for those strategic analytics to literally drive execution. We called this "Strategy-Driven Execution", the complete fusion of goals, initiatives, plans, forecasts, risks, controls, performance monitoring, and optimization with transactional processes.

2 - The holy grail of the predictive, real-time enterprise will start to deliver on its promises. While classic analytic tools and applications have always done a good job of helping users understand what has happened and then analyze the root causes behind this performance, the value of this information is often stale before it reaches its intended audience.

3 - The industry will put reporting and slice-and-dice capabilities in their appropriate places and return to its decision-centric roots with a healthy dose of Web 2.0 style collaboration.

4 - Performance, risk, and compliance management will continue to become unified in a process-based framework and make the leap out of the CFO’s office. The disciplines of performance, risk, and compliance management have been considered separate for a long time, but the walls are breaking down.

5 - SaaS / Cloud BI Tools will steal significant revenue from on-premise vendors but also fight for limited oxygen amongst themselves.

6 - The undeniable arrival of the era of big data will lead to further proliferation in data management alternatives.

7 - Advanced Visualization will continue to increase in depth and relevance to broader audiences.

8 - Open Source offerings will continue to make inroads against on-premise offerings. Much as SaaS BI offerings are doing, Open Source offerings in the larger BI market are disrupting the incumbent, closed-source, on-premise vendors.

9 - Data Quality, Data Integration, and Data Virtualization will merge with Master Data Management to form a unified Information Management Platform for structured and unstructured data.

10 - Excel will continue to provide the dominant paradigm for end-user BI consumption. For Excel specifically, the number one analytic tool by far with a home on hundreds of millions of personal desktops, Microsoft has invested significantly in ensuring its continued viability as we move past its second decade of existence, and its adoption shows absolutely no sign of abating any time soon.


James Kobielus published his thought-provoking post Advanced Analytics Predictions For 2010 on Forrester's blog:

- Self-service operational BI puts information workers in driver’s seat: Enterprises have begun to adopt self-service BI to cut costs, unclog the analytics development backlog, and improve the velocity of practical insights. Users are demanding tools to do interactive, deeply dimensional exploration of information pulled from enterprise data warehouses, data marts, transactional applications, and other systems. In 2010, users will flock to self-service BI offerings as the soft economy keeps pressure on IT budgets. In the coming year, BI software as a service (SaaS) subscription offerings will be particularly popular, in a market that has already become fiercely competitive. So will the new generation of BI mashup offerings for premises-based deployment.

- User-friendly predictive modeling comes to the information workplace: Predictive analytics can play a pivotal role in day-to-day business operations. If available to information workers—not just to Ph.D. statisticians and professional data miners—predictive modeling tools can help business people continually tweak their plans based on flexible what-if analyses and forecasts that leverage both deep historical data as well as fresh streams of current event data. In 2010, user-friendly predictive modeling tools will increasingly come to market, either as stand-alone offerings or as embedded features of companies’ BI environments.

- Advanced analytics sinks deep roots in the data warehouse: Advanced analytics demands a high-performance data management infrastructure to handle data integration, statistical analysis, and other compute-intensive functions. In 2010, in-database analytics will become a new best practice for data mining and content analytics, in which enterprise data warehousing professionals must now collaborate closely with the subject matter experts who build and maintain predictive models. To support heterogeneous interoperability for in-database and in-cloud analytics, open development frameworks—especially MapReduce and Hadoop—will be adopted broadly by data warehousing and analytics tools vendors. In the coming year, we’ll also see the beginning of an industry push toward an open development framework for inline predictive models that can be deployed to CEP environments. Clearly, in-CEP predictive analytics will be a critical component of truly adaptive BAM for process analytics.

- Social network analysis brings powerful predictive analysis to the online economy: Social network analysis thrives on the deepening streams of information—structured and unstructured, user-generated and automated—that emanate from Facebook, Twitter, and other new Web 2.0 communities. In the coming year, many vendors of predictive modeling tools will enhance their social network analysis features to support real-time customer segmentation, target marketing, churn analysis, and anti-fraud.

- Low-cost data warehousing delivers fast analytics to the midmarket: Though enterprises can certainly do BI without a data warehouse, this critical infrastructure platform is essential for high-performance reporting, query, and analytics against large data sets. In 2010, many data warehousing vendors will lower the price of their basic appliance products to less than $20,000 per usable terabyte. At the same time, enterprises will see a growing range of cost-effective solution appliances in 2010, combining DW appliances with preconfigured BI, advanced analytics, data cleansing, industry information models, and other data management applications and tools.

- Data warehousing virtualizing into the cloud: The data warehouse, like all other components of the BI and data management infrastructure, is entering the cloud. In 2010, we’ll see vendors continue to introduce cloud, SaaS, and virtualized deployments of their core analytic databases.


Howard Dresner wrote a nice and interesting post on his blog, called A thought (or two) for the New Year. He asked a question: why do we (still) struggle to effectively use information to make better decisions, and what can we do to improve? He then offered five ideas that might help:

1 - Get the culture right: If a culture is not receptive to BI and EPM, those efforts will have limited impact. This is the basis for my latest book, Profiles in performance – Business Intelligence Journeys and the Roadmap for Change. In it I assert that organizations need to establish a “performance-directed culture” first – as a context or rationale for these solutions. To this end, I developed the Performance Culture Maturity Model (Patent Pending) and related methodologies for assessing an organization’s culture and offering a path to becoming more “performance-directed”.

2 - Don’t get overly enamored with technology: This is not to say that technology isn’t important. You certainly will want to have appropriate technology once you have the right environment in place to use it. However, it’s a means to an end, not an end in itself and large sums of money can be wasted with a “technology-led” strategy.

3 - Get strategic: There was a time when many/most organizations had “strategic planning” functions. They were chartered to think and plan for the future – developing multiple scenarios and associated action plans. Today, few organizations have this sort of a function and it shows. Most organizations have allowed themselves to become overwhelmingly tactical and reactive in nature.

4 - Get the metrics right: Assuming we have a well defined and communicated mission and strategy, we can use metrics as a means of measuring and managing execution. This is where things get complex and there’s a real risk of providing large quantities of information with little impact. Here’s where “less is more”. Metrics need to be focused on alignment with the strategy in a way that makes them actionable.

5 - Take action: Many of us either engage in “analysis paralysis” or rely upon intuition when faced with a critical decision. Instead, we should view Business Intelligence and associated analyses as part of a learning process – which uses information to inform our decision-making, but doesn’t make the decision for us. This requires taking calculated risks, since information will typically be incomplete. However, the former two scenarios expose the organization to completely unknown risks. So, frame the decision to be made. Collect and analyze enough information/facts to build workable assumptions. Assess the benefits, risks, and alternatives and make your decision. Finally, monitor the impact and adjust if possible and as needed.


Cindi Howson published in Intelligent Enterprise a good post entitled Predicting BI Highlights for 2010, where she shared her thoughts:

- In-memory will be a key theme this year as Microsoft will ship Gemini, SAP opens up BW Accelerator, IBM Cognos increasingly leverages TM1, and MicroStrategy 9 OLAP Services gains traction. In-memory approaches are not only key to BI platforms but also to any analysis that involves both speed and analytic complexity (Spotfire, SAS JMP, QlikView). The winners in this are the customers; the losers will be the vendors who have no strategy in this space or where in-memory is their only differentiator.

- Cloud computing and SaaS will become less niche as both BI heavyweights and vertically-focused vendors recognize that the infrastructure side of BI offers little competitive advantage; instead, it's the time-to-value and agility that matter. IT owners who don't want to give up any control are in for a bruising.

- SMBs will embrace BI but, faced with a myriad of good BI tool choices, these customers will choose products from vendors who offer better service, clarity of value, a partnership mentality, and the lowest cost.

- The enterprise vs. departmental BI debate will continue but will be tempered by the reality that "best" and "right" don't matter if you get outsourced, laid off, or go bankrupt. Those burned by overspending on software will look to IT to offer some enterprise restraint. The wiser players in the industry will find an ideal balance, keeping an enterprise focus on those items that bring economies of scale and synergies while departmentalizing those aspects in which differentiation and time to value matter more.

- Got dashboards? This category of tools only keeps getting better. Dashboards will become as commonplace as reporting and ad hoc query capabilities; but in 2010, they will be more animated, better integrated, packing more effective insights, on whatever device users prefer (including the iPhone and Droid).

- Good data, bad decisions remain BI's biggest problem. I'd like to be optimistic and think that we will rid the BI industry of all that ails it, but the world economy, corruption in politics, the epidemic of overweight people while others starve -- you name it -- tell me that human nature will continue to sabotage even the best of BI deployments.

- Social networking and sentiment analysis should be on everyone's radar. Now that it seems every company has a Facebook presence (maybe for marketing, maybe for customer support), the need for sentiment analysis grows. So all those tweets, blogs, and social network updates only add to the data explosion and sense of information overload.

Tuesday, October 6, 2009

The culture of performance


Howard Dresner gave a nice interview to Intelligent Enterprise, where he answered Doug Henschen's questions and talked about his upcoming book: Profiles in Performance: Business Intelligence Journeys and the Roadmap for Change.


In the interview, he explained the concept of the "Performance Culture Maturity Model" described in his book, with six dimensions and four levels of maturity. He also talked about the organizations whose case studies he published in the book.

Below I highlight some answers from the interview:

I'll describe the six major dimensions. Two of them are strategic, two of them are operational in nature and two of them are technical. The first one is alignment with the mission as a cultural tenet. In other words, it's about people who really believe in what they're doing. That's clearly culture. Another one is transparency and accountability. If you're aligned with the mission and believe in what you're doing, next you have to share information, and everybody has to hold themselves and the organization accountable. The information may come from computer systems or it may not. The point is that it's open and transparent. If everyone is open it's really a good thing. The problem is that in most organizations, everybody wants everyone else to be open and they want to stay closed.

Another dimension is being able to resolve conflicts. A performance-directed culture is able to air these conflicts and resolve them in a positive way. You get the issues out there.

Action on insight is another dimension; it simply means that when you learn something, you're able to act upon it in a coordinated fashion. In so many environments, when something becomes known, people stick it in their hip pocket and don't do anything. Acting upon insight is simply about taking information, changing behavior to take advantage of what you've learned and, at the highest level of achievement, doing it in a coordinated fashion.

The technical dimensions start with trust in the data. A lot of information is going to come directly or indirectly from some sort of a system, but not always. Where did it come from? And what do I do with that information? That's really important. Am I going to be transparent? Am I going to make sure that we share this insight openly and that we act upon it as a single organization? How does an organization behave like a single organism and achieve its objectives as effectively as possible?

The sixth dimension is availability and currency of information. And does the currency of that information actually match the application? Those last two are technical and are very much about data, because information is the lifeblood of a performance-directed culture.

The four levels go from "chaos" to "departmentally optimized" to "the performance-directed culture emerging" to "the performance-directed culture realized."

If you want to have things like transparency and accountability, well, that's BI and performance management in terms of an implementation perspective, but not in terms of a cultural perspective. You have to have the systems and support if you want to achieve this on any kind of scale, but there's an attitude that goes with that.

I agree with him: the culture of the organization is a determinant of the success of a performance management initiative. Based on the interview, I am eager to read his book.

Tuesday, June 30, 2009

The 10 Essential Rules of Dimensional Modeling


Margy Ross, president of the Kimball Group, recently published in Intelligent Enterprise a nice article where she presented and defined what she considers the 10 Essential Rules of Dimensional Modeling:

1 - Load detailed atomic data into dimensional structures

Dimensional models should be populated with bedrock atomic details to support the unpredictable filtering and grouping required by business user queries.

2 - Structure dimensional models around business processes

Business processes are the activities performed by your organization; they represent measurement events, like taking an order or billing a customer. Business processes typically capture or generate unique performance metrics associated with each event. These metrics translate into facts, with each business process represented by a single atomic fact table.

3 - Ensure that every fact table has an associated date dimension table

The measurement events described in Rule #2 always have a date stamp of some variety associated with them, whether it's a monthly balance snapshot or a monetary transfer captured to the hundredth of a second. Every fact table should have at least one foreign key to an associated date dimension table, whose grain is a single day, with calendar attributes and nonstandard characteristics about the measurement event date, such as the fiscal month and corporate holiday indicator. Sometimes multiple date foreign keys are represented in a fact table.
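
To make this rule concrete, here is a minimal sketch using Python's built-in sqlite3 module; the tables, columns and values are my own illustration, not taken from the article. It builds a single-day-grain date dimension and a fact table carrying two date foreign keys (order date and ship date), then groups by a calendar attribute:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# A date dimension whose grain is a single day, carrying calendar
# attributes such as the fiscal month and a holiday indicator.
conn.executescript("""
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,   -- e.g. 20100621
    full_date    TEXT NOT NULL,
    fiscal_month TEXT NOT NULL,
    holiday_flag TEXT NOT NULL          -- 'Holiday' / 'Non-Holiday'
);

-- The fact table carries foreign keys to the date dimension; the two
-- roles (order date, ship date) show multiple date foreign keys.
CREATE TABLE fact_orders (
    order_date_key INTEGER NOT NULL REFERENCES dim_date (date_key),
    ship_date_key  INTEGER NOT NULL REFERENCES dim_date (date_key),
    product_key    INTEGER NOT NULL,
    order_amount   REAL NOT NULL
);

INSERT INTO dim_date VALUES (20100621, '2010-06-21', 'FY2010-M12', 'Non-Holiday');
INSERT INTO dim_date VALUES (20100623, '2010-06-23', 'FY2010-M12', 'Non-Holiday');
INSERT INTO fact_orders VALUES (20100621, 20100623, 101, 49.90);
""")

# Group by a calendar attribute instead of parsing raw dates in the query.
for row in conn.execute("""
    SELECT d.fiscal_month, SUM(f.order_amount)
    FROM fact_orders f
    JOIN dim_date d ON d.date_key = f.order_date_key
    GROUP BY d.fiscal_month
"""):
    print(row)
```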

4 - Ensure that all facts in a single fact table are at the same grain or level of detail

There are three fundamental grains to categorize all fact tables: transactional, periodic snapshot, or accumulating snapshot. Regardless of its grain type, every measurement within a fact table must be at the exact same level of detail.

5 - Resolve many-to-many relationships in fact tables

Since a fact table stores the results of a business process event, there's inherently a many-to-many (M:M) relationship between its foreign keys, such as multiple products being sold in multiple stores on multiple days. These foreign key fields should never be null.

6 - Resolve many-to-one relationships in dimension tables

Hierarchical, fixed-depth many-to-one (M:1) relationships between attributes are typically denormalized or collapsed into a flattened dimension table. If you've spent most of your career designing entity-relationship models for transaction processing systems, you'll need to resist your instinctive tendency to normalize or snowflake a M:1 relationship into smaller subdimensions; dimension denormalization is the name of the game in dimensional modeling.
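
A small sketch of what this flattening looks like in practice, again with Python and sqlite3 (the schema is hypothetical): brand and category, which a snowflake design would split into separate subdimensions, are collapsed into one denormalized product dimension, so rollups need no extra joins:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# One flattened product dimension: the M:1 hierarchies are collapsed
# into the same denormalized table.
conn.executescript("""
CREATE TABLE dim_product (
    product_key   INTEGER PRIMARY KEY,
    product_name  TEXT NOT NULL,
    brand_name    TEXT NOT NULL,   -- M:1 product -> brand, denormalized
    category_name TEXT NOT NULL    -- M:1 brand -> category, denormalized
);
""")

conn.executemany(
    "INSERT INTO dim_product VALUES (?, ?, ?, ?)",
    [(1, "Cola 350ml", "FizzCo", "Beverages"),
     (2, "Cola 2L",    "FizzCo", "Beverages"),
     (3, "Chips 100g", "Crunch", "Snacks")],
)

# Rolling up to any level of the hierarchy needs no joins to subdimensions.
for row in conn.execute(
    "SELECT category_name, COUNT(*) FROM dim_product GROUP BY category_name"
):
    print(row)
```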

7 - Store report labels and filter domain values in dimension tables

The codes and, more importantly, associated decodes and descriptors used for labeling and query filtering should be captured in dimension tables.

8 - Make certain that dimension tables use a surrogate key

Meaningless, sequentially assigned surrogate keys (except for the date dimension, where chronologically assigned and even more meaningful keys are acceptable) deliver a number of operational benefits, including smaller keys which mean smaller fact tables, smaller indexes, and improved performance. Surrogate keys are absolutely required if you're tracking dimension attribute changes with a new dimension record for each profile change.
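
Here is a minimal, self-contained Python sketch of how surrogate keys support tracking attribute changes with a new dimension record per profile change (a Type 2 slowly changing dimension); the customer attributes and keys are invented for illustration:

```python
from itertools import count

surrogate_key = count(1)   # meaningless, sequentially assigned keys
dim_customer = []          # rows of the customer dimension

def add_customer_version(natural_key, city):
    """Expire the current row for this customer, if any, and add a new one."""
    for row in dim_customer:
        if row["natural_key"] == natural_key and row["is_current"]:
            row["is_current"] = False         # keep history, mark as expired
    dim_customer.append({
        "customer_key": next(surrogate_key),  # a new surrogate key per version
        "natural_key": natural_key,           # the key from the source system
        "city": city,
        "is_current": True,
    })

add_customer_version("CUST-042", "Chicago")
add_customer_version("CUST-042", "Denver")    # attribute change: new row, new key

for row in dim_customer:
    print(row)
# Facts loaded while the customer was in Chicago reference key 1; later
# facts reference key 2, so history is preserved in the fact table.
```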

9 - Create conformed dimensions to integrate data across the enterprise

Conformed dimensions (otherwise known as common, master, standard or reference dimensions) are essential for enterprise data warehousing. Managed once in the ETL system and then reused across multiple fact tables, conformed dimensions deliver consistent descriptive attributes across dimensional models and support the ability to drill across and integrate data from multiple business processes. The Enterprise Data Warehouse Bus Matrix is the key architecture blueprint for representing the organization's core business processes and associated dimensionality.
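
As a toy illustration of the bus matrix idea, the Python sketch below lays out hypothetical business processes against the conformed dimensions they share; none of the rows or columns come from the article:

```python
# Rows are business processes (fact tables); columns are conformed
# dimensions shared across them.
bus_matrix = {
    "orders":    {"date", "customer", "product"},
    "shipments": {"date", "customer", "product", "carrier"},
    "payments":  {"date", "customer"},
}

dimensions = sorted(set().union(*bus_matrix.values()))

# Print the matrix: an X marks a dimension used by a business process.
print("process    " + " ".join(f"{d:>8}" for d in dimensions))
for process, dims in bus_matrix.items():
    marks = " ".join(f"{'X' if d in dims else '.':>8}" for d in dimensions)
    print(f"{process:<10} {marks}")
```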

10 - Continuously balance requirements and realities to deliver a DW/BI solution that's accepted by business users and that supports their decision-making

Dimensional modelers must constantly straddle business user requirements along with the underlying realities of the associated source data to deliver a design that can be implemented and that, more importantly, stands a reasonable chance of business adoption. The requirements-versus-realities balancing act is a fact of life for DW/BI practitioners, whether you're focused on the dimensional model, project strategy, technical/ETL/BI architectures or deployment/maintenance plan.

Margy Ross wrote a nice article, compiling and defining the main rules that a professional needs when designing a model using the concepts of Dimensional Modeling.

Tuesday, June 23, 2009

LucidEra's shutdown


This week, the rumor that LucidEra was shutting down its operations spread across the internet. Unfortunately, the rumor turned out to be true, with the news published on many sites:

Jeff Kelly posted on Search Data Management yesterday:
"The vendor sent an email to customers on Thursday with the news and pledged to help them wind down their relationship with the company and its SaaS-based BI products by the end of June, said Darren Cunningham, vice president of marketing at LucidEra, in a phone interview.

LucidEra's decision to shut down was brought about by a lack of funding, not a lack of interest in its products or in SaaS BI as a whole, Cunningham said. He would not go into details regarding LucidEra's financial problems other than to say, "It was a matter of funding or being acquired. And neither of those things happened."

Today, Doug Henschen posted in Intelligent Enterprise:
"Darren Cunningham, LucidEra's VP of Marketing, responded to inquiries 6/23 at 3:40 pm ET with the following e-mail message:

All that I can say at this time is that our product and pipeline were both stronger than they'd ever been. Customer adoption was growing, which was reflected in the 20+ 5-star reviews on the Salesforce AppExchange since January. We got hit by just really, really bad timing to have to be raising our next round of funding in this economic climate.

Right now, various options are being looked at in the best interest of our creditors, customers, employees, and shareholders. There should be resolution for everyone involved soon so there is an orderly transition."

LucidEra is a SaaS BI startup, founded in 2005, and according to their website: "LucidEra was formed to shake up the stagnant business intelligence industry and traditional approaches to corporate information access and analysis by delivering business visibility as an on-demand service."

Also according to their website, the company's last round of funding came in August 2007, when it raised $15.6 million in Series B funding.

In my opinion, LucidEra's shutdown does not mean that the SaaS business model has failed. There is a market for both SaaS (on-demand) and traditional (on-premise) BI.

Thursday, April 30, 2009

Eight Guidelines for Low-Risk Enterprise Data Warehousing


Ralph Kimball published in Intelligent Enterprise a very nice article entitled Eight Guidelines for Low-Risk Enterprise Data Warehousing, where he makes recommendations for controlling project costs and reducing risks in Enterprise Data Warehousing initiatives.

He said that in today's economic climate, business intelligence (BI) faces two powerful and conflicting pressures: business users want more focused insight from their BI tools into customer satisfaction and profitability, and these same users are under huge pressure to control costs and reduce risks.

The Eight Guidelines for Low-Risk Enterprise Data Warehousing are:

1 - Work on the Right Thing

He recommends a simple technique for deciding what the right thing is. Make a list of all your potential EDW/BI projects and place them on a simple 2x2 grid, considering the business impact and the feasibility.

Figure out, with your end users, how valuable each of the potential projects would be, independent of the feasibility. Next, do an honest assessment of whether each project has high-quality data and how difficult it will be to build the data delivery pipelines from the source to the BI tool. Remember that at least 70 percent of BI project risks and delays come from problems with the data sources and meeting data delivery freshness (latency) requirements.
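
A minimal Python sketch of this prioritization grid, with made-up projects, scores and cutoffs purely for illustration:

```python
# Each candidate EDW/BI project gets a business-impact score and a
# feasibility score from 1 to 10; the midpoint splits the 2x2 grid.
projects = {
    "Customer profitability dashboard": {"impact": 9, "feasibility": 8},
    "Clickstream analysis":             {"impact": 6, "feasibility": 3},
    "Finance close reporting":          {"impact": 8, "feasibility": 7},
}

def quadrant(scores):
    high_impact = scores["impact"] > 5
    feasible = scores["feasibility"] > 5
    if high_impact and feasible:
        return "do first"
    if high_impact:
        return "high value, but fix the data sources first"
    if feasible:
        return "easy, but low business impact"
    return "avoid for now"

for name, scores in projects.items():
    print(f"{name}: {quadrant(scores)}")
```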

2 - Give Business Users Control

The transfer of control means having users directly involved with, and responsible for, each EDW/BI project. Obviously these users have to learn how to work with IT so as to make reasonable demands.

3 - Proceed Incrementally

In this era of financial uncertainty, it's hard to justify a classic "waterfall" approach to EDW/BI development. In the waterfall approach, a written functional specification is created that completely specifies the sources, the final deliverables and the detailed implementation. The rest of the project implements this specification, often with a big-bang comprehensive release.

Many EDW/BI projects are gravitating to what could be called an "agile" approach that emphasizes frequent releases and mid-course corrections. Interestingly, a fundamental tenet of the agile approach is ownership by the business users, not by technical developers.

An agile approach requires tolerating some code rewriting and not depending on fixed-price contracts. The agile approach can successfully be adapted to enterprisewide projects such as master data management and enterprise integration.

4 - Start with Lightweight, Focused Governance

Governance is recognizing the value of your data assets and managing those assets responsibly. Governance is not something that is tacked onto the end of an EDW/BI project. Governance is part of a larger culture that recognizes the value of your data assets and is supported and driven by senior executives.

5 - Build a Simple, Universal Platform

One thing is certain in the BI space: the nature of the end-user-facing BI tools cannot be predicted. We must recognize that the enterprise data warehouse is the single platform for all forms of business intelligence. This viewpoint makes us realize that the EDW's interface to all forms of BI must be agnostic, simple and universal.

Dimensional modeling meets these goals as the interface to all forms of BI. Dimensional schemas contain all possible data relationships, but at the same time can be processed efficiently with simple SQL emitted by any BI tool.

6 - Integrate Using Conformed Dimensions

Enterprisewide integration has risen to the top of the list of EDW/BI technical drivers along with data quality and data latency. Dimensional modeling provides a simple set of procedures for achieving integration that can be effectively used by BI tools. Conformed dimensions enable BI tools to drill across multiple subject areas, assembling a final integrated report. The key insight is that the entire dimension (customer, for example) does not need to be made identical across all subject areas. The minimum requirement for a drill-across report is that at least one field be common across multiple subject areas. Thus, the EDW can define a master enterprise dimension containing a small but growing number of conformed fields. These fields can be added incrementally over time. In this way, we reduce the risk and cost of enterprise integration at the BI interface. This approach also fits well with our recommendation to develop the EDW/BI system incrementally.
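
To illustrate the drill-across mechanism, here is a minimal Python/sqlite3 sketch in which two subject areas share just one conformed field (customer_name, a hypothetical example); each fact table is aggregated separately on that field and the result sets are then merged:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Two subject areas share a single conformed field, customer_name, which
# is enough to assemble an integrated drill-across report.
conn.executescript("""
CREATE TABLE fact_sales   (customer_name TEXT, revenue REAL);
CREATE TABLE fact_support (customer_name TEXT, tickets INTEGER);

INSERT INTO fact_sales   VALUES ('Acme', 1200.0), ('Globex', 800.0);
INSERT INTO fact_support VALUES ('Acme', 3), ('Globex', 7);
""")

# Drill across: aggregate each fact table separately on the conformed
# field, then merge the two result sets on that field.
for row in conn.execute("""
    SELECT s.customer_name, s.revenue, t.tickets
    FROM (SELECT customer_name, SUM(revenue) AS revenue
          FROM fact_sales GROUP BY customer_name) s
    JOIN (SELECT customer_name, SUM(tickets) AS tickets
          FROM fact_support GROUP BY customer_name) t
      ON t.customer_name = s.customer_name
"""):
    print(row)
```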

7 - Manage Quality a Few Screens at a Time

In our articles and books, Kimball Group has described an effective approach to managing data quality by placing data quality screens throughout the data pipelines leading from the sources to the targets. Each data quality screen is a test. When the test fails or finds a suspected data quality violation, the screen writes a record in an error event fact table -- a dimensional schema hidden in the back room away from direct access by end users.

The data quality screens can be implemented one at a time, allowing development of the data quality system to grow incrementally.
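
Here is a minimal Python sketch of the screen idea; the screen names, severities and error-event columns are illustrative, not the Kimball Group's actual design:

```python
import sqlite3
from datetime import date

conn = sqlite3.connect(":memory:")

# The error event fact table lives in the back room: one row is written
# for every data quality screen failure.
conn.execute("""
CREATE TABLE error_event_fact (
    event_date  TEXT,
    screen_name TEXT,
    severity    TEXT,
    record_id   TEXT
)
""")

# Each screen is a simple test applied to records flowing through the
# pipeline; new screens can be added one at a time as the system grows.
screens = [
    ("null_customer_key", "high",   lambda rec: rec["customer_key"] is not None),
    ("negative_amount",   "medium", lambda rec: rec["amount"] >= 0),
]

incoming = [
    {"id": "r1", "customer_key": 42,   "amount": 19.9},
    {"id": "r2", "customer_key": None, "amount": 5.0},   # fails the first screen
    {"id": "r3", "customer_key": 7,    "amount": -3.0},  # fails the second screen
]

for rec in incoming:
    for name, severity, test in screens:
        if not test(rec):  # suspected violation: record it, don't silently drop it
            conn.execute(
                "INSERT INTO error_event_fact VALUES (?, ?, ?, ?)",
                (date.today().isoformat(), name, severity, rec["id"]),
            )

for row in conn.execute("SELECT screen_name, severity, record_id FROM error_event_fact"):
    print(row)
```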

8 - Use Surrogate Keys Throughout

Make sure to build all your dimensions (even Type 1 Dimensions) with surrogate primary keys. This insulates you from surprises downstream when you acquire a new division that has its own ideas about keys. What's more, all your databases will run faster with surrogate keys.

Saturday, December 13, 2008

Nine Choices on the Road to BI Solution Centers


Intelligent Enterprise published this month a good article about Business Intelligence Solution Centers, by Boris Evelson and James Kobielus, analysts at Forrester Research. This article is based on the Forrester report Implementing Your Business Intelligence Solutions Center.

They said that as BI grows more pervasive, complex, feature-rich, and mission-critical, it also becomes harder to implement effectively. Many information and knowledge management professionals question whether they architect, implement, and manage their BI initiatives properly. Doing so requires sound BI and performance management best practices — and an awareness of the myriad ways it can all go wrong.
Forrester defines a BISC as: A permanent, cross-functional organizational structure responsible for governance and processes necessary to deliver or facilitate delivery of successful BI solutions, as well as being an institutional steward, protector, and forum for BI best practices.

The chief symptoms of suboptimal BI management practices include:
- The lack of a single trustworthy view of all relevant information.
- BI applications too complex and confusing to use effectively.
- BI applications too rigid to address even minor changes.

You need to customize your BISC approach: the intersections of these four dimensions — process, people, data, and technology — create multiple BISC scenarios and approaches that information and knowledge management pros must consider when developing the BISC most relevant to their BI efforts.

The nine scenarios and approaches you must consider when implementing your BISC are:

1 - Strategic Or Operational Objectives?
Some organizations deploy BISCs that are purely strategic or advisory in nature. In those organizations, the BISC takes on the role of BI champion, providing subject matter experts and overseeing BI standards, methodologies, and a repository of best practices. When these BISCs take on more operational duties, they become responsible for tasks like the BI project management office (PMO), training, and vendor management.

2 - In-house or Outsourced?
Enterprises deploying BI will need help from experienced consultants and systems integrators (SIs). This expertise is critical because BI is very much an art and will remain that for the foreseeable future, since it involves engineering a complex set of systems and data to address the changing imperatives of business organizations. As a result, most of the more successful BISC organizations include both internal and external staff.

3 - Virtual Or Physical?
Organizations have a choice of leaving their BISC staff within their lines of business (LOBs) or functional departments, or moving them to a centralized physical BISC organization.

4 - Operational or Analytical in Scope?
A BISC for some may focus on addressing the front-end access, presentation, delivery, and visualization requirements of analytic applications. Alternately, others may encompass a wider scope including data warehousing; data integration; data quality; master data management (MDM); and many other analytics-relevant infrastructures, processes, and tools.

5 - Support IT only or All Stakeholders?
Information and knowledge management pros must determine whether their organizational culture is ready to support BISC beyond BI infrastructure in scope.

6 - Type of Funding Model?
BISC can be treated as a corporate cost center, and all departments across the enterprise can use and benefit from BISC services. A cost allocation model based on the actual usage of BISC services can be fairer, but detailed, activity-based cost allocation models can be tricky to set up, implement, and manage.

7 - Narrow or Broad Scope?
Forrester recommends business leadership and business-led governance orientation, not a technology-centric focus, for the BISC. The same road map principles that apply to the best practices of implementing BI apply to the BISC: strategy first, architecture next, technology last.

8 - Performance Measurement Approach?
BISC stakeholders require transparent measurements of the success of the BISC program in order to support ongoing momentum and funding. BISC leaders must establish a clear set of BISC performance metrics and clearly communicate them on a periodic basis.

9 - Isolated or Aligned With Other Solution Centers?
No BI environment is an island from the rest of the data management infrastructure. Just as BI applications touch, depend on, and overlap with many related processes and technologies, BISCs cannot exist in isolation from other competency centers, solutions centers, or centers of excellence. Federation between the BISC and other data management competency centers is a best practice.

Forrester wrote a good report, and I think it can help companies implement an effective BI Solution Center.

Sunday, November 30, 2008

Upgrading your data integration efforts to enable Business Intelligence (BI) 2.0


I read a post on Informatica Corporation's blog, entitled Upgrading your data integration efforts to enable Business Intelligence (BI) 2.0, written by Rick Sherman.

He mentioned two good articles that talk about the concepts of BI 2.0: the first, called Business Intelligence 2.0: Simpler, More Accessible, Inevitable, written by Neil Raden and published in Intelligent Enterprise; and the second, called BI 2.0: The Next Generation, written by Charles Nichols and published in DM Review.

He said: "With the advent of ICCs (Integration Competency Centers) and robust data integration suites, companies can eliminate integration stovepipe efforts and work towards enabling one data integration backbone. Common people, processes and procedures work towards data integration.

BI 2.0 sounds cool and makes you think you need yet another BI tool. Of course that BI tool has to be the latest and greatest BI tool on the market today but don’t be fooled, it is really not about BI 2.0 but rather DI 2.0 (Data Integration 2.0.)"

I agree with him when he talks about data integration, but I think the new concepts of Data Integration are included in the concepts of BI 2.0.

Monday, November 17, 2008

Open Source BI Still Fighting For Its Share


Seth Grimes published today in InformationWeek an article about Open Source BI called Open Source BI Still Fighting For Its Share (in the PDF file of the article the title is "Fine, but Not Fine-Tuned Yet").

He heard from many executives of open source BI companies, mainly Steve Snyder of JasperReports and Richard Daley, CEO of Pentaho.

He said although the open source BI market is growing, the market remains dominated by the likes of Business Objects, Cognos, Microsoft, and Oracle.

BI suites typically cover core query, analysis, and reporting functions, and also provide data integration and dashboard visualization capabilities. Commercial open source BI vendors, notably Pentaho and JasperSoft, offer these components in free community editions with open source licenses, and also packaged with non-open source extensions in paid, supported, indemnified editions. The extensions include spreadsheet services for Microsoft Excel, Ajax interactive dashboards, and metadata abstraction layers that insulate business end users from the underlying database schema.

He also said that two additional open source BI offerings are worth considering. The first, the Eclipse Business Intelligence and Reporting Tools project, is primarily of interest to Java developers. The second is the Palo OLAP Server from German firm Jedox, which develops enterprise technologies for Excel applications, targeting enterprise planning, analysis, reporting, and consolidation apps.

One customer -- Beyond Compliance, a provider of hosted compliance management software -- uses Palo Excel-based reporting for analytical reports that include tables and graphs. Beyond Compliance harnesses the Palo OLAP Server on the back end and the non-open source Palo Worksheet Server for report distribution. "The nice thing about Palo is that we've taken report design away from developers and brought it to our client-services team, to end users," says Rick Clazie, Beyond Compliance's technology and infrastructure manager. The company doesn't measure the ROI of its open source choice in financial terms, trusting that faster reporting turnaround and extended capabilities increase client satisfaction.

He finished the article: Will others take this leap to open source BI? Gartner projects adoption will triple by 2012, implying much faster growth than the overall BI market. BI is making progress, particularly when commercially packaged to deliver usability and support lacking with free components. As people like Snyder and Clazie push these tools out to employees, that packaging, coupled with open source's lower costs, will be critical to open source BI's enterprise success.

Seth Grimes wrote a post with additional material on his blog at Intelligent Enterprise, complementing his article. I would like to highlight:

OSBI's growing appeal to enterprise end users. End users need capable, robust, and usable software.

That core software components are free makes open source attractive both to end users and to systems integrators and independent software vendors that sell products and services built on OSBI components.

With open source, baseline costs are lowered, boosting margins, and [integrators and ISVs] have the ability to customize the code or develop extensions if they wish. Customers benefit and so does the greater user community. For instance, Yves de Montcheuil, marketing VP at open-source data-integration vendor Talend, boasts that half of his company’s 250 data-source/destination connectors were contributed by users.

In my opinion, Seth Grimes gave a good explanation of the current Open Source BI landscape.

Tuesday, November 11, 2008

Cool BI: Rating The Latest Innovations


Cindi Howson published yesterday in Intelligent Enterprise a good article called Cool BI: Rating The Latest Innovations, where she examines BI innovations in terms of maturity, value, and pervasiveness:

Maturity: Consider the technology's maturity, particularly in its integration with BI. Some innovations have been making inroads into BI for years, whereas others are more recent.

Value: Consider the value of the innovation, either to reduce BI's cost of ownership, improve productivity, or increase BI's contribution to business performance. Some BI deployments focus on the value of a single, big decision. At these companies, BI is often deployed first to the experts and analysts. Other firms look for improvement on all the little decisions that may have a small individual impact yet a huge aggregate contribution to results.

Pervasiveness: While many vendors have trumpeted the rallying cry for "mainstream BI" or "pervasive BI," BI adoption even among established practitioners is relatively low, at 25 percent of employees within the company. In evaluating innovations, recognize that each will appeal to different user segments, as illustrated in the chart in her article. Advanced visualization, for example, is powerful for business analysts, whereas BI search is ideal for casual users, and embedded BI helps front-line workers. Mobile BI has big potential for executives and front-line workers, but the business analysts who are currently the largest BI constituency may say it's not at all important. Innovations that make BI more pervasive are not necessarily more important than those that benefit power users; the point is to recognize that different user segments will benefit more from certain innovations.

She defines a quadrant chart placing several BI innovations according to these three factors: maturity is along the X axis, pervasiveness is along the Y axis, and the size and color of each bubble indicate the financial value (the chart itself appears in her article).


She describes some of them and also a bottom line about the innovation:

Rich Reportlets

The term "rich reportlets" refers to the influence of rich internet applications (RIA) in the BI world.
Bubbles and trend lines dancing across a page may not seem essential to BI techies, but ask business users which tool they'd rather use and they'll vote for a rich reportlet. When you are trying to change the way people work, don't underestimate the power of the cool factor in getting them to try new tools.
Beyond the appeal of these interfaces, rich reportlets contribute substance to BI.

Bottom line: embrace rich reportlets and interfaces for their ability to make BI engaging to existing users and more appealing to new classes of users.

In-Memory Analytics

In-memory analytics has been around for years, so why the recent buzz? The answer lies in a combination of:
- Increasing adoption of 64-bit operating systems and greater availability of addressable memory
- Lower costs for random access memory (RAM)
- How in-memory analytics are used and delivered
- Shifts in user expectations for speed-of-thought analysis

Bottom Line: There was a time when users were happy to get a weekly report. Today, the pace of business demands fast data access and even faster insight. 64-bit technology provides greater scalability to in-memory tools and is gaining greater adoption and support.

Web 2.0 Meets BI

The collaborative side of the Web is so prevalent and accepted by a new generation of workers that we have to contemplate the effects on BI. She suggests that social networking could solve some of BI's greatest challenges:

- Information overload
- Information hoarding
- Providing insights to the most appropriate people to decrease the time to decision

Bottom Line: Some of social networking's influence on BI certainly has a high "cool" factor, but the business value is not yet proven. Nonetheless, the potential to extend the reach of BI, reduce information overload and give users greater autonomy make this an important area to watch.

She also talked about a few other innovations: Open Source BI, Software as a Service, BI Search, Mobile BI and Predictive Analytics.

About Predictive Analytics, she said: This is a mature capability but one that continues to remain a specialist tool and task. Encompassing predictive analytics within a total BI deployment means more people can benefit from predictive discoveries and insights. This is both a strategic issue and a technical issue. With SAS and Teradata closely partnering and SPSS striking deals with Business Objects and Cognos, the convergence of BI and predictive analytics continues to gain momentum.

Cindi Howson is an expert of BI Industry and this is a good article about the BI innovations. She is the founder of BIScorecard, a Web site for in-depth BI product reviews.

Saturday, November 1, 2008

IBM Information On Demand 2008


IBM Information On Demand 2008 took place this week (October 26-31, 2008, at the Mandalay Bay, Las Vegas, Nevada).

According to IBM's brochure:
IBM Information On Demand 2008 incorporates seven previous information management conferences, including IBM DB2®, IMS™ Technical Conference, Content Management Technical Conference, Business Intelligence Customer Solution Summit, Master Data Management Conference, Information Integration Live and Americas UserNet Conference. Learn how to unlock the business value of your information. You’ll discover everything you need to know about data management, enterprise content management, information integration, master data management, business intelligence and performance management.

Here are some links with news, posts, and articles about the conference:

- Cognos solutions

- B-Eye-Network
. Blog: Richard Hackathorn - He did a very nice coverage, with several posts about the Conference.
. Blog: Shawn Rogers

- Intelligent Enterprise - Doug Henschen
. IBM's On Demand Push: Greater than Sum of Parts
. IBM, Oracle and the Appliance Campaign Trail

- eWeek - Brian Prince
. IBM Pushes Forward with Its Information On Demand Vision
. IBM Expands Information On Demand Software Portfolio

- ItWorldCanada - Eleven things heard and seen at IBM’s Information on Demand conference - Kathleen Lau

Saturday, October 25, 2008

Teradata Partners User Group Conference


The Teradata Partners User Group Conference & Expo took place last week (October 12-16, 2008, at the Mandalay Bay Resort and Convention Center, Las Vegas, NV).

The Teradata Partners User Group Conference & Expo is a world-class, customer-driven conference that provides the very best opportunities to learn, grow, and increase data warehousing expertise.

The Partners Conference offers more than 200 sessions tailored to business, technical, management, and executive attendees. Many of these sessions are presented by Teradata customers, who share insights gained from implementing and managing their own data warehouses.

Here are some links with news, posts, articles and podcasts about the conference:

- B-Eye-Network
. Blog: Richard Hackathorn - He provided very nice coverage, with several posts about the conference
. Blog: William McKnight - He also provided very nice coverage, with several posts about the conference
. Several Podcasts

- Intelligent Enterprise - Mark Madsen
. Teradata's Tectonic Shift
. Teradata Adds to a Growing Portfolio
. My Takeaway on Teradata's Keynotes

- Destination CRM - Lance Armstrong Rides Again - Lauren McKay

Monday, October 20, 2008

Five Steps to Optimizing BI and Data Warehouse Performance


David Stodder published an interesting article today in Intelligent Enterprise called Five Steps to Optimizing BI and Data Warehouse Performance. He is vice president and research director at Ventana Research, and the article is an executive summary of the full report, entitled Optimizing BI and Data Warehouse Performance.

Stodder notes that the pressure is on business intelligence and data warehousing professionals to handle ever-higher data volumes and ever-more-complex queries while reducing decision latency.


Optimizing BI and data warehouse performance is vital to meeting business objectives. Based on the results of Ventana's recent research, and on knowledge of best practices involving people, processes, information and technology, Ventana Research recommends the following five steps toward BI and data warehouse performance improvement:

1. Let business drivers and benefits direct performance improvement efforts. Before taking steps to improve BI and data warehouse performance, make sure you understand the purpose of these systems, the objectives they serve and the benefits that your organization expects to derive from optimization. This knowledge will help you set priorities for tuning current systems and augmenting them with new systems.

2. Improve information assets for analyzing and tuning performance. Determine early on what information sources your organization uses to understand performance. In most organizations, the sources are diverse and include both people and systems.

3. Use performance demand to guide deployment of appliances, specialized databases and query accelerators. Determine your strategy for deploying appliances, specialized databases and query accelerators based on analysis of performance demand. A growing trend is use of BI and data warehouse appliances, which offer preconfigured combinations of software, hardware and storage systems.

4. Reduce the time it takes to remedy unsatisfactory performance and implement information change requests. Make it a goal for your organization to improve your users' experience by addressing problems and implementing information change requests rapidly. A critical factor in improving response time is to know when performance demands will be at their highest so you can plan resources accordingly (a minimal example of this kind of peak-demand analysis appears after this list).

5. Assess your organization's maturity and invest for improvement.
Ventana Research measures organizational maturity using a four-level scale. In this benchmark research, most participating organizations rank at the middle levels, with 38 percent at the third-highest Strategic level and 29 percent one step down at the Advanced level.
Use these results to assess your own maturity and to determine where you can apply improvements in terms of your people, processes, information and technology.
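
As a concrete illustration of step 4's advice to know when performance demands peak, here is a minimal sketch of my own (not from the Ventana report): it scans a hypothetical query log and ranks hours of the day by total query runtime, so capacity can be planned for the heaviest periods.

from collections import Counter
from datetime import datetime

# Hypothetical query log entries: (ISO timestamp, runtime in seconds).
query_log = [
    ("2008-10-20 09:05:11", 4.2),
    ("2008-10-20 09:17:40", 7.9),
    ("2008-10-20 14:02:03", 1.1)]

# Accumulate total query runtime per hour of day.
load_by_hour = Counter()
for timestamp, runtime in query_log:
    hour = datetime.strptime(timestamp, "%Y-%m-%d %H:%M:%S").hour
    load_by_hour[hour] += runtime

# Hours ranked by load; the top entries mark peak demand.
for hour, total in load_by_hour.most_common():
    print(f"{hour:02d}:00 - total query time {total:.1f}s")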

The complete "Optimizing BI and Data Warehouse Performance" report, available for purchase from Ventana Research, offers extensive detail on the benchmark research, including 15 charts and graphs covering participant demographics and their people, process, information and technology maturity.

This is an interesting article, and I think the report is also an interesting piece of benchmark research on this important issue.

Wednesday, September 24, 2008

The Semantic Web and Web 3.0


Tomorrow, September 25th at 3 PM ET, DM Review, through its DM Radio initiative, will provide a live Web broadcast called The Semantic Web and Web 3.0, hosted by Eric Kavanagh with Jim Ericson.

According to DM Review: "You've likely heard of the Semantic Web, which some pundits say will usher in Web 3.0. More than just a new spin on an old domain, the Semantic Web promises to essentially re-architect Internet content in a way that facilitates a new level of information integration. What will that mean for the burgeoning industry of business intelligence and data warehousing? How will this Semantic Web affect business strategy and decision-making, and what impact will it have on the lucrative enterprise software market for information management?

Tune into this episode of DM Radio to hear the experts explain what's happening and why. We'll talk to Hired Brains co-founder Neil Raden, who will outline the Web Ontology Language (OWL), which is built on the Resource Description Framework (RDF), a relatively new W3C standard for describing Web resources, such as the title, author, modification date, content, and copyright information of a Web page. We'll also hear from Jason Hekl of InQuira, and a special guest.

Attendees will learn:
- Why the Semantic Web changes the rules of information management
- What impact the Semantic Web could have on enterprise architectures
- How organizations can transition from Web 2.0 to Web 3.0
- How relational information models will be affected
- What a declarative ontology is all about."
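
As a concrete illustration of the RDF triple model mentioned above, here is a minimal sketch of my own (not from the broadcast), using the Python rdflib library: each statement is a (subject, predicate, object) triple describing a Web resource, exactly the kind of metadata (title, author) the quote mentions. The URL and values are hypothetical.

from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DC

g = Graph()
page = URIRef("http://example.org/semantic-web-article")  # hypothetical resource

# Each add() asserts one (subject, predicate, object) triple.
g.add((page, DC.title, Literal("The Semantic Web and Web 3.0")))
g.add((page, DC.creator, Literal("Eric Kavanagh")))

# Serialize the graph as Turtle, a human-readable RDF syntax.
print(g.serialize(format="turtle"))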

On the DM Review Web site, you can register for this live Web broadcast.

You can also check out the DM Radio archives to hear previous programs on a variety of other issues.

I think this episode of DM Radio will be very good: the subject is interesting, and I very much enjoy reading Neil Raden's blog in Intelligent Enterprise. He is an experienced professional with good ideas and an interesting point of view on business and technology issues, and he is co-founder of Smart (enough) Systems, a company specializing in Enterprise Decision Management.

Monday, September 15, 2008

Choosing BI, BAM or CEP


Intelligent Enterprise published a very nice article today by Doug Henschen, in which Gartner analyst Roy Schulte answered questions about event processing (EP).

Roy Schulte made several interesting comments:
- Event processing (EP) comes in three varieties: low-latency BI dashboards, lower-latency Business Activity Monitoring (BAM), and ultra-low-latency complex event processing (CEP).

- The BI vendors are certainly moving toward BAM. If you put up a BI dashboard and you're refreshing it, say, every ten minutes, which you can do with any BI dashboard, then you are doing BAM with traditional BI tools. BAM is really a style of deployment rather than a distinct technology type, so I would say BAM can be done with traditional BI tools.

- When it comes to low-latency BAM, where you're trying to refresh the dashboard every few seconds, traditional BI tools aren't fast enough. That's when you need BAM products that are purpose-built to run fast.

- If you're doing something that involves a human being, you probably don't need CEP. People can't work that fast, so in those applications you may be able to use BI or BAM. When you're dealing with machine speeds, that's where CEP comes in (a toy sketch of this idea appears after these comments).

- Customer experience management is a great use for CEP because you can't control it, you can't predict it, it's high volume and it's continuous.

- I'm excited about BAM — not necessarily the term "BAM," because most people don't use it, but the concept. I think this whole approach of creating dashboards and giving visibility to business people is such a big win.
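
To make the BAM/CEP distinction concrete, here is a toy sketch of my own (not from the interview): a CEP-style detector that watches a continuous event stream and fires an alert when more than a threshold number of events arrive inside a sliding one-second window, the kind of machine-speed pattern matching no human watching a dashboard could do. The threshold and event times are hypothetical.

from collections import deque

WINDOW_SECONDS = 1.0  # sliding window length
THRESHOLD = 3         # hypothetical alert threshold

recent = deque()  # timestamps of events still inside the window

def on_event(timestamp):
    recent.append(timestamp)
    # Drop events that have slid out of the window.
    while recent and timestamp - recent[0] > WINDOW_SECONDS:
        recent.popleft()
    if len(recent) > THRESHOLD:
        print(f"ALERT at t={timestamp:.2f}: {len(recent)} events in the window")

# Hypothetical burst of events (timestamps in seconds).
for t in [0.1, 0.2, 0.25, 0.3, 0.35, 2.0]:
    on_event(t)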

I think the concepts and technologies are evolving dramatically, so you should consider your needs carefully, choose the right technology, and start with the right architecture.

Friday, September 5, 2008

Business Intelligence Gets Smart


Intelligent Enterprise published an article today about the trends top companies are incorporating into their BI initiatives, called Business Intelligence Gets Smart, written by Doug Henschen, Intelligent Enterprise's Editor-in-Chief.

The article discusses the results of an InformationWeek Special Report survey (download here; requires free registration) of 358 business technology professionals, and what companies are doing in their BI initiatives.


I would like to highlight:
- Top companies have managed to centralize BI planning and standardize practices, capabilities, and technologies.
- Companies are getting more competitive with the aid of low-latency data access and near-real-time analysis, but today you need more than broad deployments or innovative technologies. Successful companies are also far more likely to report "pervasive" or "fairly broad" BI adoption.
- Users are looking for BI tools that are easy to use, as we can see from "complexity of tools and interfaces" being ranked first among impediments to success by survey respondents. Vendors have responded to demands to make BI more accessible and user-friendly with Web-based options, including visual dashboards and key performance indicators, that can be embedded within portals or applications.
- Few capabilities are more widely deployed than integration with desktop applications, with Microsoft Excel usually the tool of choice. Within the last year, vendors including Microsoft and SAP's Business Objects unit have added two-way integration functions that let authorized users edit or revise information in Excel and then update the central repository.
- Complicated interfaces and data latency notwithstanding, visionaries and some vendors increasingly say we should look beyond BI to automating decisions, using the concepts of Enterprise Decision Management (EDM).
- Contextual BI has been around for years in the form of "smart" or "analytic" business applications, such as customer relationship and supply chain management tools with embedded reporting and analysis capabilities.
- Performance management applications are designed to not just provide insight—the part powered by conventional BI—but also to help people take action to improve the performance of the business.

I also find very interesting the prediction of Marge Breya, executive VP and general manager at SAP's Business Objects unit, who said: "Within three to five years, there will not be an application on the face of the planet that does not have embedded BI. The question is, what do you do when you have to look at information outside of the application?"

BI tools are increasingly smart and pervasive, and companies can benefit from them.