The field of Business Intelligence (BI), despite making significant advances and providing a multitude of useful features, faces an unusual paradox today. According to a Gartner survey, BI and analytics adoption in most organizations remains low, at around 30%. Although companies continue to invest in new technologies, on the ground BI still faces obstacles to becoming pervasive. Business leaders today continue to struggle to ask simple questions of their data, such as

“who are my most valuable customers?”

“why has revenue declined this month compared to last month?”

without three or four levels of requests that eventually land on a team of analysts.


So What Makes a BI Tool “Self-Service?”

Very often, a one-size-fits-all approach is deployed in implementing a BI or data analytics culture. A single BI tool is often expected to fulfill enterprise reporting, enable self-service BI, and allow data scientists to implement complex algorithms. This may be due to cost constraints or ideological inclinations, but such an approach prevents the chosen BI tool from being deployed efficiently and becoming truly self-service. Thus, an organization often needs to balance its needs for:


  1. Well-designed, good-quality decision-support data in a data warehouse.
  2. An efficient Business Intelligence tool which is able to query and present this data with the least amount of effort.
  3. A pool of business users trained to analyze the key takeaways from the analytical tool and derive useful business insights for the benefit of the organization, and
  4. A small but efficient IT team which supports the business teams.

While many solutions promise to cater to the analytical needs of the modern enterprise with more tools and technology, in the words of distinguished software engineer Grady Booch, "A fool with a tool is still a fool."


8 Reasons Why Business Intelligence Tools Have Failed to Deliver True Self-Service


We have identified eight factors that hinder modern BI tools from becoming truly self-service and result in low adoption rates of pervasive BI in organizations:


1. Lack of Access to Clean Data

Very often, the transactional platforms employed by organizations leave many fields optional or as free-form text. Over time, these free-form fields prove to be the biggest obstacle to efficient data processing. A properly designed data entry form with appropriate validation goes a long way in securing the quality of data generated in an organization. There are valid use cases for fields that hold a comma-separated collection of values; however, when such fields occur frequently, cleaning and processing the data becomes complex.
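To make the cleaning burden concrete, the comma-separated multi-value fields mentioned above typically need a normalization pass before they can be analyzed. A minimal sketch follows; the field name and the allowed vocabulary are hypothetical examples, not taken from any particular platform:

```python
# Minimal sketch: normalizing a free-form, comma-separated "tags" field.
# The allowed vocabulary below is a hypothetical example.
ALLOWED_TAGS = {"retail", "wholesale", "online"}

def clean_tags(raw: str) -> list[str]:
    """Split a comma-separated entry, trim whitespace, lowercase,
    and drop duplicates and values outside the allowed vocabulary."""
    seen = []
    for part in raw.split(","):
        tag = part.strip().lower()
        if tag in ALLOWED_TAGS and tag not in seen:
            seen.append(tag)
    return seen

print(clean_tags("Retail ,  ONLINE, retail, misc"))  # ['retail', 'online']
```

Validation at entry time (a dropdown or multi-select instead of free text) avoids the need for this kind of after-the-fact cleanup entirely.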

The fact remains that despite advances in data capture, governance, and automation, organizations typically do not have access to clean data. Usually, the data collected during different business operations, when merged together, has no unifying format, may be massaged, and is neither complete nor consistent. In fact, cross-industry studies have shown that less than half of structured data and less than 1% of unstructured data is actively used in decision-making in organizations. However, enterprises that 'win with data' are those who maximize the use of data regardless of these limitations. Understanding that your data is not perfect is the first step towards using it in the right way to make decisions. This subsequently results in progressive improvements in the quality of data, and incremental benefits from the consumption of data over time.


2. Focus on ‘Pretty Graphs’

Visualizations help an organization contain the proliferation of spreadmarts (users retaining pools of data on their machines). An important aspect to keep in mind, however, is that an excessive preference for beautiful visualizations often leads to more time spent beautifying the data than meaningfully consuming the analytics. Aesthetic presentation of data is important, but it is far more important to use data to gain analytical insights; it is a bonus if this also happens in an aesthetically pleasing format.

The challenge here comes from the fact that most times, companies get too caught up in using BI tools for visualizations and dashboards instead of answering the questions that matter, in a timely manner. Users no longer want to decipher 'canned reports' and expect their data & analytics platform to proactively alert them of critical changes in their KPIs for decision making. These alerts need to be relevant to the function/role of the user so as to remove the 'clutter' from heaps of visuals. Further, no matter how many visualizations are created, differences in skill sets between users of various functions are inevitable. A modern BI platform needs to aid users with natural language generated explanations of visualizations for effective data storytelling and decisions.
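At its simplest, the proactive KPI alerting described above is a period-over-period change check against a threshold. The sketch below illustrates the idea; the KPI names, values, and the 10% threshold are hypothetical:

```python
# Minimal sketch of proactive KPI alerting: flag metrics whose
# period-over-period change exceeds a threshold.
# KPI names, values, and the threshold are hypothetical examples.

def kpi_alerts(current: dict, previous: dict, threshold: float = 0.10) -> list[str]:
    """Return human-readable alerts for KPIs that moved more than
    `threshold` (as a fraction) since the previous period."""
    alerts = []
    for name, now in current.items():
        before = previous.get(name)
        if not before:
            continue  # no baseline to compare against
        change = (now - before) / before
        if abs(change) >= threshold:
            direction = "up" if change > 0 else "down"
            alerts.append(f"{name} is {direction} {abs(change):.0%} vs last period")
    return alerts

print(kpi_alerts({"revenue": 90_000, "orders": 1_020},
                 {"revenue": 100_000, "orders": 1_000}))
# ['revenue is down 10% vs last period']
```

In a real platform, the thresholds would be tuned per role and the alerts routed to wherever the user already works, rather than buried in a dashboard.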


3. Clunky User Interfaces

A tool with a shallow learning curve helps users deliver actionable insights as fast as possible. Ideally, it should also give users the ability to modify the implementation and adapt to data changes quickly. Tools with a small architectural footprint help optimize hardware, support, and maintenance costs.

Despite their advanced functionalities and features, modern BI tools still tend to be complicated and unwieldy for end-users: filters, menus, and hundreds of pages of help manuals to get the simplest insights from your data. This deters them from making full use of the tool and ultimately lowers adoption rates across the organization. Clunky interfaces are all too common in today's tools, and this acts as another obstacle to BI becoming self-service.


4. Lack of Collaboration

A self-service BI tool becomes all the more useful when it incorporates the mechanism to collaborate with other participants in the organization and share content. The ability to securely deliver data and analytics to entitled participants is also crucial.

The reality of the matter is that there are very few BI tools out there which allow employees to collaborate on the work they do and on generating insights. Leading tools today continue to adopt an archaic method of sharing embedded datasets via email instead of helping users collaborate with data where most of their work and decisions happen. This is more apparent with the new reality we live in, and the advent of the remote workplace, where the majority of a knowledge worker's time is spent on platforms such as Slack and Microsoft Teams. Without this capability, BI cannot be called self-service, because people still depend on each other for insights yet work on them in isolation. Such bottlenecks result in slow turnaround times, unnecessary back and forth between team members, and limited visibility and passive learning within collaborative work environments.


5. Rigid Data Governance

An organizational policy that introduces the flexibility to choose the appropriate tool for each need (complex regulatory reporting, fixed-format reports that require specific visual motifs such as organizational logos and tabular layouts, or the ever-expanding demands of diagnostic, predictive, and prescriptive analytics) empowers analytics users to focus on analyzing data instead of struggling with the complexities of tools meant for fixed-format reporting and dashboard-based visualization.

Despite the advances in BI, people continue to work in silos, in isolation from each other. Few organizations have governance frameworks and policies that ensure that data and BI tools are both flexible and accessible. In the absence of such frameworks and common access, it becomes difficult for BI tools to be truly self-service. A tool geared towards empowering end-users to run their own analytics and advanced statistical models to unearth insights, while keeping the complex backend processing to a minimum, is the need of the hour today.


6. Low Data Literacy Rates

Data literacy is the ability to understand and analyze the data you work with. One of the main reasons for low adoption rates is that employees at the ground level simply do not have the requisite knowledge to work with the data that they own and instead continue to view data analysis as an IT-focused task. Naturally, this makes them reluctant to use new tools and fully integrate BI within their workflows. However, innovative companies such as Amazon and Netflix are confidently moving away from this position and empowering their employees to obtain various analytics capabilities. In fact, Gartner predicts that by 2023, data literacy "will become an explicit and necessary driver of business value."

7. Complex Workflows

BI also fails to become pervasive because of the complexity of the tools themselves. When the interface is not intuitive or easy to use, adoption rates plummet, and individuals usually stop investing their time in these tools and return to traditional programs like Microsoft Excel. This also applies to situations where integrating the BI tool with legacy or third-party databases and other enterprise systems is too complicated for the average user to perform. Change management frameworks and investing in training and education are absolutely essential to encourage those employees who want to be part of an analytics culture but are either reluctant or do not know how to start.


8. Inappropriate Choice of Tools for Intended Users

Oftentimes, organizations embark on their BI journey with good intentions but ultimately end up deploying a tool that does not have the required capabilities to serve business needs. One pitfall is that IT departments select tools based on their own understanding of data, which includes SQL and other programming skills, a good understanding of data models, and so on. However, the end-users of a BI tool are not always data-savvy: they only know what questions to ask and expect answers to those questions in the simplest possible way. Hence, while organizations expect the tool to be extensible and to incorporate advanced data science models, in reality this can prove to be hard, because incorporating advanced features requires specialist skill sets and proprietary scripting skills. As a result, the tool may only be able to offer basic descriptive analytics and visualization capabilities that do not fully answer the questions of the end-users.


Ultimately, it is important to realize that the challenges to BI adoption are multi-faceted. If an organization falls short in any one area, it could adversely affect adoption rates and attitudes towards Business Intelligence and analytics in the company. The key to successful adoption lies in establishing a strong data culture, reducing time to implementation, and choosing an easy-to-use and collaborative system which is responsive to user needs and flexible.

It is our focus at Unscrambl to bring to you an AI-powered data analyst that is accessible 24/7 and simplifies the way you access and consume data & insights. Explore QBO and embark on a 14-day free trial to experience the benefits that BI and analytics can bring for your business!