Three Common Pitfalls of Self-Service Business Intelligence

Self-Service Business Intelligence (BI) is on the rise. Where the creation of reports and visualisations was previously mainly the domain of the BI department and the data analysts, users can now create these themselves. In the effort to make this possible, however, something is often overlooked.

When introducing Self-Service BI, a lot of time, effort and money is invested in making data available in a usable way. Datasets are defined, built and published, aimed at answering the most important questions unambiguously. Cubes of pre-aggregated data are often replaced by tabular models in which users can combine data to their heart’s content. These are aimed not only at answering previously formulated questions, but also at making it possible to answer ad hoc ones.

Theory vs Reality

Users are trained in the use of tools, such as Microsoft Power BI, and then get to work with the data. The result is an organisation where end users can quickly extract insights from the available data independently. In theory.

As is usually the case, reality is more complicated, and there are a number of things that need to be taken into account, three of which are discussed below.

Knowledge retention

The first is that the training users receive often covers no more than the basic principles of the tools. That in itself need not be a problem: if a user can carry out the most elementary operations and create visualisations, this can already be very valuable, and more complex questions can still be forwarded to the analysts or the data team. However, most users who have completed a training course barely use the acquired knowledge in the period that follows, which makes that knowledge hard to retain over time. Only those who get the opportunity to put it into practice frequently will retain it and even deepen it; often these are the people in an analyst role. The others remain dependent on those analysts for most of their information needs.

Constantly evolving tooling

The second thing that occurs is partly related to the first.

The tooling is developing rapidly. New, often very welcome, functionality is added almost every month, and existing features are constantly improved. The training that users have followed, however, is always based on the functionality available at a given moment. And because training material is updated far less frequently than the tooling it covers, there is always functionality that remains underexposed. To make optimal use of the tooling, a user therefore has to make an effort to keep up, something that is not self-evident for an infrequent user.

Consuming vs Building

Thirdly, despite all these efforts, most of the organisation will be consumers, not builders. That means they will consume information products that others have built with the new tooling. In our opinion, too little attention is paid to training those consumers. In a static report, an Explanatory information product, the designer has tried to visualise as clearly as possible what she wants to bring to attention, emphasising the important information. Exploratory information products, however, reports or dashboards in which the user can select and filter data, demand different abilities. Those abilities consist of two parts: an analytical part and a skill part.

The analytical part is the ability to formulate questions and answer them with data. The skill part is the ability to use the functionality of the tools with which the information product was built in order to get to those answers. The latter is not only underexposed in training courses; it also evolves with the tooling. When a designer of information products incorporates newly offered functionality into the end product, but the user does not know how to use it, or even that it exists, the result is a poorer user experience and weaker analyses.

The first two points are inherent to how organisations function and are something to take into account, rather than something that we believe needs to change. The third point, however, is. More on how to tackle it in a subsequent blog post.

Let’s connect

Are you ready to work more easily and cost-effectively? Please contact us for more information or a no-obligation consultation. We will get back to you as soon as possible!
