What lurks in the shadows?

Today’s economic climate has made it imperative that business users be able to react faster to changing business conditions. To do this they need ready access to data and agile analytics to support important decisions.

Perhaps in the not-so-distant past, decision makers didn't feel quite the same urgency. That's no longer the case. Business users simply can't wait; the consequences are too severe. They need their data, and they need it now.

Spurred on by this urgency, enterprises continue to make big investments in data and analytics, despite uncertain economic conditions. According to results from New Vantage Partners in their Big Data and AI Executive Survey, 99% of the firms they surveyed in 2021 reported active investments in analytics, Big Data and AI.

Unfortunately, the return for many aspiring data-driven organizations remains low. They still have trouble getting access to the data they need; they still use legacy tools to access offline data; they still perform AI/ML projects in isolation; and—perhaps worst of all—despite all these investments, decision makers still don't trust the analytics and data they’re seeing.

Organizations in the SAP community, in particular, are suffering from these challenges. According to a recently published UK & Ireland SAP User Group Data Analytics Report, which is based on an SAP user survey conducted in late April, fewer than half (44%) of those surveyed believe their organizations have the necessary analytics and intelligence technologies available to make effective use of their SAP data.

Twisting the knife, the same SAP research revealed that only a third (32%) of respondents think their organizations have the business and data skills necessary to use their data effectively.

As a result, business users are instinctively driven to pursue their own answers with their own tools using their own data. It's Shadow BI 2.0. And multiplying crises seem to be accelerating the wrong kind of transformation. Transmogrification, not transformation.

A recent report by the analyst firm BARC, Analytics Unchained (published in May 2021), suggests the original shadow BI (let's call it Shadow BI 1.0) was initially powered by Excel. The more virulent form of shadow BI (Shadow BI 2.0) can be traced to ungoverned, self-service BI tools:

“Change came a decade later with the advent of user-friendly self-service BI and visualization tools. Early resistance was largely overcome when these tools were embraced as an opportunity to eradicate the bottlenecks created by cost-oriented BICCs. Unfortunately, a general lack of governing capabilities, originally perceived as guarantors of flexibility, and the dissemination of analytics into all corners of modern companies overstrained the approach.”

BARC argues that the way forward is a vertically integrated approach to BI software: “To overcome the defects of earlier generations of analytics and BI software, vertically integrated data and analytics software couples the flexibility required for quick insights with governance features for scaling decentralized self-service and blending it with central delivery.”

The consequences of not addressing tool proliferation are well-understood: a Frankenstein’s monster of tools and data chaos. Multiple versions of the truth. Data insecurity. Data mistrust. Only now it’s amplified and threatening to spiral out of control.

So, the challenge of the day is this: how do you create the environment for a data culture to thrive, where business insights can be revealed in minutes, without specialized training, and without risking data accuracy or security? And more critically, how do you do it without bringing yet another BI tool into the mix?

The problem of tool proliferation

In any modern organization, business users have become accustomed to complexity. Every day, they deal with complicated data stacks and a multitude of tools for accessing them, which makes getting to the data and working with it a struggle.

According to BARC in their BI Survey 21, data quality is a fundamental prerequisite for any successful BI project. However, the analyst firm cited “the proliferation of self-service BI tools that allow for data definition, import and modeling on a user’s desktop” as the second most vexing problem after poor data quality.

So many tools, so little time. What if the secret to analytics adoption wasn't a myopic focus on specific tools, but instead a focus on platforms designed for the broader organization's analytics needs, platforms that harness the power of the backend data infrastructure already in place?

What if fewer tools meant less context switching, less time learning new processes? In short, more time spent solving problems with data, and less time fiddling with the controls. And what if the analytics front end could protect those backend investments while delivering modern functional capabilities?

Doing more with less

Nowadays, professionals across the organization expect to work with lots of software, and they're not afraid of the consequences of plucking whatever tool they need to solve the problem at hand. This has to stop. Data leaders need to take the initiative and focus on simplifying the data and analytics environment so that broader segments of the organization are collaborating.

What's the solution? Choose analytics and BI platforms purpose-built for flexibility, ones that can connect all data sources without playing favorites, and that can even complement existing tools to serve short-term needs and bridge the gap. Choose ones designed not for one type of user but for many types of users, so all users can work together in the same environment. Choose ones that can operate anywhere: cloud, on-premises, or both. That way you prepare yourself for any type of data configuration, today and tomorrow.

That's the good news in all this. When it comes to digital transformation, data leadership can absolutely influence how users interact with software to harness their data. The right tools can serve the needs of the group. This drives adoption, which, crucially, drives a thriving data and analytics culture.