For decades, business intelligence (BI) and analytics tools have promised a future where data can be easily accessed and transformed into information and insights for making timely, reliable decisions. However, for most, that future has not yet arrived. From the C-team to the frontline, employees rely heavily on technical teams to understand data and gain insights from dashboards and reports. As the CEO of a data and decision intelligence company, I’ve heard countless examples of the frustration this can cause.
Why, after 30 years, does traditional BI fail to deliver value? And why do companies continue investing in multiple, fragmented tools that require specialized technical skills? A recent Forrester report shows that 86% of companies use at least two BI platforms, with Accenture finding that 67% of the global workforce has access to business intelligence tools. Why, then, is data literacy still such a prevalent issue?
In most cases, the inaccessibility of analytics stems from the limitations of today’s BI tools. These limitations have perpetuated several myths that are widely accepted as “truths.” Such misconceptions have undercut many businesses’ attempts to deploy self-service analytics, along with their ability and willingness to use data in crucial decision intelligence.
Myth 1: To analyze our data, we’ve got to bring it all together
Traditional approaches to data and analytics, shaped by BI’s limited capabilities, require bringing a company’s data together in one repository, such as a data warehouse. This consolidated approach requires expensive hardware and software, costly compute time if using an analytics cloud, and specialized training.
Too many companies, unaware that there are better ways to blend data and apply business analytics to it for intelligent decision-making, continue to resign themselves to costly, inefficient, complex and incomplete approaches to analytics.
According to an IDG survey, companies draw from an average of 400 different data sources to feed their BI and analytics. This is a Herculean task that requires specialized software, training and often hardware. The time and expense required to centralize data in an on-premises or cloud data warehouse inevitably negates any potential time savings these BI tools should deliver.
Direct query brings the analytics to the data, rather than the reverse. The data doesn’t need to be pre-processed or copied before users can query it; instead, users query selected tables in the source database directly. This is the opposite of the data warehouse approach, yet many business intelligence users still rely on the latter. Its time-creeping effects are well known, but people mistakenly accept them as the cost of performing advanced analytics.
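The contrast can be sketched in a few lines. This is an illustrative example only — the `sales` table and its columns are hypothetical, and SQLite stands in for whatever engine actually holds the data. The point is that pushing the aggregation down to the database (direct query) returns only a small result set, whereas the extract-and-copy pattern hauls every row into the application first:

```python
import sqlite3

# Stand-in for the database where the data already lives.
# Table name and columns are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("east", 50.0), ("west", 75.0)],
)

# Extract-and-copy: pull every row into application memory, then aggregate.
rows = conn.execute("SELECT region, amount FROM sales").fetchall()
totals = {}
for region, amount in rows:
    totals[region] = totals.get(region, 0.0) + amount

# Direct query: push the aggregation down to the database, so only the
# small aggregated result crosses the wire -- no pre-processing or copying.
direct = dict(
    conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
)

assert totals == direct  # same answer, far less data moved
```

Both paths produce the same totals; the difference is how much data has to move before the answer appears, which is what dominates cost at warehouse scale.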
Myth 2: Our largest datasets can’t be analyzed
Data exists in real time as multiple, fluid streams of information; it shouldn’t have to be fossilized and relocated to the analytics engine. However, in-memory databases that rely on such a method are a staple of business intelligence. The issue is that a business’s most extensive datasets quickly become unmanageable — or outdated.
Data volume, velocity and variety have exploded over the last five years. As a result, organizations need to be able to handle large amounts of data regularly. However, legacy BI tools — some dating back to the 1990s, long before the advent of cloud data, apps, storage and pretty much everything else — rely on in-memory engines to analyze data, creating the sense that it’s an unwinnable battle.
Businesses can solve the problems inherent in in-memory engines by going directly to where the data lives, permitting access to larger datasets. This also future-proofs an enterprise analytics program. Direct query makes it infinitely easier to migrate from on-premises to cloud services such as those provided by our partners, AWS and Snowflake, without entirely rewriting code.
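One reason direct query eases migration is that the analysis stays expressed as plain SQL against whatever engine holds the data, so swapping the backend means changing a connection, not rewriting the analytics. A minimal sketch, assuming a hypothetical `sales` table and using SQLite as a stand-in for an on-premises database (a cloud warehouse's own DB-API driver could be passed in unchanged):

```python
import sqlite3

# The analytic logic lives in one SQL statement, independent of the backend.
REVENUE_BY_REGION = "SELECT region, SUM(amount) FROM sales GROUP BY region"

def revenue_by_region(conn):
    """Run the same analytic query against any DB-API-style connection."""
    return dict(conn.execute(REVENUE_BY_REGION))

# Stand-in for an on-premises database; table and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("west", 75.0)],
)

result = revenue_by_region(conn)
assert result == {"east": 100.0, "west": 75.0}
```

Migrating this analysis to a cloud service would mean constructing a different connection object and pointing it at the new host; the query and the surrounding code stay the same.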
Myth 3: We can’t unify our data and analytics efforts within the organization
Too often, common practice is conflated with best practice. Ad-hoc selections and combinations of BI tools produce a cocktail of preference and functionality — with organizations frequently taking department-by-department approaches. Sales might like one platform; finance may prefer something different, while marketing could elect yet another option.
Before long, each department has a different set of tools, creating information silos that make it impossible for the apps to talk to each other or share analytical information. According to the previously cited Forrester survey, 25% of firms use 10 or more BI platforms.
The problem is that splitting data prep, business analytics and data science among different tools hampers productivity and increases the time spent switching and translating between platforms.
Certain business areas work best when leaders allow their departments to choose an individual approach. Analytics is not one of those. Leaders and decision-makers need to trust their data, but trust is eroded every time data passes through another set of tools on the journey to actionable insights. The process inevitably results in data conflict and opacity. Consistency and understanding are critical.
Myth 4: Chasing the AI dream distracts us from the day-to-day realities of doing business
Many technologies, including BI tools, claim to be AI-driven. The promise is to replace human labor with unerring machine-learning efficiency; the reality is more often disappointing. Therefore, many businesses have abandoned the idea of using AI in their day-to-day analytics workflow.
Technology professionals can be understandably cynical about the real-world use cases for widespread AI in the enterprise. People still find themselves manually structuring and analyzing their data, extracting insights, and making the right decisions — all from scratch. The idiosyncrasies and decision-making processes of the human mind are challenging, if not impossible, to synthesize.
The trick to making AI a functional, effective tool in analytics is to apply it to everyday business challenges rather than walling it off from them. Knowing exactly which AI-driven capabilities you need is vital. AI may be intelligent but, like any tool, it needs direction and a steady hand to deliver value. Automating the routine frees humans to apply intuition, judgment and experience to decision-making. There’s no need to fear a robot uprising.
Myth 5: To get the most out of our data, we need an army of data scientists
Huge demand is building in the industry for the ability to distill vast amounts of disparate data into actionable insights. But company leadership still believes it needs to hire trained interpreters to dissect the hundreds of billions of rows of data that larger organizations produce.
Processing, modeling, analyzing and extracting insights from data are in-demand skills. As a result, the services of data scientists with specific and intensive training in these areas come at a premium.
But while data scientists add value, there comes a point of diminishing returns — and they are no longer the only ones who can perform data science. A new generation of business workers has entered the workforce, expected to assess and manipulate data on a day-to-day basis.
High-pedigree data scientists, in some cases, aren’t necessary hires when non-technical business users have governed, self-service access to augmented analytics and decision intelligence platforms. These users have invaluable domain knowledge and understanding of the decision-making chain within their business. What’s needed to make their jobs easier is a solid foundation of data and analytics capabilities that traditional BI tools often struggle to provide.
Value propositions and broken promises
The current analytics and BI landscape has made it obvious to business leaders that certain natural limits are imposed on their data and analytics efforts. While still useful for specific use cases, traditional tools are applied in loose combinations, varying between one department and the next. The frustration this causes — the inefficiency and the potential time savings that are lost — is a direct result of the gaps in current BI capabilities.
Traditional BI is preventing firms from making the best use of their data. This much is evident: enterprise-scale businesses generate vast amounts of data in various formats and use it for a wide range of purposes. Confusion is inevitable when the method of data collection and analysis is, itself, confused.
Something more comprehensive is needed. Companies lack faith in AI-driven processes because legacy BI tools cannot deliver on their promises. They lack faith in democratized access to data because their departments don’t speak the same analytics language. And they lack faith in their data because in-memory engines aren’t scaling to the degree they need, leaving them with incomplete — and therefore, unreliable — data.
Data and analytics innovation is how businesses deliver value in the era of digital transformation. But, to innovate, you need to know that your barriers are breakable.
Omri Kohl is cofounder and CEO of Pyramid Analytics.