The Future of Business Intelligence Part 1: The Mangled 'Supply Chain of Analytics'
Why the 'supply chain of data' has reached its terminal endpoint, a history of how we got here, and a glimpse at what comes next.
As I write this, we inhabit an interregnum in our industry: the second great wave of enterprise business intelligence (BI) innovation has reached its end stage, and the third great wave has yet to take shape. Attempts to birth it around a set of natural language and AI-driven self-service features bundled together under the term ‘augmented analytics’ have thus far fallen flat, leaving BI practitioners stuck in the same dashboard and ad-hoc query loop for almost a decade. Meanwhile, the rest of the analytics industry rapidly advances as new tech proliferates and whole new job families are created. Something has got to change. In this first of four(ish) posts, I define the major waves of BI innovation, situate us on the cusp of the next great wave and reveal the governing metaphor for the future of BI. Buckle up!
Well… how did we get here? A history lesson
Before launching into what the next platform looks like, we need to cover a little history. This is more than just stage setting: forgotten elements of past BI paradigms are poised to make a comeback, and understanding how they arose and why they were (temporarily) discarded will be crucial to using them again without repeating the mistakes of the past.
Wave 1: Enterprise reporting
Wave 1 of business intelligence[1] emerged thanks to the web browser and corporate intranet in the early 2000s. Like most IT technologies of the era, the goal was to migrate a real-world process into the digital space largely as-is while relying on the scale and speed of the web to deliver value. For BI, this meant taking the form factor of the paper report - headers, footers and a table of numbers - and moving it to your computer, available on demand or via scheduled email distribution. This alone was revolutionary.
A curated library of insights
The key advancements of Wave 1 allowed a centralized team of skilled data professionals to build a rock-solid, large-scale report distribution system that served tens of thousands of people with curated and accurate metrics. Among them:
Centralized metrics modeling: The ability for the IT team to define a single metadata model that provided metric consistency across potentially tens of thousands of reports.
Variable prompts: The ability for the end user to provide variable values at runtime via a series of cascading UI elements like pick lists, radio buttons and date selectors. Coming from a world of paper reports this was pure magic.
Bursting: The automatic, mass-scale generation and distribution of parameter-driven, personalized reports as PDFs via email (see the sketch after this list).
Server-based architecture: When managed by a skilled administrator, these systems provided a consistent experience for all users with high uptime, and enabled centralized processing with vastly more power than was available on a user’s machine.
Centralized security: Because everyone accessed the BI system via the browser with no desktop component, security was applied consistently to all users. Security was done at the folder, report, object, table or field level. Turtles all the way down.
Observability and audit: The system kept a comprehensive record of who logged in and what they did in each session, as well as capturing the identity of burst report recipients.
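To make bursting concrete, here is a minimal sketch in Python of what such a job does conceptually. Every name in it - render_pdf, send_email, the region parameter - is a hypothetical stand-in for the BI server’s internals, not any specific vendor’s API:

```python
# A minimal, hypothetical sketch of a Wave 1 bursting job: one
# parameterized report definition, rendered and emailed per recipient.
from dataclasses import dataclass

@dataclass
class Recipient:
    email: str
    region: str  # the parameter value that personalizes this recipient's copy

def render_pdf(report_id: str, params: dict) -> bytes:
    # Stand-in for the server's report engine: run the report
    # definition with the given parameter values, return PDF bytes.
    return f"%PDF {report_id} {params}".encode()

def send_email(to: str, subject: str, attachment: bytes) -> None:
    # Stand-in for the server's mail integration.
    print(f"emailed {len(attachment)} bytes to {to}: {subject}")

def burst(report_id: str, recipients: list[Recipient]) -> None:
    for r in recipients:
        pdf = render_pdf(report_id, params={"region": r.region})
        send_email(r.email, f"Weekly sales - {r.region}", pdf)
        # Wave 1 systems also logged every delivery for audit.
        print(f"audit: {report_id} -> {r.email} ({r.region})")

burst("weekly_sales", [
    Recipient("ann@example.com", "EMEA"),
    Recipient("raj@example.com", "APAC"),
])
```

The real systems layered scheduling, retries and security on top of this loop, but the core idea really is this simple: one report definition, thousands of personalized outputs.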
This all sounds great, and it was! As someone who built these systems, I can assure you they continue to deliver exactly what I describe above at a level of sophistication that is sometimes shocking to people in the startup space today, where this suite of capabilities would require five products and a lot of custom code. However, big changes were afoot in the early 2010s that would quickly brand this way of working with the dreaded ‘legacy’ epithet.
The biggest was a generalized move away from attempts to mimic the pre-digital world and toward what we’d now call ‘digital-native experiences.’ In the case of BI, this meant abandoning the ‘report’ form factor, which mimics a sheet of paper, and adopting the ‘dashboard’, which is native to the computer screen. This is also where you see the move from tables of numbers to visualizations.
Wave 1 BI was powerful, but it was slow. Everything ran through the IT data team as the developer tooling was too complex for business people to use for even the most rudimentary purpose. Simple requests like updating a calculation would sometimes take months or even years as they languished in a ticket backlog.
Wave 1 BI was great at answering well-defined, agreed-upon questions and distributing those answers to a huge audience. But when it came to speculative data exploration, or situations where you needed data to make a decision now, this approach simply didn’t work.
This is, of course, where Wave 2 enters the scene.
Wave 2: Desktop data discovery
Wave 2 of business intelligence came hard and fast in the form of a plucky visualization tool called Tableau[2] that a reasonably data-savvy business person could install on their desktop and, in a few hours or days, use to turn out a great-looking dashboard. It was everything Wave 1 was not - highly visual, decentralized, easy to use and built for a digital-first world. This way of working ate old-school BI’s lunch over the 2010s. Let’s explore why.
Power to the business people
There are many important software features that help explain the rise of Wave 2 BI, but nothing is as important as the culture shift it kicked off in how we build analytics - decentralization and business control. This enabled two key things:
An end run around IT: Tableau was easy to use compared to Wave 1 tools and could be procured as a desktop tool with just a few licenses. This meant a department or even a single individual could start churning out analysis without IT involvement, skipping that interminable ticket queue entirely.
Analytics by the analysts: By moving the BI function to data analysts, Tableau enabled the people who actually understood the data to explore and report on it in an intuitive, visual way. This led to a huge productivity increase compared to Wave 1, which split this function between business and technical experts.
These two points are critical in understanding why Wave 2 was so successful. I would often tell IBM Cognos departments quite bluntly, ‘Tableau’s number one feature is that your users don’t have to call you to use data in their jobs.’ People didn’t like hearing that, but it was true.
As for the specific features of Wave 2, the critical ones are:
Visual and digital first: Wave 2 tools are built for visual data exploration and dashboard-style reporting on computer screens. They aren’t trying to mimic anything in the real world[3].
Way better graphics: There was just no comparison between Wave 1 and Wave 2 when it came to the looks department. Tableau is really pretty.
Analytics at the edge: Building a desktop tool in the late 2000s was a bold thing to do. The ethos of the era was to push as much compute as possible to a centralized server and rely on an IT team to manage it. Wave 2 could be productively used and managed on a single analyst laptop.
Workbooks: Wave 2 took the workbook format of Excel as its organizing principle, which was already the default mode of content management for most analysts.
Wave 2’s apotheosis came in the form of Power BI, which took the Tableau formula and made it faster, easier and - most importantly - cheaper. All of this adds up to an easy-to-acquire, easy-to-use tool that requires little to no IT oversight and, in the hands of a smart analyst, can deliver great-looking visuals quickly. This model was and is extremely successful at churning out data content. But as we’ll see, it has run into significant challenges now that it’s become the ubiquitous, default mode of BI.
The data supply chain theory of value is collapsing
The most commonly held purpose of Business Intelligence is to deliver timely insights to decision makers, and the most popular metaphor for this process is that of the supply chain. Source systems contain raw materials; data engineers refine these raw materials into usable components and store them in a warehouse; data analysts assemble the components into insights and deliver them to decision makers, who then do with them what they will.
This is data as logistics. It starts with raw materials mined from the digital records of real-world events and flows in one direction to the decision maker. The job of the analytics practice is to make this flow as smooth and timely as possible. In this world, empowering the analyst with great tools generates tremendous value because they are close to the decision, understand the data and can manage ‘the last mile’ of ensuring each insight is accurate and arrives at its destination in time to make a decision. What they require, then, is maximum autonomy to manage the last mile independent of upstream processes. This is Wave 2’s theory of BI value. Get the right box onto the right porch at the right time and do it as efficiently as possible.
Wave 2 BI tools are the world’s most efficient insight delivery truck.
If the metaphor is accurate - and I think it does describe the status quo nicely - it also comes with the same issues as the modern-day supply chain. The evidence is everywhere:
Warehouses full of unused, unwanted data products that nobody asked for - or worse, that somebody did ask for but turned out not to need at all
Brittle pipelines that fail when the system dynamics change
Dozens or hundreds of dashboards or tables that show slightly different slices of the exact same thing
Conflicting records or metrics with no easy ability to discern which is accurate
Extreme duplication of work as data content or metrics are impossible to find and thus recreated over and over
So why is Wave 2’s theory of value collapsing? We’ve squeezed the lemon dry: there is no additional advantage to be wrung from delivering hopefully accurate, probably necessary, usually pretty dashboards to decision makers as quickly as possible. This did create tremendous value, but the business world has adjusted to it and is now asking ‘what’s next?’ - just as you and I have adjusted to the novelty of Amazon delivering any cheaply made product to our doorstep in two days. The advantages of the system have been fully metabolized, and its disadvantages and contradictions are becoming more and more clear. Wave 3 is on the horizon.
Wave 3: New tools, yes. But also a new metaphor
At the beginning of this post I used a word I’m told is unfamiliar to people who didn’t study Shakespeare[4] or medieval history in college: interregnum. This is the time between the death of the old king and the crowning of the new one. It’s a dangerous time, but also a time of wild possibility, where the old order is upended and the powerful vie to define the new one. The emerging BI regime will be defined by new tools, but also a new philosophy of delivering value with data. And hopefully a new metaphor for how we do it.
What are the signs of the interregnum? During the peak years of a technology wave, there is so much value to be gained simply by riding the wave that nobody challenges the prevailing order. Power BI - unquestionably the top BI tool in the world as I write this - emerged in the 2010s as a cheaper option to deliver the analytics supply chain. It challenges Tableau on price, but otherwise follows the Tableau playbook exactly.
As the wave nears its end, wild alternatives that attack specific weaknesses in the prevailing order start to emerge, as we see today. There has been a ton of interesting innovation in the BI space over the last few years, but the new entrants are all point solutions attempting to solve specific, narrow-scope problems. They do it elegantly and create important new ways to tell stories and collaborate with data, but they aren’t trying to upend the enterprise BI market, and it’s hard to see how they could evolve in that direction.
In fact, the last truly important large-scale BI tool was Looker, which was founded in 2012, during the transition period from Wave 1 to Wave 2, and has important elements of both built into it - combining enterprise metrics and modeling with an easy visualization layer. It’s been 11 years. It’s time for something new.
The value of a good metaphor
To successfully transform how we do business intelligence, we need more than just new technology. We need a new metaphor that puts in simple terms how the technology delivers value. The metaphor drives innovation by telling software vendors where and how to invest. It guides practitioners by telling them how to conceive of their data practice. It drives the culture of data teams. Until the post-supply-chain metaphor emerges, we’re sort of groping in the dark. I have some ideas about what it should be.
Where do we go from here?
So, things are changing for analytics practitioners. It’s both scary and exciting. As someone who has been through one sea change before, I look forward to greeting the future as it arrives. Here on Super Data Blog we’re going to explore what that looks like in upcoming blog posts, as this one is long enough already! You can expect:
Part 2: The new BI and Analytics metaphor: Not a supply chain, but a tree
Part 3: Evaluating the tools of the future - who is moving in the right direction?
Part 4: How is the data team going to change going forward?
Let me end by asking you - what are the biggest challenges in analytics and business intelligence right now, and what do you hope the future brings?
[1] There are, of course, important developments in BI that took place before the 2000s and had a huge impact on the industry long after. Most important among these is the OLAP cube, developed in the 1990s. However, OLAP was subsumed into enterprise reporting over the 2000s. And, frankly, it would take too much space to cover every stage of BI leading back to the decision support systems of the 1980s. So we’ll start here.
[2] I remember the first time I heard the word ‘Tableau’ outside the theater context - I was on a call with Gartner asking them how IBM Cognos stacked up against Microsoft’s BI offerings in the pre-Power BI days, and all the analyst could talk about was Tableau and what our plan was for ‘visual data discovery.’ This was when IBM and SAP were the clear Magic Quadrant leaders, so it struck me as quite odd and we pretty much wrote this feedback off - to our peril, it turned out.
[3] It is true that the original dashboards tried to mimic a plane’s cockpit or auto dashboard, with lots of circular gauges and garish indicators. This quickly fell by the wayside for the current dashboard paradigm of ‘lots of charts mashed together on a screen,’ which doesn’t mimic a real-world display.
[4] Fun fact: I was actually a playwriting and theater performance major in undergrad. I read a lot of Shakespeare.