January 19th, 2010 by Mark Rittman
Those of you who keep an eye on this blog, or have been in the BI and OLAP industry for many years, will of course know of Nigel Pendse. Nigel was behind the original “OLAP Report”, which offered a vendor-neutral view of the OLAP market and ran for as long as I’ve been in the BI industry, and more recently he started working with BARC (the Business Application Research Center) to help organize and produce the annual BI Survey, a survey of customers and implementors working with the complete range of BI, OLAP and PM technologies.
Recently, Nigel and Mark Handford contacted me to let me know that both the BI Survey and the OLAP Report have now been brought together into a new publication called The BI Verdict, which takes their existing content and extends it with additional coverage. Having regularly read and consulted the OLAP Report over the past ten years, and helped publicise the BI Survey, I’d thoroughly recommend The BI Verdict and urge you to take a look, particularly if you are looking to start a new project or make a tool selection.
Anyway, with all this happening, it reminded me that I’d been meaning for some time to approach Nigel for an interview for this blog. I’ve got a particular respect for Nigel as he always seems to champion the BI customer and isn’t afraid to call the various vendors on their strategy, and he’s been in the business long enough to have seen it all before and have some perspective on current developments. So here are my questions and Nigel’s answers, and I’d be interested to hear any feedback or comments on what’s been said. Thanks again to Nigel and to Mark Handford for making this possible:
[Mark Rittman] : “There’s been a lot of innovation in the BI tools market over the past few years, plus a large amount of consolidation. What do you see as the major trends and opportunities in the BI tools market at the moment?”
[Nigel Pendse] : “Actually, I’m disappointed by the level of innovation in the BI tools market, even before the industry consolidation, which will certainly make it even worse.
Many of the claimed ‘innovations’ are actually rehashed versions of ideas that have been around for ages. For example, in-memory was the original BI architecture from more than 40 years ago (with APL), and column-oriented databases are almost as old. Dashboards, too, date back more than 25 years, to the brief EIS wave.
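The column-oriented idea Nigel mentions is easy to illustrate. Here is a toy sketch in Python (invented data, not based on any particular product’s implementation) showing why laying data out by column rather than by row suits analytical queries:

```python
# Toy illustration of row-oriented vs column-oriented storage
# (hypothetical data, not any particular product's implementation).

rows = [
    {"region": "North", "product": "Widget", "sales": 120},
    {"region": "South", "product": "Widget", "sales": 95},
    {"region": "North", "product": "Gadget", "sales": 210},
]

# Column-oriented layout: one contiguous list per attribute.
columns = {
    "region": [r["region"] for r in rows],
    "product": [r["product"] for r in rows],
    "sales": [r["sales"] for r in rows],
}

# An aggregate over one attribute touches only that attribute's data,
# which is why column stores suit scan-heavy analytical (BI) queries.
total_sales = sum(columns["sales"])
print(total_sales)  # 425
```

The row layout has to read every field of every record to answer the same query; the column layout reads only the `sales` list.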
But there’s certainly been a lot of consolidation, which is probably bad news for customers, as they’ll be paying higher prices for products that don’t progress as much as they would otherwise have done (or which may be discontinued altogether). The new owners of the BI products are much more interested in integration between their many products than in innovation in non-core products.
The products are also likely to get more technical to install and implement, which may be good for consultants, but not for users.
I suppose the best opportunities will come for smaller, creative new vendors who take advantage of the likely mismanagement of the products acquired by large vendors. Many of the BI products now owned by the large non-BI vendors will fail to move forward or will not be promoted aggressively, thus leaving gaps in the market for nimbler vendors.”
[MR] : “There has been a lot of talk around in-memory analysis, desktop analysis and column-store analysis over the past two or three years, particularly with the release of products such as Microsoft PowerPivot and Qliktech’s Qlikview. What is your opinion of these products, how much impact will they have on the market, and are they solutions suitable for enterprise customers?”
[NP] : “Of course, QlikView is hardly a new product – it’s been around since the mid 1990s. The product sells well because it’s aggressively marketed, easy to use (at least for simple applications) and very fast. But it’s less successful for large, complex apps, and organisations that try to use it for such apps are less likely to be happy.
Of course, QlikTech will undoubtedly beef up the product’s functionality, but it has to be careful not to do this at the expense of ease of use. For example, products like Essbase and Microsoft OLAP Services (renamed to Analysis Services in 2000) started out as being aimed at business users. Now, both are far more capable and scalable, but also a lot more complex, and no business users could even think of developing new apps themselves (again, good news for consultants!).
PowerPivot is interesting. On the one hand, it’s just a way of making Excel 2010 bigger and faster, but not functionally richer. In this guise, I’m not sure how big the uptake will be, even if it’s free – after all, just how many people regularly need to download and analyse tens of millions of rows of data on their desktops? I think this may be more of a way for Microsoft to persuade its customers to upgrade to Office 2010 much earlier than they might otherwise have done, and for some of them to then use the latest versions of Microsoft server products like SharePoint and SQL Server.
But PowerPivot uses the very impressive new in-memory multidimensional VertiPaq engine, which will also feature in various Microsoft server products, such as SharePoint and Analysis Services (in fact, VertiPaq was developed by the Analysis Services team, and it will be an important new engine for Analysis Services). In this guise, it will certainly attract customers to these server products, by being easy to deploy, very scalable and extremely fast (but it currently can’t handle the more complex cubes that the older Analysis Services engines manage).”
[MR] : “Oracle have made big investments in business intelligence and enterprise performance management in the last five years. What do you think of Oracle’s strategy in the market, and how does it compare to IBM and SAP, two other large consolidators in the market? Do you see OBIEE having an impact in the market, and has the Oracle takeover of Essbase affected take-up and usage of the tool?”
[NP] : “Oracle has acquired many BI and PM products (partly as a by-product of other non-BI acquisitions, such as Siebel), rather than investing in building successful new products of its own. As part of this process, Oracle has inherited Siebel’s philosophy of providing pre-configured BI applications for its many OLTP applications.
This is probably a smart strategy, as customers who have invested many millions in shiny new transaction apps are probably not averse to spending a smaller amount to have better reporting and analysis of their OLTP data. OBIEE obviously plays a big part in this strategy. However, I don’t think OBIEE will have any impact in the larger (ie, non-Oracle) BI market.
SAP may attempt a similar strategy, but is in a weaker position to do so, as Business Objects did not bring with it a set of successful analytical applications that plugged straight into OLTP apps. And this isn’t even an option for IBM, which is not in the ERP business. To a lesser extent, Microsoft already does this with its Dynamics range.
Essbase sales into the Oracle base will certainly rise, but will probably largely disappear elsewhere as the product’s development tapers down. One irony is that Essbase had some very good dedicated OLAP front-ends when Hyperion was independent, but not for much longer. This means Essbase users will be expected to access it from OBIEE, which is not at all optimised for Essbase’s rich multidimensional structure. And this certainly won’t be an attractive option for non-Oracle sites.”
[MR] : “A recent article on The BI Verdict website mentioned a possible “OLAP Revival” following recent initiatives from the big vendors. Why do you think this is the case, and don’t tools such as QlikView and PowerPivot negate the need for big investments in OLAP technologies?”
[NP] : “I didn’t write it, but it was a commentary on the increased investment being made in OLAP by all the major vendors, which remains true. For example, PowerPivot is the first deliverable from Analysis Services’ new VertiPaq engine. And one could argue that QlikView is a simple OLAP tool that fulfils exactly the same role today that Cognos PowerPlay did in the mid 1990s (ie, simple, business-oriented, in-memory analysis of multidimensional data).
IBM is also promoting TM1 much more actively than its previous owners did, and it’s to be the primary engine in Cognos Planning (which has always had its own proprietary engines until now).”
[MR] : “From your work with the BI Survey and BI Verdict, what do customers tell you are the critical success factors, and most common cause of failure, on business intelligence and OLAP projects?”
[NP] : “You probably expected me to cite the usual problems: lack of executive sponsorship and data access/quality. But, perhaps surprisingly, slow query performance is actually the most frequently reported problem. Also, organisations that choose BI products taking into account their performance derive more business benefits than those who choose on the basis of vendor factors, like corporate standards.
This should be a salutary warning when people get over-excited about BI in the Cloud – that’s probably just about the worst way possible to deliver fast query performance.
Another key factor is involving business users in the actual implementation of BI and PM projects (ie, directly involved, not just sponsoring them). Projects that include business users in the team are notably more successful than those implemented just by IT people. And specialist BI consulting firms are more successful than either in-house IT or other types of non-specialist consultants.”
[MR] : “According to your Wikipedia entry, you’ve been working in the BI and OLAP industries since 1973. What do you think has been the most important innovation in the industry whilst you’ve been working in it, and what innovation do you think is over the horizon that will have the most impact on BI customers and developers?”
[NP] : “Not for the first time, Wikipedia is misleading on this. Back in 1973, I was a graduate trainee in an engineering company, and not using any BI tools. That came a couple of years later, and I didn’t actually join what we’d now call the BI industry until 1977.
Although I wasn’t using Express, the tools I was using back then had similar UIs (ie, a noisy teletype), and were aimed at OR specialists, rather than what we’d now think of as business users. Ironically, it means that modern tools are in some ways less sophisticated than their predecessors were 30+ years ago.
For example, FCS, the 1970s predecessor of Essbase, included Monte Carlo analysis, but Hyperion had to acquire a separate product (Crystal Ball) to get this capability into its 21st century product line. I’m also pretty sure that Express in the 1970s had statistical features not available in today’s Oracle OLAP Option and Essbase.
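The Monte Carlo analysis Nigel refers to is simple to sketch. Below is a minimal, hypothetical what-if model in Python (all figures invented): it estimates the spread of profit when one input, unit cost, is uncertain:

```python
# Minimal Monte Carlo sketch of a what-if planning model.
# All figures are invented for illustration.
import random

random.seed(42)  # fixed seed so the run is repeatable

def simulate_profit(n_trials=10_000):
    price, volume = 25.0, 1_000
    profits = []
    for _ in range(n_trials):
        # Unit cost is the uncertain input, drawn from a normal distribution.
        unit_cost = random.gauss(mu=18.0, sigma=2.0)
        profits.append((price - unit_cost) * volume)
    return profits

profits = simulate_profit()
mean_profit = sum(profits) / len(profits)
# The mean should land near (25 - 18) * 1000 = 7000, with the
# sampled spread showing how sensitive profit is to cost uncertainty.
```

Instead of a single point estimate, the planner gets a distribution of outcomes, which is exactly the kind of capability Nigel says the 1970s tools already had built in.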
One thing that’s gone full circle is the architecture: in the 1970s, BI software was typically accessed using what we’d now call cloud-based, multi-tenant SaaS, using thin clients — in other words, dumb terminals connecting to a timesharing network’s remote mainframes (often in another country). You didn’t buy or rent the software, but just paid for usage. And it worked, so I don’t think anyone can claim that recent developments are ‘innovative’.
For example, I was responsible for a successful oil taxation model called Petrofisc some 30 years ago. This model was created using open source, collaborative techniques, and was widely used by oil companies planning their North Sea investments. It ran on a SaaS thin-client architecture and was even used simultaneously by competing oil companies negotiating against each other.
Of course, today’s BI software is far easier to use. It has a decent GUI (at least for the end users – techies usually still have to write code), even if it doesn’t do much more than its ancestors. Many things are now done automatically (and so they should be, given the huge increase in computer power with relatively modest performance improvements), and of course the volumes of data and numbers of users are much larger.
Getting hold of that data is now much easier, thanks to the move from very proprietary (often home-grown) transaction apps to modern ERP apps running on relational databases. Standard ETL tools and data warehouses mean that modern BI products don’t need to include such features themselves, as older products like Express once had to do.
However, many apparently promising innovations haven’t taken off, including:
- Advanced visualisation – for many years, it has been expected that innovative new ways of visualising business data would be the Next Big Thing, but most have failed to catch on. Instead, we have stupid gimmicks like animated speedometers and artistic 3D data presentations that just obscure important information. But we do finally seem to be getting some genuinely useful techniques, like sparklines, micro charts and bullet charts in mainstream products now. Needless to say, I made heavy use of these visualisation options in The BI Survey 8, with not a 3D chart or speedo in sight.
- Expert systems – these were all the rage in the 1980s and early 1990s, and it was hoped that they would help managers interpret the results from their BI applications, and suggest actions.
- Self-learning systems – again, this seemed a good idea 25 years ago, but it failed to take off. The idea was that BI applications would learn what metrics you spent most time analysing, what you then did next and how you liked your data presented. They would change the way they worked accordingly (much like modern automatic gearboxes adapt to your driving style). Thus, your ad hoc, interactive browsing habits would form the basis for automatic, adaptive dashboards and exception reports.
- Integrated data mining for end users – in the 1990s, BI vendors enthusiastically promoted simplified data mining for business users. But this was a big flop, as no-one but statistical specialists felt confident with such techniques.
- Integration of non-numeric information with BI – this has always seemed like an overdue idea, but it’s still not deployed widely.
- BI for mobile devices – OK, this hasn’t been a complete flop, but it turns out that BI users on the move would probably rather use a laptop than a smart phone or PDA for viewing BI reports and alerts.
- BI extranets – greedy BI vendors saw this as a brilliant way of selling even more licences to customers who already had all the licences they could possibly use for their own staff. But it seems that most companies don’t feel any great need to expose their internal BI information to outsiders. Nor is it clear who should pay for a BI extranet. So, yes, there are indeed a few BI extranets, but they are rare.
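The self-learning idea in the list above can be sketched in a few lines: track which metrics a user actually looks at, and reorder the dashboard accordingly. This is a toy illustration with invented metric names, not any real product’s behaviour:

```python
# Toy sketch of the 'self-learning' idea: rank dashboard metrics
# by how often the user has viewed them, so the dashboard adapts
# to browsing habits. Metric names are hypothetical.
from collections import Counter

views = Counter()

def record_view(metric):
    """Called each time the user opens a metric."""
    views[metric] += 1

def dashboard_order(metrics):
    # Most-viewed metrics first; ties keep their input order
    # because Python's sort is stable.
    return sorted(metrics, key=lambda m: -views[m])

# Simulate a browsing session.
for m in ["margin", "revenue", "revenue", "churn", "revenue", "margin"]:
    record_view(m)

print(dashboard_order(["churn", "margin", "revenue"]))
# ['revenue', 'margin', 'churn']
```

A real adaptive system would also weight recency and the actions taken after each view, but even this trivial counter captures the core idea of turning ad hoc browsing habits into an automatic layout.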
And what innovations are just over the horizon? By definition, I haven’t a clue, but the chances are that they will prove to be yet more recycled old ideas.”
Thanks, Nigel, and you can find more information from Nigel (including some free analysis on trends in the BI market) at The BI Verdict website.