Friday, June 11, 2010

Trip Report: Microsoft's Business Intelligence Conference

Hello Folks!

I'm just back from Microsoft's BI Conference in New Orleans. It was great to be back in town after so many years.

I have to say that I really enjoyed the event, which was actually held within the larger TechEd conference. In general, it was well organized and run, with few hitches: a very good show floor - with more than a few Microsoft competitors present - pretty good food, and excellent WiFi. While there, I had some great discussions with Microsoft folks, customers, partners and fellow analysts.

Not sure what the official attendance was. However, my guess is about 10,000 attendees for TechEd, who were allowed to attend BI sessions (though not vice versa). To help gauge the BI turnout: the BI keynote was held in a very large auditorium and probably had 1,000 people. Breakout sessions had overflow rooms. My own presentation on performance-directed culture was oversubscribed and filled both the main and overflow rooms. And, to my surprise, there were a good number of senior IT folks (some CIOs) as well as Finance and other business professionals there.

Product Stuff:

From a product perspective, there's certainly a lot to talk about. At the highest level, Microsoft's strategy for BI revolves around Office, SharePoint & SQL Server. Under each, there are many components and options.

For instance, SQL Server includes the core relational database, Reporting Services, Analysis Services (OLAP), Integration Services (ETL) and data mining. Microsoft is also readying the release of its data warehouse appliance, based on the DATAllegro technology it acquired in 2008.

Office now features a new Excel add-in called PowerPivot, which provides an in-memory, columnar data engine supporting extremely large data models, rapid data loading and advanced compression. There's more coming, too, including a new Silverlight-based visualization tool on top of PowerPivot.
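To make the columnar idea concrete, here is a toy Python sketch of the dictionary encoding that column stores of this kind typically use. It illustrates the general technique only - not Microsoft's actual engine - and every name in it is invented for the example:

```python
# Toy dictionary-encoded column store. Low-cardinality columns compress well
# because each distinct value is stored once, with one small integer per row.
class Column:
    def __init__(self, values):
        self.dictionary = sorted(set(values))               # distinct values, stored once
        index = {v: i for i, v in enumerate(self.dictionary)}
        self.codes = [index[v] for v in values]             # one small code per row

    def filter(self, predicate):
        """Return matching row ids, testing each *distinct* value only once."""
        matches = {i for i, v in enumerate(self.dictionary) if predicate(v)}
        return [row for row, code in enumerate(self.codes) if code in matches]

# A million-row "region" column holds only four distinct strings:
region = Column(["East", "West", "East", "South"] * 250_000)
rows = region.filter(lambda v: v == "East")
print(len(region.dictionary), "distinct values;", len(rows), "matching rows")
```

That same property - few distinct values repeated across many rows - is what lets engines of this kind hold "extremely large" models in memory.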

In addition to its core function as a collaboration engine, SharePoint Services includes the remnants of PerformancePoint (e.g., dashboards and scorecards) and the ability to store, index, and share PowerPivot models.

What does this all mean?

While Microsoft is late to the game with many of these new capabilities, its arrival helps to make them "mainstream". For instance, in-memory, columnar data structures have been around for at least a couple of decades (e.g., TM/1, Dimensional Insight, QlikView). Likewise, data warehouse appliances, offering high performance through parallel loading and querying, have been around for a long while (e.g., Teradata, Netezza), and so on. So, this is a good thing in that it helps to make it "safe" for IT to begin to embrace "new" approaches to BI that would heretofore have been anathema. This is not unlike the role that Microsoft played with the release of Analysis Services over 10 years ago, which made OLAP (then roughly 20 years old) "safe".
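To make the appliance idea concrete, here is a toy sketch of the partition, scan-in-parallel, merge pattern that MPP appliances implement at hardware scale. Real appliances (Teradata, Netezza, DATAllegro) are vastly more sophisticated; the data and node count here are invented:

```python
# Toy MPP-style aggregation: partition rows across "nodes", let each node scan
# its own partition independently, then merge the partial results.
from concurrent.futures import ProcessPoolExecutor

def scan_partition(rows):
    # Each node computes a partial aggregate over its own slice of the data.
    return sum(revenue for region, revenue in rows if region == "East")

if __name__ == "__main__":
    data = [("East", 10), ("West", 5)] * 500_000
    nodes = 4
    partitions = [data[i::nodes] for i in range(nodes)]    # round-robin distribution
    with ProcessPoolExecutor(max_workers=nodes) as pool:
        partials = pool.map(scan_partition, partitions)    # parallel scans
    print(sum(partials))                                   # merge step
```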

What's the apparent strategy behind it all?

It seems to me that Microsoft may be trying to be "all things to all people". Each one of its offerings could well stand by itself - yet they apply to different audiences. In a sense it represents a “battle” between “bottoms up” and “top down”.

By bottoms up I mean user-driven solutions, which oftentimes operate without the knowledge or management of the IT department. These users abhor control and limits. They want to be free to explore.

Top down is the polar opposite of this: IT (or a COE, a center of excellence) designs, delivers and maintains solutions for users.

Typically, BI vendors cater to one or the other segment, but not both. In my recent Wisdom of Crowds BI Market Study™, I identified three categories of BI vendors: Titans, Established Pureplays and Emerging Vendors. Most Titans and Pureplays support a top-down model, while Emerging Vendors align most closely with the bottoms-up model. In this sense, Microsoft is unique in that it has viable solutions for both. This also presents it with a problem.

Allow me to explain:

The bottoms-up phenomenon is best characterized by usage of Microsoft’s Excel. While we may malign Excel for the chaos that it encourages - we also love it for the freedom it gives to us. Microsoft, in my opinion, is the proverbial poster-child for bottoms up! And, Microsoft Office with PowerPivot takes it to the next level. In fact, users may not even need a warehouse or multi-dimensional cubes anymore. They can quickly load large sets of operational data right into PowerPivot and start slicing, dicing and analyzing. It's also worth noting that PowerPivot (IMHO) and the advanced Silverlight viewer were probably the two technologies showcased at the event that generated the most excitement. And, given Microsoft's dominion over the spreadsheet market, it's likely to be a high-volume selling product.
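To make that workflow concrete, here is a rough analogue in Python/pandas (not PowerPivot itself) of pulling a flat operational extract straight into memory and pivoting it ad hoc. The file and column names are hypothetical:

```python
# Load a flat operational extract directly into memory - no warehouse,
# no pre-built cubes - and start slicing and dicing.
import pandas as pd

orders = pd.read_csv("operational_orders_extract.csv",
                     parse_dates=["order_date"])

# Slice: one region, one year.
east = orders[(orders["region"] == "East") &
              (orders["order_date"].dt.year == 2010)]

# Dice: ad hoc pivot of revenue by product line and month.
east = east.assign(month=east["order_date"].dt.month)
summary = east.pivot_table(index="product_line", columns="month",
                           values="revenue", aggfunc="sum")
print(summary)
```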

In contrast, SQL Server and SharePoint appeal more to the IT department. They are robust yet complex products, requiring IT technical skills for design, implementation and support. They further the IT department's mission to standardize models and enforce policy surrounding user access and interactions. In this scenario, dashboards and scorecards are designed and built using PerformancePoint and Analysis Services and are delivered broadly (and consistently) to management and users across an enterprise via SharePoint Services.

Both are valid models. However, the question in my mind is: would all of these classes of products work well together in organizations where users and IT are not singing from the same hymnal? Probably not!

I can imagine a scenario where:

- IT creates a relational data warehouse for users and generates some standard cubes in Analysis Services.

- To drive adoption, IT follows conventional wisdom, and gives users “free rein” to create and share objects in SharePoint.

- Users download data from SQL or Analysis Services and create extensive PowerPivot models and then share them via SharePoint - where they are extended, modified and enhanced - enterprise-wide (much more efficient than email or simple file sharing).

- These end-user models become the primary data source for analysis and decision-making (not SQL or Analysis Services)!

- So, what has given Excel a bad reputation can now be done on a far grander scale with Excel and PowerPivot!

This is not to say that Microsoft ought to abandon its bottoms-up or top-down BI product lines. However, it must recognize its unique position in the market and deal with the fact that users and IT buy BI products for different reasons and in different ways. Trying to position the entire BI product stack to both constituencies simply doesn't reflect the reality of the situation.

In the end, Microsoft needs to figure out who its customer is for each offering - user or IT - and then sell to that one, not both. Sales and marketing messages should be bifurcated into user and IT tracks, and not try to sell the benefits of one to the other.

I'll be digging into some key market trends and doing a deep dive into vendor and product rankings - including Microsoft's - at my upcoming Wisdom of Crowds BI Market Study™ Webinar on June 22nd. Hope to see you there!

And, of course, your comments are always welcome!

Best,

Howard


3 comments:

  1. You must have missed the second "benefit" of the PowerPivot strategy. Yes, users are perfectly able to use PowerPivot to "dump" data out of SQL or wherever else and go their own way. However - as soon as they upload those (huge) spreadsheets to SharePoint, the "server side" of PowerPivot takes over. Users can choose to download the whole spreadsheet, as they have in the past, for "local" analysis - but they can also simply "use" the spreadsheet remotely, with the analytics engine running on the SharePoint server rather than in the end user's Excel. In fact, the end user doesn't even need Excel.
    When (not if) such a spreadsheet becomes the "life" of the business, it has traditionally gone completely under IT's radar - they find out about these things too late. With the server-side PowerPivot sheets, however, there's built-in monitoring of which sheets are being used most, the query load, and all the other nice things an IT department would like to know. IT can then choose (if it wishes) to "optimize" (take over) any one of those spreadsheets - to pull the data into the formal warehouse, to rearchitect it with "professional" techniques - whatever.
    It definitely seems like they've found a way to make the best of both worlds - allow end users freedom, but have that freedom show up on IT's radar.

  2. Hi Howard

    Working in the Microsoft space, across both of these worlds, I have found that the best adoption practices revolve around a mixture of self-service and governed solutions.

    Allowing users free rein to create their own solutions, and later bringing those solutions into the managed fold, is the approach we find works best in mature organisations. However, it is an ongoing process: monitoring the usage of user-created reports, models, etc. is key, as is the creation of governance forums in which both users and IT are involved.
    As a rule of thumb, to be tuned per organisation, users' BI content is managed via the following rules:
    - Created (or modified) within the last 2 months: ignored.
    - Created/modified more than 2 months ago and last accessed more than 2 months ago: an email proposing a versioning strategy (typically archive, or move to quarterly/annual use) is sent to the creator. If no action is taken after 3 such emails, the content is archived.
    - Created/modified more than 2 months ago but accessed recently, frequently or by multiple users (the limits for these rules are very organisation-specific): flagged for the governance forum, which decides whether it needs to be brought within the ambit of IT and managed under an SLA.
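    For illustration, those rules reduce to a small decision procedure. Here is a minimal Python sketch of them - the two-month window and the three-email limit come straight from the rules above, while the field names and the multi-user threshold are hypothetical and would be tuned per organisation:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

STALE = timedelta(days=60)   # the "2 months" rule of thumb
MAX_EMAILS = 3               # reminders before archiving

@dataclass
class UserContent:
    name: str
    modified: datetime        # creation or last modification
    last_accessed: datetime
    distinct_users: int       # threshold below is organisation-specific
    emails_sent: int = 0

def triage(item: UserContent, now: datetime) -> str:
    """Apply the rule-of-thumb lifecycle rules to one piece of user content."""
    if now - item.modified < STALE:
        return "ignore"       # created/modified within the last 2 months
    if now - item.last_accessed >= STALE:
        if item.emails_sent >= MAX_EMAILS:
            return "archive"  # creator never acted on the reminders
        item.emails_sent += 1
        return "email creator about a versioning strategy"
    if item.distinct_users > 1:
        return "flag for governance forum (candidate for IT ownership/SLA)"
    return "ignore"           # recently used, but by its creator alone
```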


    The process of combining these two worlds of bottom up and top down is challenging, but the tools are making it easier as they evolve - notably, the web analytics built into SharePoint 2010 mean that these monitoring capabilities come out of the box, rather than needing to be built.
