PASS Summit Day 2

UPDATE – after you read this post, you should also read the follow-up here:
http://blog.crossjoin.co.uk/2010/11/14/pass-summit-day-2-the-aftermath/

The last few days have been quite emotional for me. I’ve gone from being very angry, to just feeling sad, to being angry again; I’m grateful to the many members of the SSAS dev team who’ve let me rant and rave at them for hours on end and who have patiently explained their strategy – it’s certainly helped me deal with things. So what’s happened to make me feel like this? I’ll tell you: while it’s not true to say that Analysis Services cubes as we know them today and MDX are dead, they have a terminal illness. I’d give them two, maybe three more releases before they’re properly dead, based on the roadmap that was announced yesterday. And this is incredibly hard for me to write because I’ve spent the majority of my working life, about 12 years now, working with them; I live and breathe these technologies; and I have built up a successful consulting business around them. Neither is it true to say that they are struggling in the marketplace: on the contrary they have gone from strength to strength even in spite of the fact that, apart from the important performance improvements in SSAS 2008, we haven’t had any substantial new functionality since SSAS 2005. SSAS has been the most popular OLAP tool on the market for years, has loads of very happy users, and continues to be used on new projects all the time. Hell, on stage the other day at the keynote there was a guy from Yahoo talking about his 12TB cube, which loaded 3.5 billion rows of data per day, and which he was planning to grow to 40TB! The SSD revolution has given SSAS cubes a massive boost. So this is one very successful product and no other company would be allowed to do what Microsoft is proposing to do with it because if they did customers would be up in arms, calling their account managers, and the account managers would go straight to the CEO and demand that the product was not only retained but given the development resources it deserves. 
But this is Microsoft we’re talking about, and they have the luxury of being able to ignore this kind of pressure from their customers and partners and do whatever they want. And they have quite convincing reasons for doing what they’re doing, albeit ones I’m having severe difficulty coming to terms with.

So let me get round to explaining in detail what was announced yesterday at the PASS Summit. Quite a few BI related things were aired that I won’t talk about in detail: the move to Visual Studio 2010 for all BI development, and the integration of SQL Management Studio functionality into VS2010 too; FileTable; the Master Data Services Excel addin; Data Quality Services; loads of new SSIS stuff including impact analysis and lineage; and there was yet more buzz on Project Crescent. But I’m going to concentrate on what came out in TK Anand’s presentation on the future of Analysis Services. Here are the main points:

  • The BISM – BI Semantic Model – is the name for the new type of Analysis Services database that gets created when you publish a PowerPivot model up to the server. It’s SSAS running in the special in-memory mode, and SSAS instances will either work in this mode or in regular MOLAP mode. In Denali we’ll be able to install a standalone instance of SSAS running in in-memory, BISM mode without needing Sharepoint around.
  • We’ll be able to create BISM models in BIDS, so we get full support for stuff like source control. The experience is very similar to what we get in PowerPivot today though; one of the points that was made again and again yesterday was that they wanted to make things as easy as possible for BI developers; the implication is that today the learning curve for SSAS is too steep, which is why many database people have been put off using it; I would argue that any rich, sophisticated tool is going to have a learning curve though and I bet nobody would dare to go to the C# community and tell them that C# is too intimidating, and wouldn’t it be nice if they had the friendliness and flexibility of VBA!
  • BISM models are the UDM 2.0. Everything that the UDM was meant to do in SSAS 2005, and didn’t, is a serious objective here: BISM aims to replace both traditional SSAS and SSRS report models, and to be good for the kind of low-level relational reporting that SSAS today simply can’t do, as well as the high-level, aggregated data it handles so well. Business Objects universes were mentioned several times as a very close comparison. Project Crescent will only work against BISM models.
  • BISM models will support MDX querying in some cases (see below) but DAX has grown to become a query language. We only had a brief look at it and basically it seems like you use a CalculateTable DAX function to return a tabular result set. You can also define query-scoped calculations just as you do with the WITH clause in MDX today. That’s a gross simplification, but you get the idea. DAX queries do not do any pivoting, so you only get measures on columns; it’s up to the client tool to do any pivoting. It was remarked that this made it much easier for SSRS to consume. SSRS couldn’t deal with multidimensional resultsets, and so instead of fixing this they made SSAS less multidimensional!
  • BISM models are massively scalable. They have no aggregations, there are no indexes to tweak, but they demoed instant querying on a 2 billion row fact table on a fairly average server, roughly the same spec that I see most people using for SSAS installations today. They’re achieving massive compression on the data, often anything up to 100 or more times. Of course all the data has to sit in memory after it’s been loaded but they’re going to support paging to disk if it won’t; we’ll also be able to partition tables in the BISM so we can control what gets loaded when. There will also be perspectives.
  • Miscellaneous PowerPivot functionality that was demoed included: a nice new window for creating KPIs easily; new DAX functions for flattening out parent/child hierarchies, similar to what the ‘Parent Child Naturaliser’ does today in BIDS Helper (plenty of people, including me, pointed out that this was not proper support for parent/child hierarchies); a new RANKX function for doing rank calculations; Distinct Count will be a native engine feature, and you’ll be able to have as many distinct count measures on a table as you want; drillthrough will also be supported.
  • There will be role-based security in BISM, where you can secure either tables, rows or columns.
  • BISM models will also be able to operate in ‘passthrough’ mode. This is essentially ROLAP done right, and a lot of work has gone on around this; in Denali it will only be available for SQL Server and only if you’re issuing DAX queries, not MDX. In the future other RDBMSs will be supported, plus possibly MDX querying of BISM when it’s in passthrough mode. Essentially in this scenario when you query the model your query is translated direct to SQL, and the results returned are (as far as possible) passed back to you directly with the minimum of interference. In some cases, for example where there are calculations, BISM will do some stuff with the resultset before it hands it over to you, but the aim is to push as much of the logic into the SQL query that it generates. If it works well, it sounds like at long last we’ll have a serious ‘realtime’ BI option, though I’m still not sure how well it will perform; I suppose if there are Vertipaq indexes inside SQL Server and/or if you’re using PDW, the performance should be good.
  • There are only going to be a few improvements for regular, MOLAP-based SSAS – four bullet points in TK’s presentation! They are: the 4GB string store limit has been fixed; we’ll get XEvents and better monitoring functionality; Powershell support; and there’ll be some performance, scalability and reliability improvements.
  • BISM will not really handle advanced calculations in Denali. Yes, you’ll be able to do cool stuff in DAX expressions, but you won’t get the equivalent of calculated members on non-measures dimensions (so no time utility dimensions) or MDX Script assignments. ‘Advanced business logic’ is on the roadmap for post-Denali, whatever that means exactly; the aim will be to support things like assignments, but not necessarily exactly what we have now. To me this is going to be one of the main reasons why people will not adopt BISM in Denali – most enterprise customers I see have pretty complex calculations.
  • Role-playing dimensions, translations, actions, writeback and a better API (AMO will still work for creating BISM objects in Denali, but it is going to be difficult to work with and an object model that’s more closely aligned to BISM concepts will be needed) are all planned for beyond Denali.
  • There are going to be some tools to help migration from SSAS cubes to BISM, but they won’t get you all the way. Some redesigning/rethinking is going to be needed, and it’s likely that some of the things you can do today with SSAS cubes you might never be able to do in the same way with BISM.
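To make the querying point above a bit more concrete, here’s a rough sketch of what a DAX query against a BISM model might look like, based on the brief demo described in the bullets. The Sales table and all column and measure names are hypothetical, and the exact syntax may well change before the final release:

```dax
-- Hypothetical Sales table; every name here is illustrative only.
-- A query-scoped calculation, analogous to MDX's WITH clause:
DEFINE
    MEASURE Sales[Total Amount] = SUM ( Sales[Amount] )
-- CalculateTable returns a flat, tabular result set; measures only
-- ever appear as columns, so any pivoting is left to the client tool:
EVALUATE
    CALCULATETABLE (
        SUMMARIZE ( Sales, Sales[Year], "Total Amount", [Total Amount] ),
        Sales[Year] >= 2009
    )
```

If this is anything like what ships, the result is just a two-column table of years and totals rather than a multidimensional cellset, which would explain the remark about it being much easier for SSRS to consume.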

MS are clear that BISM is the priority now. While MOLAP SSAS isn’t deprecated, the efforts of the SSAS dev team are concentrated on BISM and PowerPivot and we shouldn’t expect any radical new changes. I asked why they couldn’t have just kept SSAS as it is today and bolted Vertipaq storage on as a new storage mode (we will, of course, be able to use SSAS cubes in ROLAP mode against SQL Server/PDW with Vertipaq relational indexes) but I was told that it was seriously considered, but didn’t turn out to be easy to implement at all. The other question I asked was why they are abandoning the concept of cubes and explicitly multidimensional ideas in favour of a simpler, relational model, and they told me that it’s because multidimensionality put a lot of people off; I can see that’s true – yes, a lot of people have been converted to the OLAP cause over the years, but we all know that many relational people just can’t stomach/understand SSAS today. The vast majority of people who use SSRS do so directly on relational sources, and as we know while there’s a great demand for things like Report Builder, Microsoft has had nothing that worked really well to enable end user reporting in SSRS; BISM, as I said, is aimed at solving this problem.

So this is a radical departure for Microsoft BI, one that could go badly wrong, but I can understand the reasoning for it. I’ve been impressed with the technology I’ve seen over the last few days and I know that if anyone can pull this off, the SSAS dev team can. However, the fact remains that in the short term BISM models won’t be able to handle many enterprise projects; SSAS cubes, which can, will be seen as a bad choice because they have no long-term future; and we’re all going to have to tie ourselves in knots explaining the roadmap and the positioning of these products to all of our customers. There’s going to be a lot of pain and unpleasantness over the next few years for me and all Microsoft BI partners. Hohum. As I said, I’ve felt pretty angry over the last few days about all this, but now that’s turned to resignation – I can see why it’s happening, it’s going to happen whether I like it or not, and whether I kick up a fuss or not (I did consider trying to whip up a kind of popular rebellion of SSAS users to protest about this, but doubt it would have had any impact), so I might as well get on with learning the new stuff and making sure I still have a career in MS BI in two or three years time.

What do you think? I would really be interested in hearing your questions and comments, and I know the guys at Microsoft who read this blog would also want to see them too. I’m going to be in Seattle for the next two days and I’ll have the chance to pass on any comments that you leave here to the dev team, although I suspect some of them might be too rude to repeat. I certainly feel better just for having written this post and gotten things off my chest, and maybe you will too.

73 thoughts on “PASS Summit Day 2”

  1. Chris,
    Whilst I am sad to see MDX go the way of the dodo I, unfortunately, completely understand this. MDX/Cubes/Multidimensional DBs/etc… are, whilst being really powerful, difficult to learn/comprehend – I know SQL folks that I have huge respect for technically and personally who simply cannot wrap their heads around these concepts.

    Having spent years trying to educate people into “getting” multidimensionality I view this as a necessary “dumbing down” I’m afraid.

    Fear not – there’ll still be a place for folks like you – that’s one thing I am certain of.
    -Jamie

  2. Chris,

    I have to admit, I went very quickly from incredulity (as in this is nowhere near April 1st) through anger to numb.

    As you point out, there are sound reasons for making the change, but like you I am struggling to understand how we, as BI professionals, are going to manage the transition: that inevitable period when it is obvious that SSAS is walking up the path to the retirement home but Denali is not yet a mature enough technology.

    I am also concerned by the term that keeps cropping up in your piece – ‘beyond Denali’. Many of the things that you fit into this category sound like prerequisites to using the technology effectively.

    Like most BI people, I don’t consider myself to be particularly change averse, but this does trouble me. I guess that as this is inevitable we will just have to see what exactly happens and in what order. I just hope MS can at least keep talking and listening to the loyal BI community that they have during the process.

    Keep Well and hope to see you – possibly near the seaside – in spring for SQLBits 8

    John

  3. As a DBA/Developer who has recently got the BI MCITP on SQL 2008, my reaction is mixed. It is disheartening if the news means companies won’t commit to SSAS projects.
    On the other hand, MDX is not easy to grasp quickly and was a weak point on my exam. The multidimensional concept was difficult at first but I was looking forward to working on something other than AdventureWorks!

    The depth of SSAS is far greater than SSIS and SSRS (imho) and I’m under no illusion that without a project, the certification is worthless.

  4. Chris,
    I think it is really a travesty what is going on these days. Right now I am seeing a massive spike in the number of companies that want to dive into ‘true’ data warehousing operations. So it has taken them a while to warm up to it, and now they finally ‘get it’.
    I mean, I have clients that need true BI… so how is this new mode of thinking supposed to help me do predictive analytics based on month-by-month projections and customer demographics? Sorry MSFT, I just don’t see how I do that in the ole’ 2-dimensional world.
    Get with the program, Microsoft… everything can’t be point and click!

    Thanks for writing this up Chris! Good job!

    AJ

    1. Relational databases are not 2-dimensional but N-dimensional to begin with. Every column is a dimension. Star schemas’ breaking with relational concepts was a mistake to begin with and was never necessary for BI. The only benefit to BI was speed, and this was a result of its divorce from real-time OLTP. Thank you Microsoft for dumping an unneeded, non-mathematical data model.

      1. Chris Webb says:

        We’ll agree to disagree on this one, I think…

      2. In my opinion, MOLAP has 2 major selling points:
        1. MOLAP allows you to use fully normalized snowflake data sources and avoid the pain of supporting denormalized data structures completely. Actually SSAS processes a snowflake faster than a star schema, so why do people keep doing the monkey work of denormalizing? (see http://jesseorosz.wordpress.com/2008/05/28/star-vs-snowflake-in-olap-land/ )
        2. MOLAP gives pivoting, which is a major feature for accounting and ERP applications.

        Why MS could not solve MDX complexity by providing a 2 dimensional LINQ API is beyond me – see LINQ to MDX: http://agiledesignllc.com/products

        Now that MDX complexity is solved, should MS put MOLAP back into its throne? Or is there any reason not to?

  5. I can understand the disdain and reservation, but I’m actually very excited at the promise of BISM. Treating multidimensionality as a transformation captured within a semantic layer is very exciting. This helps us drive to non-Kimballesque physical models in the data layer, whether it’s a data warehouse or a mart. The same data, stored and managed once, can be used to support a wider array of reporting needs.

  6. Chris,

    You know your OLAP history, and I wonder if this feels to you as it feels to me, a bit of history repeating itself. As Oracle did to Express, it looks like Microsoft are now doing to Analysis Services. I just hope MS’s BI technologies don’t disappear down a hole for several years of re-engineering whilst competitors like Qliktech steal all their customers. Good move with the abandoning of MDX and the move towards relational querying – it worked well for Oracle with Express 😉

    So, will the great irony now be, that Oracle (through Essbase) will be the main standard-bearer for MDX….?

    cheers, Mark

  7. I am shocked!

    And that argument that the learning curve for SSAS is too steep – for me that is absolutely nonsense. To be honest, I am a drummer by background, not a programmer or anything else… and I love working with SSAS so much…

    I have to think about it a little bit, maybe I will say something more later…

    Dietmar

  8. I think it’s good. It addresses many gaps in the whole MS BI story and, let’s face it, without a good front end to SSAS nobody is going to use it. IMHO once ProClarity (which was a great front end) got thrown under the bus, it was a sign of things to come. MS needed a true metalayer… finally it is here.

  9. I can’t say I’m surprised – as the announcements on Denali have come out this week it’s felt like there was an elephant in the room. Effectively abandoning your current user base doesn’t sound like great business sense. Maybe I just don’t understand the new platform yet, but currently I support cubes with billions of facts being added per day and complex MDX calculations on top. It’s not at all clear that the future MS platform will support this.

  10. Jason Thomas says:

    Blasphemous, outrageous and maybe even a sense of arrogance – these were the first thoughts about Microsoft’s decision that came to my mind when I first read about this. I just don’t get why they decided to give SSAS the death blow at a time when many of their clients are warming to it. Obviously, now I can’t recommend SSAS to any of my clients because I am sure that it is going to be an obsolete technology within a couple of years. This would obviously pain a staunch MS BI guy like me.
    But then, I thought it over and asked myself – isn’t this a knee-jerk reaction, just our resistance to change? After all, in this world technology is always changing, and we as software engineers are meant to keep ourselves abreast and updated. No technology is meant to last forever; one just gives way to a better one. So with a VERY heavy heart, I am pulling up my socks and trying to learn something new…

  11. WOW! I am initially disappointed and concerned about what that really means for MS BI professionals. PowerPivot currently is lacking enterprise level BI features and it feels like a real set back in the competitive landscape when reviewing other BI vendor offerings. However, many other successful BI vendors do have semantic layers over relational data sources with much nicer interactive, visualization capabilities Microsoft is missing. The art of designing a user friendly, correct, reliable semantic layer across multiple source systems will still require some of the same UDM data design skills with a different delivery mechanism. This move might be taking us a few steps backward with the intent to try and jump ahead down the road. There is definitely some catching up to do with the other BI vendors. It will be interesting to see how this all plays out and where Microsoft lands on the Gartner BI Platform Magic Quadrant next year.

  13. Hello everybody.

    Thank you for your comments and your thoughtful feedback. I am going to respond with an “unofficial” comment from me. TK will provide a full blog entry with more detail.

    First – I want to reiterate what Chris implied but seems to have been lost on some of the commenters: BISM and VertiPaq are all new features of SSAS. SSAS is gaining massive new capabilities in Denali. Nothing is taken away, nothing is deprecated. It is all net positive. I hope you can not only internalize it but also communicate it to others.

    I am going to use an analogy that I find very useful when thinking how the BISM and the UDM live together. (Remember – they both live inside Analysis Services)

    The best way I think of the relationship of the “MOLAP UDM” to the “VertiPaq BISM” is akin to the relationship between C++ and C# in the year 2000.

    • C++ is the powerful, mature tool, but is also complex and demanding. It can do anything, but it is really hard to master. The innovation pace is slow and there are not a lot of breakthroughs possible with the language.
    • C# comes along and offers flexibility, simplicity and time to solution, but it does not claim to be able to tackle the whole workload of C++.
    • Even with C# introduced, C++ is still used for all the high end heavy lifting and it is just as important as it had always been.
    • But C# can be used for a broad set of less demanding applications, dramatically lowering the bar to entry and reducing the costs.
    • While C++ will still stay with us forever, C# is advancing rapidly and is able to take on broader and broader workloads.

    While there are many commonalities between C++ and C#, they are not fully compatible and there is no transparent porting of apps from C++ to C#. Indeed, very few try to do such porting and each such attempt requires careful planning. C# is mostly being used for new applications instead of the old ones. And these new applications automatically get some new capabilities as a virtue of the modern platform.

    And the most important thing: Visual Studio – offering both C++ and C# – is a much better product than the one offering only C++. It offers developers the option of choosing the right tool for the right task.

    Now – replace C++ with MOLAP UDM, C# with “VertiPaq BISM”, and Visual Studio with “SSAS” and you get the exact situation of today. (Here, I did it for you):

    • MOLAP UDM is the powerful, mature tool, but is also complex and demanding. It can do anything, but it is really hard to master. The innovation pace is slow and there are not a lot of breakthroughs possible with the language.
    • VertiPaq BISM comes along and offers flexibility, simplicity and time to solution, but it does not claim to be able to tackle the whole workload of MOLAP UDM.
    • Even with VertiPaq BISM introduced, MOLAP UDM is still used for all the high end heavy lifting and it is just as important as it had always been.
    • But VertiPaq BISM can be used for a broad set of less demanding applications, dramatically lowering the bar to entry and reducing the costs.
    • While MOLAP UDM will still stay with us forever, VertiPaq BISM is advancing rapidly and is able to take on broader and broader workloads.

    While there are many commonalities between MOLAP UDM and VertiPaq BISM, they are not fully compatible and there is no transparent porting of apps from MOLAP UDM to VertiPaq BISM. Indeed, very few try to do such porting and each such attempt requires careful planning. VertiPaq BISM is mostly being used for new applications instead of the old ones. And these new applications automatically get some new capabilities as a virtue of the modern platform.

    And the most important thing: SSAS – offering both MOLAP UDM and VertiPaq BISM – is a much better product than the one offering only MOLAP. It offers BI professionals the option of choosing the right tool for the right task.

    As for the roadmap – MOLAP is here to stay. It will have new features every release (just like we have new important MOLAP features in Denali). Yes – BISM being less mature will see a faster innovation pace and being based on a more innovative foundation it will likely be the one creating exciting breakthroughs as we move forward.

    We worked hard to preserve the investments you made in the UDM and MDX. For example, the BISM supports ODBO and MDX. In fact – this is the only way Excel connects to it. All of the MDX lovers can still send MDX queries and create calculated members in the BISM. This is how Panorama works with the PowerPivot model. AMO works with the BISM as well as with the UDM. etc.

    Make no mistake about it – MOLAP is still the bread and butter basis of SSAS, now and for a very long time. MDX is mature, functional and will stay with us forever.

    P.S.
    We can do more, and we will do more to bring these models even closer together. You may have noticed that we have accelerated our release cycles and are working to deliver the product in fast iterations, instead of those big-bang releases trying to do everything but taking forever to ship (remember SQL 2005?). So even if the integration is not everything you want right now – I hope you can be patient with us.

    (Chris – if you can publish it as a separate blog post it would be great).

    Thanks,
    Amir

    1. Hi Amir,
      it’s very nice what you tell us, but we still haven’t heard anything from you about what’s new in MOLAP.

      MOLAP SSAS has had a long list of to-dos and bugs since the 9th version. What is really coming in the 11th version?

    2. Gavin Russell-Rockliff says:

      Hi Chris,

      Thanks so much for this detailed feedback from PASS. (I missed it this year, and so I’m grateful to be able to follow the news from afar.)

      I’ve also spent the last 8 years learning, understanding and loving SSAS and MDX. (especially coming from a SQL reporting background, cubes were heaven when I found them!)

      So, thanks Amir for clarifying that the last 8 years were not a waste. And I’m very glad to hear that there is a solid roadmap for both flavours of SSAS!

    3. Thanks Amir for this post. After my initial reading of Chris’s post I was not very happy about the direction of MS BI as a lot of my clients had reservations about SSAS until we convinced them to implement it, and now they love it. It is also gaining a lot of traction with smaller to mid size companies in our area finally, and I (like Chris) was dreading the conversations with my clients…especially our recent implementations. After reading your post it makes me feel a lot better about the future of SSAS MOLAP Cubes. Thank you.

    4. Hi Amir/ Chris,

      I have been a Microsoft BI professional, having earned my bread and butter using this technology.
      I have been an avid fan of Microsoft, but with the current approach I feel sad, and it might lead to my departure from Microsoft technology to another stack to earn my bread and butter in the right way.

      My intention in writing here is not to malign anything, but to give the right context for what I am thinking.
      MDX and SQL are worlds apart in terms of technical implementation, but with the right approach both can be learned; it just requires the right frame of mind.
      Everything we do in MDX is possible in SQL, but the kind of analysis and time-series dimensionality offered by SSAS is great.
      Let me be very frank here. Not very many companies use SQL Server for their DW projects; most of them use competitor products like DB2, Oracle, Teradata etc. But they certainly use SSAS for their multidimensional analysis. I have seen it in many places. To date there is no better tool than ProClarity for SSAS analysis.
      Believe me, the day this news comes out most of these corporations will drop SSAS and jump to Hyperion. It’s not that Hyperion is a better product; the only reason is that it will have continuity. I am saying this based on my experience. In the BI world people do not care about money; they care about stability and ease of use.
      Let people say whatever they want, but I am dead sure PowerPivot will not be as big as SSAS in any way, nor will it offer any mature functionality any time soon.
      Now let’s talk about the gap: it would be around 5 years (the whole development and learning path), and I fear that will break the whole existence of Microsoft BI. SQL Server might become just a mere tool for storing some data and some reporting.
      Many things have been killed by Microsoft in the BI space, and they have given nothing back in place of the current technology they glorified. Other BI vendors have already taken the lead.
      Let’s take a pledge to save the current stack, as it is mature and stable. It just requires the right thinking to bring out the best in it, guys.

  14. After a time of thinking, I really want to say something more…

    First of all: sorry for my bad English, which makes it a little bit harder for me to express myself.

    There are a lot of very interesting opinions about all this, and most importantly for me: food for thought!

    Without knowing a lot about BISM, my first thought was – MSAS and MOLAP in particular are the things Microsoft has in BI that all the others don’t! SAP BI uses ROLAP – and the result: extremely poor performance!!! Some have already said it: where will MS BI be in the Gartner BI Quadrant next year?

    And all that while the BI market, especially the MS BI market in Germany, is growing extremely fast – never before were so many MS BI consultants being sought in Germany as now… never before were so many customers willing to go with MS BI, and yes, a lot want to leave Cognos, SAP BI, MicroStrategy to go with MS BI…

    And in such good times for MS BI, such news…

    But as some of you said – it’s normal that technologies change…

    So let’s see what BISM will bring – but to be honest, I am very sceptical…

  15. Chris – Coming from the other side of the pond (Oracle), I can truly understand the points that you have highlighted here, especially on MOLAP and SQL adoption. Having said that, I have been a staunch supporter of anything MOLAP (coming from a Cognos PowerPlay, Oracle OLAP and Essbase background). One thing though – I guess from my perspective end-user adoption drives product innovation. Based on what I have seen, every vendor in the BI area (recently Oracle and Cognos, and the forthcoming BOBJ release) is now pushing for multi-dimensional reporting even for relational sources. So it’s kind of hard for me to understand the MS direction. Any day, any end user with zero technical knowledge (no SQL or MDX) will prefer MOLAP reporting over SQL reporting (ask accountants – they will want to choose the member 2008 {MDX} over choosing a Year column and then applying a 2008 filter {SQL} – they love Essbase or any MOLAP tool for that). At least that’s what I have seen so far. Also, SQL as a language is extremely limited when it comes to multi-dimensional reporting. One thing MS did well, though, was bring in the MDX language. I hope Oracle continues to improve it.

    -Venkat
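    As an aside, Venkat's accountant example can be made concrete. A hedged sketch, using a hypothetical Sales fact table and a hypothetical Sales cube (the table, column and hierarchy names are illustrative, not from any real model):

    ```sql
    -- SQL: the user must know there is a CalendarYear column and filter on it
    SELECT SUM(SalesAmount) AS Total2008
    FROM Sales
    WHERE CalendarYear = 2008;

    -- The equivalent MDX (shown as a comment to keep this block to one language):
    -- the user simply picks the 2008 member from the Date hierarchy
    --   SELECT [Measures].[Sales Amount] ON COLUMNS
    --   FROM [Sales]
    --   WHERE [Date].[Calendar Year].&[2008]
    ```

    The point being made above is exactly this difference: in MDX the year is a member you choose, while in SQL it is a column value you must know about and filter on.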

  16. I must say that as an MS BI consultant and SSAS expert, I have never really believed in PowerPivot and DAX…
    Chris, if what you think about AS dying in two to three releases is true, MS BI professionals specialising in OLAP should start working on PowerPivot right now and gain as much knowledge as they can, quickly. MS is going to kill many BI vendors specialising in AS, and the transition for vendors, customers, reps and specialists will be very hard.
    On the other hand, if what Amir says is true and MS is going to keep AS the way we know it until Vertipaq BISM is mature enough that AS and BISM can be merged without losing features, then this could be a very good thing.
    I'm still wondering why MS wouldn't "simply add" this in-memory storage to AS in the meantime, if they really wanted to keep the AS we know and like alive and competitive with other in-memory products.

    I'm also wondering whether BISM and this new MS strategy is the reason why Mosha left in the first place. Predictable?

  17. Chris,

    I totally agree with you, but also see where Microsoft are trying to push this to lower the entry point and therefore increase adoption.

    My main concern surrounds the more complicated features of SSAS like p-c hierarchies, many-to-many measure groups, utility dimensions, scoped calculations, and (a big one for us) AS Stored Procedures.

    If traditional SSAS is to stay around for a while it would be nice to at least get some performance and scalability enhancements (maybe we'll have to create a datamart with all the tables using Vertipaq column stores, then put the cube into ROLAP mode?). If it's eventually going to die then BISM will need to incorporate all of the above-mentioned structural and calculation features in order for the product to really serve the full customer base.
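    Yacoob's "Vertipaq column stores plus ROLAP mode" idea corresponds to what was announced for Denali as the columnstore index. A hedged sketch, with a hypothetical dbo.FactSales table (the table and column names are illustrative):

    ```sql
    -- Nonclustered columnstore index over the fact table's frequently queried
    -- columns; an SSAS partition in ROLAP mode would then generate SQL that
    -- can be answered by scanning this compressed, column-oriented index
    CREATE NONCLUSTERED COLUMNSTORE INDEX IX_FactSales_ColumnStore
    ON dbo.FactSales (DateKey, ProductKey, CustomerKey, SalesAmount, Quantity);
    ```

    One caveat of the Denali-era implementation: a table with a nonclustered columnstore index becomes read-only, so the datamart would need to be rebuilt or partition-switched on each load.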

    While SSAS as we know it may die, I think it will take a good number of years (at least two more releases of SQL Server after Denali?). Are decision makers still stuck in the old-school way of thinking that they won't have to re-develop after ~6–10 years anyway?

    What are the other options – TM1, Essbase, Pentaho? How viable are any of these for large cubes with very complex calculation requirements?

    Yacoob

  18. What is forgotten in this blog post and all the comments is that it's not about us BI specialists, nor about the technology. It's about our customers, who have problems or ideas that need to be fixed or realised – preferably yesterday rather than tomorrow.

    Seen in that light, I understand the decisions being taken at Microsoft: they give us – the specialists using the technology – a cutting edge over the competition, while we are still able to fall back on current technologies (MOLAP) to get the job done. As a result our future is going to be brighter than ever…

  19. This feels like PerformancePoint déjà vu, but with one major difference: IMHO we are getting something better.

    The basis of being a BI consultant still lies in understanding what information is and how to make it easy to get at – whether that is through SSAS, PowerPivot or BISM.

    Just the other day I advised a client to use PowerPivot (for the Finance department) versus the SSAS solution I delivered for the Marketing department.

    Furthermore, I see sandboxing being a more viable project starting point with BISM than with SSAS. (Sandboxing is something that's already being used in SharePoint 2010 projects.)

    But are we getting something better? Hmm – we are getting another technology, which will probably perform better. But when I look at the 'problems' my clients have with MS BI solutions, it isn't the technology, it's the lack of end-user tooling for the analytic user.

    An analytic user being, say, the marketeer who wants to use Excel to retrieve all relations that have only donated money to project X (and not to any other project). Translated to SQL this would be a NOT IN (in SSAS I would have to write a named set every time the set changed).
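    The "donated only to project X" requirement can be sketched in SQL. A hedged example against a hypothetical Donation table (the table and column names are illustrative):

    ```sql
    -- Relations that donated to project 'X' and to no other project
    SELECT DISTINCT d.RelationID
    FROM Donation AS d
    WHERE d.Project = 'X'
      AND d.RelationID NOT IN (
            SELECT d2.RelationID
            FROM Donation AS d2
            WHERE d2.Project <> 'X'
          );
    ```

    In SSAS the equivalent would indeed be something like a named set built with Filter and Except, which has to be redefined whenever the business changes the exclusion logic.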

    Well; I see my challenge in my work still lies in the same area -> defining information together with the customer.

  20. My concern is that this will push customers away from Microsoft BI (in the short term). Microsoft killed ProClarity and replaced it with a less-than-robust dashboard. But at least you could always make the argument that even though the client tools were lacking, the SSAS server was state of the art. Now that their server direction is changing, I fear that most companies will abandon the platform altogether and maybe look at Microsoft again "when things settle down".

  21. I think it's about time MS started to think about a relational/OLAP hybrid model that removes the pain of dimensional modelling but still allows data to be aggregated with ease.

    I think this new concept is correct, but it needs to embed relational data references: simply put, use the relationships to construct an automated model that is virtual and in-memory, and that can then be saved as an OLAP object. That way both BISM and OLAP are fully utilised as required, the current user base is kept intact and happy, and, importantly, MDX (with its rich set functionality) can still be used.

    Turning in one direction only will result in what has happened to Oracle/BO and other vendors, where the complications of development are hard to deal with.

    I think the BISM model, with a little more thought, is the future, but a lot more work is needed. In-memory quick-fire aggregation is not the whole answer to the information needs of today – people achieve this using many different tools. How about, for example, BISM with the embedding of programs to shape the data from the aggregation partitions before compiling a dataset for reporting?

  22. The way I look at it, BISM is SSAS with a powered-up ROLAP engine inside. When you execute a query against a columnar engine you don't have the overhead of fetching full records, which explains the performance difference against a row-based engine. These queries are generated by the model in BISM.

    Next to this, BISM creates more under the skin (it works top-down rather than bottom-up). So BISM is a knife and SSAS a Swiss army knife, as working top-down will not give you as much control over the detail as working bottom-up.

    I'm curious how BISM and SSAS will operate together and how easily we can switch between the knife and the Swiss army knife without having to do redundant work. We could imagine BISM creating a ROLAP schema under the skin for aggregated operations within Vertipaq.

    But when reading this part of Amir's comment, I agree with Chris that MDX is on a dead-end road:

    “While there are many commonalities between MOLAP UDM and VertiPaq BISM, they are not fully compatible and there is no transparent porting of apps from MOLAP UDM to VertiPaq BISM. Indeed, very few try to do such porting and each such attempt requires careful planning. VertiPaq BISM is mostly being used for new applications instead of the old ones. And these new applications automatically get some new capabilities as a virtue of the modern platform”

    Nevertheless, how the *@# should a customer who is serious about his investments start to operate self-service reporting given this information? The customer will buy a third-party tool – this is a terrific announcement for QlikView, Spotfire and the others, as their banners will announce "Microsoft is re-inventing its BI – would you like to try immature stuff?"

    The best thing Microsoft could do is buy a contender like MicroStrategy and end this playground for internal MS engineers – and fire some of them, given that Report Builder has gone nowhere, the acquisition of ProClarity (which was targeted at self-service reporting) has been dismantled, and the intertwining of SSRS and SSAS has prominently failed because of flaws in SSRS.

  23. Chris, I think many people had the same initial reaction as you did—even if not every one of them is willing to share their thoughts as openly. Hats off to you for being so articulate!

    Amir’s position is reassuring, that Vertipaq is additive to SSAS in the foreseeable future. I think this is a very important point. The existing MOLAP technology will still be there, and Microsoft is recommending it be used for some use cases in the Denali product cycle. However, even though the message was stated clearly that “MOLAP is not dead!” (repeated several times, as I recall), one of TK’s bullet points noted that future investments will be in Vertipaq—which seems to imply that the future MOLAP engine will not differ in capability from the current 2008R2 release. I think for many the excitement of Vertipaq is tempered by this, as MOLAP is an engine we’ve all learned to get some pretty ingenious solutions from, and we were hoping to see what more it could do with further enhancements.

    For myself, I’m very excited about the new possibilities that Vertipaq can provide in the enterprise cube space. I admit I’ve been hesitant to fully embrace Vertipaq for enterprise solutions until now because its current form lacks many features needed in an enterprise cube design. The product team’s briefing and demonstration reassured me they “get” what our clients need from cubes, and gave me hope that a transition to Vertipaq for many client solutions is feasible even with the planned limitations in the Denali release. I do think they got the priorities right.

    Selecting a data storage engine always has “gives and takes”…just as MOLAP vs. Relational does. Vertipaq gives a lot…especially in performance, integration, new semantic layer and reduced learning curve for OLAP modeling. True, its limitations in Denali will be “takes”–and we may find other “takes” during CTP cycle (after all, there’s no such thing as a free lunch!).

    However, I have a feeling that in real-world client solutions the balance will more often point toward Vertipaq than MOLAP in the Denali time-frame. Luckily we’ll still have choice when it doesn’t while we all learn what Vertipaq will ultimately mean for our clients and the state of the art in OLAP technology.

  24. Chris, I suspect this debate will rumble on for some time (not least at my place of work!). My initial take on it is that BISM won't be up to the job for us in Denali – at least not on its own – and it's interesting that Amir appears to be rowing back from the impression you appear to have been given.

    I also agree with Venkat that, like it or not, businesses are multidimensional, and the lack of things such as parent-child hierarchies and role-playing dimensions is a big problem. I also think that MDX is a splendid query language, and that the main reason for the lack of MDX knowledge is the ubiquity of SQL (MDX suffers from 'looking like SQL') and the relative lack of opportunity for developers to really cut their teeth on it. I for one am not looking forward to having to write umpteen JOIN statements if that's what's required.

    Ultimately, however, I think our value as BI professionals isn't necessarily expertise in a particular technology, but the ability to model and understand a business and derive useful insights for it. If MS can't provide all of the tools to do this in the future, there are plenty of competitors – as already mentioned – that will, and using MDX!

    I think MS are smart enough to respond to this market however. After the initial hype I’m not sure how well received PowerPivot really was. It seems a bit niche. If the same thing happens to BISM in Denali I’m fairly confident MS won’t throw the baby out with the bath water.

  25. Harsh says:

    Chris,

    I am still absorbing the news – but something you said strikes me as the now new way of doing things:

    “SSAS cubes in ROLAP mode against SQL Server/PDW with Vertipaq relational indexes”

    … this until BISM fully matures… (5 yrs.?)

    BTW – my initial reaction to the BISM path is mostly positive. We really needed a semantic model – if this is going to give us something like an "entity model" for OLAP, then I am all for it. And let's face it – MOLAP processing was only ever going to scale so far…

    My understanding of MSFT OLAP portfolio:

    1. MOLAP – for MultiDim Complex calc, Small/Medium size data
    2. SSAS/ROLAP/PDW/VertiPaq – for MultiDim Complex calc, Large size data
    3. BISM/ PowerPivot/DAX – for Small / Medium / Large data, Medium complexity calc.

    I think it’s awesome that we have more platform options. But the anxiety is still on the client side.

    There is just no good client tool that takes advantage of all of the MOLAP capabilities. Excel for OLAP doesn't quite cut it. And Crescent and the other new efforts are aimed at BISM.

    Does anyone agree or disagree?

  26. I'm not sure I necessarily agree with the comments that SSAS MOLAP is only for complex scenarios or large enterprises. If you have a reasonably well-designed star-schema data source it's a doddle to set up an SSAS cube. Usually I only allocate a day or two for an initial design and build – I might spend that long on a single SSRS report! Once the cube is up and running you can give business users access via Excel or a third-party client and they can be querying the data within minutes. I've done this at small businesses with comparatively simple requirements with a good deal of success.

    Of course the key to this is to have a well-designed source, and this can take a great deal of time and effort to create. One wonders if this is the real target of BISM. If so, this is potentially a very big deal because you could do away with data warehousing ETL and all of the time and expense that entails.

    That's a big 'if' though, and even relatively small and straightforward BI solutions require such things as historical reporting (think type 2 changes and point-in-time reporting), auditing and data lineage, which I'm not sure you could ever do using a semantic layer.

    It’ll be interesting to see how advanced the data integration capabilities are and how well the BISM will play with MDM-type solutions.

    I also wonder how well the UDM and BISM will integrate. Will you be able to ‘drill across’ from one to the other? Will BISM be able to use the UDM/OLAP as a data source in some way? If so, it would be a much more enticing product.

    It's always good to have a new tool in the toolbox though, and there's no question Vertipaq has the 'wow' factor. In any case I'm very much looking forward to the first CTP that includes BISM.

  27. Chris Webb says:

    I know I’ve been somewhat quiet since this last post, but a lot of things have happened over the last few days. I’ll post again soon with some more details.

      1. Jason Thomas says:

      Is there a forward button? Really waiting for your post…

  28. Chris, thanks for sharing this with us!

    Microsoft is moving towards its "democratization" vision of BI with BISM. This is good as long as it offers a choice between both models, UDM and BISM, as pointed out by Amir and T.K.

    Cheers.
    Nikos

  29. I was at first also very annoyed, to say the least, but after reading this post from TK I'm a lot happier than I was. It indicates that the UDM and OLAP model will still be the premier tool for true enterprise-level, complex BI models.

    http://blogs.technet.com/b/dataplatforminsider/archive/2010/11/12/analysis-services-roadmap-for-sql-server-denali-and-beyond.aspx

  30. Hi All,

    I have been working with the ProClarity BI tool and Analysis Services/SSAS for 12 years now. I have to say that every now and then Microsoft does dumb things that even a 5th-grade student would not do. They purchased ProClarity (God knows for how many millions) and killed it.

    Well, they are using very small parts of ProClarity in that stupid PPS. No one was interested in buying PPS, so they dumped it and now they are calling it SharePoint 2010 with analytics.

    Now they have released another donkey called "PowerPivot" and claim it's a horse that can run 100mph. I am very sorry to say it is not even as fast as a regular pivot table connected to SSAS. So every 12 to 18 months they design some stupid software and expect customers to buy it.

    In my 12 years of experience, I have not seen one tool on the market that works as well as ProClarity. So why did MS buy it and kill it? God only knows.

    Wait till some business folks get multiple sources into PowerPivot and screw up all the results, and MS will come back in two years and say we are going back to SSAS.

    I am leaning away from MS – I will not even buy a home PC with Windows on it. Apple or Google or anything else but Microsoft…

  31. I fully support Microsoft. The day I saw PowerPivot I realised it was the future of SSAS. Look at today's BI market: new ideas and new paradigms are going to change BI – in-memory analysis, column-based storage and massively parallel processing make OLAP tools like SSAS look outdated.
    The main weaknesses of SSAS are:
    1) It is complex to develop for, learn and test
    2) It is difficult to use with large DWs that have frequent updates.
    The overall weakness of the MS BI stack is the lack of a metadata layer. The UDM should have been such a layer, but because of its difficulties (and its strict requirements on sources, for instance) not many reporting solutions really utilise it. I expected MS to build such a layer and they did, at least for reporting (though still not for ETL).

    It's always sad when some good old techniques are dropped, but that's life.

    P.S. I've spent 6 years with SSAS

  32. I am also heavily invested in SSAS as a MS BI consultant. Like many here, I had anticipated that VertiPaq storage would be incorporated into SSAS, positioning it well into the future. So it is with great apprehension that I have to change direction at a time when SSAS is now viewed mature and highly capable.

    I am, however, not dismayed by the technology direction (after Amir's clarification), but I am struggling to map out how to provide architectural guidelines for clients, given that we have a mature SSAS alongside a competing BISM. Perhaps MS should focus as much on how to position BISM in the phase between now and "beyond Denali". It is important that we, who assist the adoption process, are equipped to provide proper guidance to clients.

    I generally work off a base (reference) architecture. I want to make sure that I adapt this accurately and that I am able to substantiate this architecture with MS reference material (and that it makes sense :-))

  33. As a fellow consultant making my living on MSFT, my biggest gripe (having previously been an Oracle ERP/BI developer) is that all these paradigm shifts result in my BI team being less effective at the client site.

    We spend more time trying to manage the technology than delivering business value. Changes like these only create more confusion for the customer, more time devoted to 'aligning the stars', and general discontent among my staff at trying to make it all work.

    back to Oracle.

  34. I have been thinking that maybe the overall Microsoft plan is to put the MOLAP SSAS functionality into the new Azure cloud environment and keep the BI Semantic Model as the user interface to this environment.

    I would bet that all the BI consultants (me included) will have to learn about the upcoming BI Azure functionality.

    In a sentence, I think that Microsoft is planning on moving SSAS to the cloud.

  35. Am I the only one really f-ing confused by BISM and PP, the we're-deprecating-SSAS statements, the no-we're-not-deprecating-SSAS-because-that-will-make-the-MS-practitioners-mad statements, and other baffling statements?

    I can see adding capabilities beyond what a traditional cube can provide, but not if the goal is to undermine the whole current market and foundation of current implementers.

    Greater products (and companies) have gone to pot when their market is confused about what they're offering. They need to keep it simple, because obviously the Windows juggernaut will not be able to fund these misadventures for too much longer. And miffing your most prominent practitioners with confusion and obfuscation is only going to make for more skittish buyers and will scare current MS BI supporters away from the platform. Maybe they don't care because it's self-serve?

    Don't anyone tell me that there wasn't a technical way to extend a fundamental SSAS cube and provide backward compatibility. MS used to bend over backwards to keep prior technology working going forward… are they too arrogant, conceited or lazy now?

  36. One more comment on this: I'm going to predict that when you add up all of the DAX and other things you have to do to get a PP model to do the same as an ordinary cube (functionally), you'll still end up with a lot of confusing architecture that is no simpler to understand. Just spend 15 minutes browsing some of the PP forums and see how folks get the model to do what they need it to do for the client. No less complex.

    It's data itself that is multi-dimensional. I find that most people, when trained correctly, can understand multi-dimensional concepts very intuitively. Again, data itself is multi-dimensional in reality; it's relational data that is more often a mis-modelling of how the data really is, and the cube is an attempt to "un-relationalise" it back into how the data is really related (by de-normalising, and so on)… not the other way around.

    My guess is that it's the interfaces where many cubes fall short. ProClarity was pretty good, but I've seen far better before that. Excel's rendering of cube data and multi-dimensional data is pretty pathetic… there are about 50 things that suck about how it presents the model.

    Time will tell if Microsoft’s thinking is right on this… if I were betting, I’d bet against it though.

  37. So the only argument that Microsoft had against MOLAP was MDX complexity, wasn't it?

    All right, given that this MDX complexity is now completely solved by LINQ/SDX (see http://agiledesignllc.com/LINQX_vs_LINQ_vs_MDX), should they revert their focus back to MOLAP?

    Why? Because BISM will never do what MOLAP does today (e.g. pivoting and working effectively with normalised snowflakes), and also because the existing investment in MOLAP is sizeable and deserves some respect and support.
