Migration From Analysis Services Multidimensional – Your Feedback Needed!

Do you have Analysis Services Multidimensional cubes in production? Although I know it’s a long time since I last posted any Multidimensional/MDX content here I hope I still have some readers who do. If so, then you may be able to help me.

The reason I ask is that in my current job at Microsoft I’m working with some colleagues to investigate what it is that prevents people from migrating away from Analysis Services Multidimensional to Analysis Services Tabular, Azure Analysis Services, Power BI or indeed any other BI platform. Is it missing features? Is it organisational inertia? Cost? Is it the fact that your Multidimensional cubes still work well and there’s no point in migrating when you wouldn’t see much benefit? Something else? Has the idea of migration ever even crossed your mind?

In particular, what I need is:

  • Examples of Analysis Services Multidimensional cubes you have in production. All I want is the Visual Studio project or an XMLA script of the database, I do not need or want your data. Please leave a message for me here if you’re willing to do this and I’ll let you know where to send your cubes to.
  • Your thoughts on this subject – please leave a comment below. You know how I love a good argument discussion!

I already have plenty of ideas and theories regarding this topic, but what I need is hard evidence (hence the request for the cube definitions) and quotes from actual customers.

Last of all, don’t read too much into this: it’s a research project, nothing more. I can’t comment on, or make any promises about, the future of Multidimensional or new features that might be added to Analysis Services Tabular or Power BI.

53 responses

  1. Hi Chris,

    I could send the XMLA through, but to be brief: the power of SCOPE assignments in MDX is what prevents us from migrating certain applications to Tabular. Not that there is no way to do it at all, but all the possible Tabular solutions are rather awkward compared to the elegance and brevity of SCOPE statements.
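    For readers less familiar with MD, a SCOPE assignment of the kind mentioned here might look roughly like this; this is a hedged sketch, with the cube, measure and member names invented for illustration:

```mdx
-- Hypothetical example: overwrite one measure for one slice of the cube
-- in a few lines of the MD calculation script.
SCOPE ( [Measures].[Sales Amount], [Date].[Calendar Year].&[2020] );
    THIS = [Measures].[Sales Amount] * 1.1;
END SCOPE;
```

    In Tabular, the equivalent behaviour typically has to be rebuilt inside each affected DAX measure (or, more recently, with calculation groups), which is the awkwardness the comment refers to.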

  2. Hi Chris, from my experience at my current employer there is a fear of change. All the cubes/processes/ETL are interlinked (I harshly call them inbred) and so making any change or improvement is a massive task as so many stakeholders have to test their data and approve the change to get it into production. There are massive opportunities for our business to improve data literacy but I don’t think I am doing a good enough job of communicating to leadership the opportunities. I will pass on your article to our head of IT to see if we can be involved.

    All of the above is pre-Covid; now we have the issue of reduced earnings, which will be another barrier to change.

  3. Hi Chris, I worked for a client two years back using Multidimensional Analysis Services. The only reason we took that route was that the first phase of the project (by Microsoft India) used it, and to this day I have no clue why on earth it was chosen when the reports had to be built in Power BI. A single instance of Analysis Services can only hold one type of model, so we had to stick with the same model type. I was new to Power BI with Multidimensional, so I couldn’t push hard enough to change the approach. As a result we suffered a lot building complex reports in Power BI against Multidimensional cubes. We had built SSIS, SSAS (Multidimensional) and reports in Power BI (with the limitations of the 2018 version); we even had to implement dynamic row-level security in the SSAS part. Now that it was all built and put into production two years ago, I don’t expect the client would go back and build it again: migration means both the SSAS part and the Power BI part have to be rebuilt! I can give you the XMLA of the structure for research purposes, and any other details needed, but it would not be surprising if the client doesn’t consider migration at this point, as it would involve cost and time again.

  4. Chris, lack of support for parent-child hierarchies stopped us. Parent-child is the most natural way I know to specify business unit and accounting hierarchies. We hoped for improvements, such as the ability for a node to have multiple parents, and improved performance. Flattened hierarchies improved performance, but it was excessively troublesome to keep building them from parent-child representations in the database. Eventually, we stopped paying attention to tabular. And because multi-dimensional has not improved, we have designed our own subsystem for rapid aggregation; modern hardware and software technology make this not so difficult. Leaving Analysis Services also cuts off the cost of SQL Server licenses, which substantially exceed the license fees for the software packages that we used to build on top of it.
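    For context, flattening a parent-child table in Tabular is usually done with DAX path functions; a minimal sketch, assuming a hypothetical BusinessUnit table with UnitKey/ParentUnitKey/UnitName columns, might look like this:

```dax
-- Calculated columns on the hypothetical BusinessUnit table.
Hierarchy Path =
    PATH ( BusinessUnit[UnitKey], BusinessUnit[ParentUnitKey] )

Level1 =
    LOOKUPVALUE (
        BusinessUnit[UnitName],
        BusinessUnit[UnitKey],
        PATHITEM ( BusinessUnit[Hierarchy Path], 1, INTEGER )
    )
-- ...and so on for Level2, Level3, up to the maximum expected depth,
-- which is exactly the maintenance burden described above.
```

    Each level becomes a separate column, so any change in hierarchy depth means adding columns and reworking the hierarchy by hand.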

    • +100. If MS can’t make DAX as useful as MDX, maybe they should provide better support for SSAS MD in their tools like Power BI (you can’t even create a calculated measure over an SSAS MD live connection…)

      • Agreed! Microsoft really needs to refocus on MD. Tabular might be great for small, department-level projects, but MD has the features you need to scale, in both capability and data size.

        These are two different products, for two different needs. Microsoft must stop trying to think Tabular is the replacement for MD. It is not.

      • MSFT dispersed or lost everyone from the old SSAS Panorama team. As a result, we haven’t seen any improvements since.

  5. I used to create Analysis Services cubes for an insurance company, even before the advent of BI tools like QlikView and Tableau.

    My cubes became obsolete not because of a tendency towards fancy features – like web-based report generating/publishing, et al – like what the QlikViews/Tableaus provide; BUT instead because of the tendency to simplify. Users did not want to leave Excel or learn any more than they already knew of Excel – they only wanted a “way” to make it easy to get data and then use their Excel skills. You would think that this means Power BI, but not exactly. Yes, many took to Power BI, but a good majority still think Power BI requires skills that will take them outside their comfort zone. Somehow there is this “basic” set of Excel skills that many end-users do not want to get outside of. I do not understand exactly what is outside/inside, as I am technical enough NOT to recognize it – but it is there! I guess one way of illustrating this: if there are people who call themselves Power BI/Power Query experts – and differentiate themselves to get better money or prestige than regular Excel users – then this proves my point! The majority of the Excel population will think that Power BI is not for them – perhaps they will encounter something in Power BI that will stump them/stop them in their tracks.

    As I said, I am not exactly sure what causes this phenomenon or what Microsoft can do about it. I have some suggestions:

    1. Fully integrate – remove names like Power BI (Power Query is already part of Excel, so you are half-way there). Fully remove the “Power” branding and go back to just “pivot”. I know you are already doing this, but most “regular Excel guys” do not use Power Query where they have to OWN up to SQL code or some procedural extract process – this seems to be a no-man’s land; this has been my observation for the past 15 years.

    For example, rename Power BI Desktop as Excel Server!!! The old Excel Server was not really a server 🙂 This new server will have PBID as well as the ability to schedule macro/add-in execution.

    2. Then provide guarantee/sociability/marketing/evangelization that the “new features” (nudge/nudge/wink/wink) are really for regular Excel guys.

    I can have some more suggestions, of how I personally work with these “regular Excel guys” and provide added value to them – an approach to successfully integrate PBI or SQL without breaking the tough “regular guy” image. Can talk more if you want.

    • I don’t think you can change the Power BI brand at this point. You need to push your users to the future. We had slow traction on Power BI at first, but our Excel people have figured out Power BI is the future. Sending people to the Microsoft Dashboard in a Day seminars was the turning point. We also have done a lot of promoting of Power BI and a little bashing of Excel. We compare Excel to land line telephones, VCR tapes, and film cameras. They still work, but only grandma still uses them. The rest of us have moved on.

      • Sorry to disagree. My case in point: Redmond re-brands Office 365 to Microsoft 365. IMO, Power BI has a perfect narrative to be re-branded as Excel (e.g. Excel Server 2.0, because of standalone scheduling). The idea draws on innovation diffusion theory, which says that the Excel population (1 billion) is more likely to adopt new innovations (that Microsoft sells) if they feel they have accepted them before. In theory-speak, instead of having a few “innovators” (aka Power BI users), you suddenly have 1 billion “early adopters” simply by re-branding.

        To summarize, and to keep on point with this thread: Power BI is the best way to intermediate for MDX, AS, Tabular, or any non-MS cube tools for that matter (arguably making it moot to migrate away from these). However, Power BI lacks one thing – as an innovation, it fails to overcome organizational adoption inertia. I put it to Microsoft to draw upon innovation diffusion theories to see how to jump-start the adoption curve, e.g. by rebranding.

      • You may be right that rebranding could help reach the Excel crowd. That isn’t under your control though. The best strategies for internal adoption are to create hype, make training available, and show success stories. Microsoft didn’t call it Excel 2.0, but you can! Tell them it’s like watching VCR tapes when there’s Netflix. Point people to Microsoft’s free Dashboard in a Day training. Show cool dashboards.

        At my company, availability of Pro licenses and Premium capacity were the 2 biggest barriers. Once we made it simple to get a license and a Premium workspace, adoption has been robust.

      • Thanks, I guess you are beginning to understand what I am saying/predicting:

        IMO, within a few years, PBID could become say, SSES (SS Excel Services), just like PowerQuery got folded into Excel desktop! This will jump start adoption.

        Meanwhile more power to you for doing your best to get robust adoption of PBI in your workplace.

    • As MS tends to call all the things “Power” these days, I think it is more likely that the next version of Excel will be renamed “Power Excel” than that Power BI will lose its “Power” part.

  6. Security at the perspective level; cell-level security; the effort of rebuilding.

    A migration tool would help.

    Costs for in-memory models.

    Performance for DirectQuery models.

    Aggregation support must become more mature.

  7. Some customers have to stay on premises due to regulations, but composite models, dataflows and dataset sharing are not available in PBIRS. As the Tabular model is focused on in-memory usage, DirectQuery sources are very limited in SSAS. If a company wants to offer a self-service metadata layer without moving all its data into memory, and the data size is huge, the Tabular model is not the answer. Comparing SAP BO Universes with the Microsoft stack, especially in the on-premises world, I guess Multidimensional cubes are still used by customers.

  8. The main reason for us is that we are on SQL16 and we need the Tabular features of SQL19. So not only would we have to migrate from Multidimensional to Tabular, we would need totally new servers (SQL19 requires a new OS as well) and probably different memory requirements too, then install and migrate everything to SQL19. It seems like a huge amount of work when SSAS MD is working perfectly fine for us.

  9. Hi Chris,

    One of the main reasons for not migrating is how labour-intensive it is to migrate a complete cube solution to Tabular. Besides the migration of the solution itself, a lot of reports are also built on the cube; those would need to be rebuilt, checked, etc.

    In theory, if you created a Tabular model with the same naming convention for the attributes/dimensions and measures, it would all keep working… that would be the dream…

    I have been working on converting XMLA code to JSON, but I’m running into multiple issues: not with the conversion itself, but you then get confronted with all the differences. For example, an MDX query uses the key of an attribute to build the MDX, and you don’t have key/name properties in Tabular.

    If you would like to change thoughts about this subject don’t hesitate to contact me.

    Nice to see you talking about cubes again ;-).

    And thanks for bringing attention to this subject.

    Regards,
    Diederick

  10. The main reason for us is that the shareholders do not trust keeping data in the cloud, even in the Microsoft cloud. There would have to be substantial savings in order to justify such a migration.

  11. Hi Chris,

    Although not necessarily what you asked for, we recently (a year ago) migrated from multidimensional to tabular, and there were a couple of issues we ran into. Some issues we simply just accepted and got rid of the feature that didn’t work, and some we found workarounds for.

    – All our Power BI dashboards and Excel sheets connected to the multidimensional model would break. Even if we gave everything the same name, Excel or Power BI would still not keep the visuals or tables when changing the data source to the new model. Since we have a lot of dashboards and sheets around, it took a lot of time to remake some of these on the new model.
    – No default member. We used default member for how we handle currencies, where if you did not specify a specific currency, we would pre-select EUR (we’re mostly European based). To do this in tabular, we need to build this “select EUR if nothing selected” logic into every single financial measure, which led to a decent amount of duplicate code, not to mention something everyone has to remember to do for every measure. There are other examples where we needed this default member, and we had to duplicate the code for this logic to every measure that would normally be filtered by default.
    – No dynamic format strings. Yes, you can use the “format” function, but this is not practical, as Power BI will then interpret the value as a string instead of a number, so most graphs won’t work. We used dynamic format strings for currency fields as well (showing the euro sign, dollar sign, etc., depending on the selected reporting currency). In the latest RC version this is now possible via calculation groups, but this was not available when we migrated. Our “fix” was simply not using dynamic format strings and showing all currencies with the same format string (so no EUR, USD, etc. signs).
    – SCOPE functionality was really useful for a lot of different things. We managed to do all of the things we did before in DAX, but a lot of the solutions involved more or duplicate code.
    – The work required and risk of something going wrong. Ultimately, we had to stop all feature development for around 1.5 – 2 months (we also had to do some changes in our data layer to accommodate), and there is always the risk that you somehow end up with different results due to complex MDX calculations being translated into DAX incorrectly, or some edge case being missed. We had a very thorough testing environment with automated tests that we could re-use for tabular, but it was still a sizable risk. Ultimately it was 100% worth it since it allows us to develop much quicker, but it was pretty hard to sell initially.
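    The default-member point above is worth illustrating: the “select EUR if nothing selected” logic might be sketched in DAX roughly as follows (table, column and measure names here are hypothetical), and this pattern has to be repeated in every affected measure:

```dax
-- Hedged sketch of the fallback-to-EUR pattern; not from the real model.
Sales Amount (Dflt Currency) =
VAR SelectedCurrency =
    IF (
        ISFILTERED ( Currency[Currency Code] ),
        SELECTEDVALUE ( Currency[Currency Code] ),
        "EUR"  -- fall back to EUR when no currency is selected
    )
RETURN
    CALCULATE ( [Sales Amount], Currency[Currency Code] = SelectedCurrency )
```

    In Multidimensional the same behaviour is a single DefaultMember property on the dimension attribute, which is why losing it produces so much duplicate code.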

    Hopefully any of the info above can be useful to your research!

    Kind regards,
    Ron

  12. We still have many Multidimensional cubes and are currently focused on implementing new demands with Power BI Premium. It is going to be a lot of effort to move those existing mdx-cubes to Tabular, i.e. dataflow & datasets, so in part we want to understand our existing PBI environment better to make the right design decisions and secondly, we will have to move users away from their existing way of working (using Excel) to PBI and need the time and manpower to do that.

  13. Hi Chris,
    I manage all the back-end reporting copies of the SSAS data for the Virginia Waiver Management System. I use MDX queries through msmdpump.dll from Excel VBA. All queries are slices to get back to the original RDB tables: ‘SELECT NON EMPTY CROSSJOIN(fields list) on rows, Measure on columns FROM [Model]’. I had no choice: the SQL Server was put into Tabular mode, and from then on this server lost the ability to listen to SQL queries. So I spent months learning MDX.
    They have not allowed the cube to be copied, which I understand would be a 1-line command. Seems to me that a local copy of the cube may be most useful.
    At least it is fast. The few dozen slices take only 15 minutes each morning to stream from the data center through my PC and back to another data center.
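    The single query shape described above might look something like this in full; the dimension, attribute and measure names here are invented for illustration:

```mdx
-- Hedged sketch of one "slice" query that reconstructs a relational table.
SELECT
    [Measures].[Record Count] ON COLUMNS,
    NON EMPTY
        CROSSJOIN (
            [Client].[Client ID].[Client ID].MEMBERS,
            [Waiver].[Waiver Type].[Waiver Type].MEMBERS,
            [Date].[Status Date].[Status Date].MEMBERS
        ) ON ROWS
FROM [Model]
```

    Each such crossjoin of leaf-level attribute members effectively flattens the cube back into one of the original tables.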
    Contact me any time!

  14. We are in the process of converting our cubes to a Power BI Premium model. Microsoft has addressed many of our concerns regarding missing features over time. The remaining problems are:
    -No default member. We use this heavily in our cubes.
    -No object/column level security.
    -No integrated Git support.
    -Underwhelming development tooling. Power BI Desktop just isn’t designed for enterprise work. It’s a pain to switch back and forth between PBI, Git, Tabular Editor, DAX Editor, Visual Studio, and ALM Toolkit.
    -We have ~1300 measures and the MDX conversion to DAX is a huge amount of work!
    -No external data refresh from Excel Online to a Power BI Shared Dataset. What?!

    There are other sticking points as well, but they are less important or we’ve found ways around them.

  15. Two things that come to mind that are still lacking support and create very cumbersome workarounds (note, these are not necessarily showstoppers):

    Parent/Child Hierarchies
    Default Member, along with the IsAggregatable property

  16. A few things. A need for a migration tool is key, rebuilding our multidimensional footprint would take a large effort for what seems like a small value right now.

    The other main reason we haven’t moved to tabular or recreated them in Power BI is sheer costs to run In-Memory models, it’s just not there yet. Large instances are just cost prohibitive today.

    I will also second comments on:
    – Underwhelming development tooling in power BI. Power BI Desktop just isn’t designed for enterprise work. It’s a pain to switch back and forth between PBI, Git, Tabular Editor, DAX Editor, Visual Studio, and ALM Toolkit.

    – Direct query is interesting, but performance is a blocker there.

  17. I use MDX queries to SSAS 2016 Tabular server to return CROSSJOINs – slices that are the original RDB tables. Only because: 1. Server is set to Tabular, so it will not hear SQL queries. 2. They will not permit cube copy to a local copy. 3. The cube was not designed properly to include time, business rules such as Status, etc.
    This is the data for the statewide Virginia Waiver Management System reporting system.

  18. This isn’t what you asked, but other feedback to throw into the mix…

    As you are directing people to move to Tabular, we are having internal debates about moving from Tabular to Power BI Desktop. The original driver for SSAS was to have a single datasource for multiple Excel reports (pre-Power BI Desktop). With Power BI datasets as a valid datasource, some of our users are wanting to get out of Tabular.

    The problem isn’t SSAS as much as SSDT (SQL Server Data Tools) in Visual Studio. Our end users are not IT people and Visual Studio is a finicky, unstable beast. We constantly get errors, crashes, and mystery problems when our VS files get out of sync with SSAS because someone forgot to save in VS or more likely because VS crashed. VS isn’t very stable (though it is much improved from 2 years ago). I get an error message that doesn’t seem to mean anything every time I deploy a model. Our last sev 1 error took so long to resolve, we don’t bother asking for support unless it’s broken.

    PQ sucks in Visual Studio. It uses menus instead of the ribbon and it’s hard to find basic functions. Working with queries in a large model is a nightmare. We need to see all the queries in the model, and VS doesn’t work that way. It only shows the query for one table at a time. To see another table, you have to close PQ and navigate to the other table and reopen PQ. The workaround is to use Expressions, which is a pain. Our less savvy users mix up queries and expressions all the time, and you end up needing to look at the queries anyway. You also can’t re-order expressions in the PQ window (you can, but VS doesn’t save it). Once you get over 20 expressions, it’s a mess to find anything.

    We also do a lot of imports of folders of CSV files, and PQ can’t do that without creating functions and sample files–3 additional queries for every CSV folder. It clutters up the model and is extremely problematic to clean up. They are hard to rename and delete, and generally we don’t want a function. We want all the code in the main query, not in a function. I don’t like the way Power BI Desktop creates functions for processing a folder of CSV files either, but at least Power BI Desktop names the queries better and groups all the function queries into a folder and lets you reorder everything.

    Version control and archiving in SSAS is too difficult for our users. They cannot handle code repositories. Managing a single .pbix file would be much easier for them. Using Tabular also means they need a dedicated IT guy on the team. With just Power BI Desktop, they would definitely need less IT support. SSAS skills and web articles are hard to come by. Power BI Desktop is very popular and well supported by comparison.

    All that to say our users dislike Visual Studio so much they are ready to ditch Tabular for Power BI Desktop.

    There is no easy way to move from Tabular to Power BI Desktop, so anyone creating new models should think twice about where to create them. If you’re coming from Multi-dimensional, chances are the model is big enough that Power BI Desktop is going to be a problem, but I wouldn’t do everything in Tabular. Small models and user-supported models are probably best done in Power BI Desktop.

    We have experimented moving our SSAS models into Power BI Desktop. The PQ Source statements have to be rewritten, but the rest tends to work. Some of our facts tables run out of memory when we process them, so we are looking at work-arounds such as processing the models on a server with more memory, doing partitioning, etc.

    I’m not sold that Power BI Desktop is the answer, but given the lack of attention on SSDT in Visual Studio, SSAS does not feel like a strategic platform. It feels like a hastily built tool that has been abandoned for lack of use. I rarely hear SSAS mentioned in any training or webinars these days, even from the Italians!

    I would love to hear any thoughts or opinions on our situation, or even good places to ask the question. Where do SSAS Tabular people hang out?

    • An alternative which works well for me for RDB up to 300,000 records per table is to use xlSQL and Excel files. All scripts fit in 1 2MB workbook. All WHERE filters fit in 1 autofilter worksheet.

    • “3 additional queries for every CSV folder.”
      If you are ready to hand-author the query, you need one line of code:
      = Csv.Document(Binary.Combine(Folder.Files("C:\DATA\CSV1")[Content]))

  19. From my perspective, as an end-user who understands Tabular thanks to Power Query and Power Pivot in Excel and Power BI, these are the things to consider:

    > IT departments tend to be more interested in larger-scale and newer technology trends, attempting or proposing to replace traditional OLAP with data lakes or other data virtualisation tools like Hadoop.

    > Non-IT departments are making advances using Power BI, and this leads to the creation of new Tabular data models instead of using the existing ones. This is especially the case when people have more SQL training than MDX training.

    > People want (and believe they need) all their data in one place. They are convinced an OLAP cube is the answer.

    > It is hard to prove the cost-benefit case for change.

  20. Hi Chris, in a nutshell I am moving away from SSAS multidimensional and towards Snowflake + Power BI. Also moving away from SSIS: the two (SSIS & SSAS) together require too much work for developing and maintenance.
    Cheers!
    Francesco

  21. We are in the midst of moving away from cubes to Snowflake/Power BI; we faced a lot of challenges along the way and are still not past them.

    1. Even though Microsoft claims that Power BI has all the power of SSAS, is that really so? What enterprises really want is self-service BI backed by an enterprise-level data model with all facts and dimensions, so that they can drag and drop any fact or dimension at will. I’m not sure Power BI is there yet in terms of handling that scale. And the incremental refresh feature is unstable, with huge limitations in terms of memory/capacity.

    2. At a theoretical level, columnar databases were supposed to replace OLAP (or at least that’s my view). Does it then make sense to have one data model (a star schema in Snowflake) and then replicate it all over again in Power BI?

    • “the columnar databases were supposed to replace OLAP”
      you can do that by using a columnstore index in a ROLAP cube
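      As a sketch of that idea (the table and index names are invented for illustration): a clustered columnstore index on the relational fact table lets the ROLAP partitions of an MD cube scan it in columnar fashion:

```sql
-- Hypothetical fact table; the MD measure group partition is set to
-- ROLAP storage so that its queries hit this columnstore directly.
CREATE CLUSTERED COLUMNSTORE INDEX CCI_FactSales
    ON dbo.FactSales;
```

      This way you get columnar-storage performance without replacing the OLAP layer itself.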

  22. Our company has both MD and Tabular models. From my experience, the Tabular model is easy to build but is certainly not as flexible as the MD model. The main reasons for us not converting the MD models to Tabular are that the Tabular model does not support the following features:
    1. Ragged hierarchies.
    2. Parent-child relationships.
    3. Named sets.
    4. Easy manipulation of dimension members using MDX from a client tool such as Excel.

    I truly hope these features will be available in the Tabular model in the future!

    Thanks,
    Wenchi

  23. Features:
    I slightly miss the Key/Name/Value logic of Multidimensional dimension attributes, but I know how to handle it.

    Cost/Time:
    Replacing existing cube models with Tabular models costs time: it’s the data model migration but also the report migration, and time is scarce.

    Anyway, as far as I remember this was the recommendation: build new models in Tabular but keep old models as they are.

    Some tools are not yet able to handle DAX.

  24. Chris, Ron Oudgenoeg covered quite a few of the key issues, along with the lack of parent-child support, and the fact that Tabular is memory-only, which limits size. But from a design standpoint, Multi-Dimensional is better! If you are serious about building data warehouse data models, Multi-Dimensional is just the tool to use. I use Multi-Dimensional because it works, MDX calculations are simple, and I know it will handle the extensions that the business WILL ask for in the future. Tabular is still years and years away. It is slow, cumbersome, and has a terrible design model. Tabular seems to be OK for simple, quick-and-dirty stuff, but not real-world, serious designs. So please tell Microsoft: Multi-Dimensional is really the better tool.

  25. Hello, as a consultant to several clients in Argentina, I can tell you what the restrictions are in companies.

    The biggest problem is that, beyond the fact that many of these Multidimensional cubes are working very well, their migration to Tabular is not simple. In most of the cases I have seen, you have to do the project again from zero, and this is what stops my clients from migrating. Yes, the new cubes are built in Tabular, but for the legacy cubes no budget is allocated for migration, for this very reason: it would mean starting almost from zero.

  26. I have a cube in production on premises at the Danish foreign ministry. The cube is one version of the truth for DANIDA reporting and number controlling, and it’s almost a process tool, with actions to source systems, drillthrough on calculated measures, and member property info. Actions and drillthrough in particular are core features. I think drillthrough might be possible to replace now, but they are not ready for the cloud yet, so it would have to be Tabular on premises, and there actions using cell coordinates might be a problem?

  27. Chris, you mentioned this is a research project. What research model lens are you considering? Your research questions appear to point to the adoption journey of Power BI (and by extension Analysis Services). Has your research project considered models of adoption of technology-based innovations (e.g. Power BI)? There are a few to consider:- E.Rogers’ original Diffusion of Innovation (DOI), Technology Acceptance Model (TAM) and Actor-Network Theory (ANT).

    I am also a researcher on this topic. I specifically researched technology-based innovations (TBIs) and their impacts on actors in the organization. See the link to my doctoral dissertation: http://hdl.handle.net/1959.14/286395

    In my dissertation, the theoretical model was Actor-Network Theory (ANT) and one of my findings was that patterns of adoptions of TBIs (such as Power BI) are not linear where acceptance is implied, but rather progressive and precarious. In other words, success stories are not enough (e.g. successful conversions of MDX to Power BI, or even emergence of successful Power BI users) but rather through the slow way that current systems (inscriptions) are strengthened and new inscriptions appear. And typically too, expect that any successful adoption of Power BI (or the migrating away from AS MD cubes), will have unintended consequences such as the changing of experts v. end-user relations (e.g., end-users create their own communities where the experts are disintermediated).

  28. G’day Chris,
    we attempted to migrate multi-dimensional cubes to tabular for a client ~2 years ago, however we ran into performance and memory issues on the 64GB server. The cubes were pretty large, and some of the shortcomings in Tabular bit us harder than we anticipated. Some of those shortcomings have since been addressed (e.g. partitioning), but certainly not all. We were engaged at another site to reverse engineer a Tabular model BACK to MultiDimensional, as the server RAM requirements were growing at a significant rate each month with the Tabular models.
    A lot of our issues are covered by other comments already posted here, however the biggest issues we had in migrating off multi-dimensional to tabular included:
    – cost of migration (and the SLOWWW development environment in VS for Tabular)

    – Dynamic dimension security support in MD is much more flexible. In this instance, dynamic security was being driven by AD group membership controlling access to members across multiple dimensions.

    – Lack of support for Parent-Child or Ragged dimensions

    – Lack of support for Named Sets

    – The need to DUPLICATE dimension structures instead of a single role-playing dimension is costly for LARGE dimensions (& just plain annoying for the developer)

    • Agreed.
      We have a tabular model whose size will grow rapidly soon. Given the nature of an in-memory DB, model size is capped by how much memory a system can have. We have already started spending a lot of time looking for ways to control the size of the model.

      • I think Microsoft should make the effort to improve the Multidimensional engine and improve its integration performance with Power BI Desktop, instead of eliminating it.

      • Absolutely agree. Some scenarios are good for Tabular; however in other scenarios, in particular within Finance/FP&A etc., MD is presently irreplaceable. The lack of proper support for MD SSAS in Power BI is something that holds us back from considering Power BI as our single strategic platform.

  29. Hi Chris,

    I’m glad to hear something like “Microsoft hasn’t forgotten Multidimensional totally”, even if you try not to raise expectations too much.

    Here is some input from my side.

    My main point is: please do not look only at what Multidimensional can do and Tabular can’t do yet. Have a look at what people would have liked to see from Multidimensional in the past years, when we had hardly any new features for it.

    I’m responsible for a financial reporting application in one segment of a German telecommunications provider. We switched from one of the pure multidimensional databases (like Infor OLAP/Alea, Essbase, TM1) to Analysis Services 2008 R2 in 2010; meanwhile we are on version 2017, still Multidimensional.
    We are still using something like Infor/Essbase/TM1 for planning, since even AS Multidimensional is not really sufficient for this.

    The main point is hierarchies. I’ll list the points that are relevant for me.

    1. Full flexibility with many-to-many and unary operators

    It is typical that there is a need for elements to be aggregated in more than one parallel hierarchy, in combination with aggregation signs like +1/-1, for example:

    Europe
    – Austria
    – Germany
    – Poland
    – UK

    Europe w/o UK
    – Europe
    – UK (-1)

    ————

    Accounts: US-GAAP as a parallel structure to IFRS, HGB as a transition from IFRS

    IFRS
    – Thousands of accounts

    US-GAAP
    – Thousands of accounts with different aggregation

    German HGB structure
    – IFRS plus / minus some accounts

    Actually I discussed this issue with you, Chris, after a SQLBits session you did (2011 or 2012 in London).
    You confirmed that the combination of n:m and unary operators didn’t work.

    2. Data Entry elements

    It is typical to have leaf-level elements holding data for values that are not as granular. So even if a hierarchy looks well structured in levels (not ragged/asymmetric/m:n), the data entry elements still don’t fit well in the base-level dimension table.
    (At this point I must say that I’m not very familiar with Tabular yet; at least I haven’t seen a concept for this in Tabular.)

    3. Security / visual totals

    The security mechanism must allow restricting elements at all levels. I never liked the Visual Totals approach in Multidimensional, but at least it more or less did what was needed without using cell permissions.
    Example: in an org dimension with levels segment/legal_entity/cost_center I would like to be able to grant permissions on one legal entity. The respective person should see neither any segment (even in a dimension element browser) nor any cost center.

    Besides from Hierarchies:

    4. Measures

    Some of the client tools we are using with Multidimensional hardly make any distinction between measures and standard dimensions.
    This fits better for me than a very measure-centric approach.

    Example: what would be the best measures in the following list of structural elements?

    Accounts: profit, expenditures and several thousand more
    Process: entry – adjustments – adjusted – consolidation – consolidated – special factors (sf) – sf adjusted
    Unit: 1, T, Mio
    Flow: opening balance – deductions – additions – closing
    Period: month to date – year to date
    Versions: actual, forecast, runrate, budget

    Accounts would be my first candidate, when I just think about “what do I want to measure”. But it is not really realistic.
    In Power BI, some visualizations for variances suggest that versions are the measures. I don’t like this way of modelling.
    I know that with calculation groups there is now a technical mechanism to deal with the issue that this question typically ends up with many combinations of the elements above as measures, but why would I want this?

    The implementation as a dimension, together with the possibility to design custom member calculations, works very well.
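    To make the point concrete: in Multidimensional, structural elements like these are typically calculated members created on the dimension itself in the MDX script, so they apply uniformly to every measure at once. A minimal sketch, using the hypothetical Process attribute from the list above:

```
-- A calculated member on the Process attribute hierarchy;
-- it automatically works for every measure in the cube
CREATE MEMBER CURRENTCUBE.[Process].[Process].[All].[adjusted] AS
    [Process].[Process].&[entry] + [Process].[Process].&[adjustments];
```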

    5. Data Entry

    Again, an example where even Multidimensional is not very strong, but at least there is a concept for it that allows combining data entry with analysis in the same front end.

    6. Multidimensional at all

    I don’t know how to express this in a short way: I just prefer the multidimensional approach for financial applications. It is not data driven, it’s structure driven. It is normal to design the model dimension by dimension and start with an empty cube that is filled in totally different ways, depending on whether we have actuals, plan, financials, additional KPIs and so on.
    Showing me some data while I’m doing my modelling is a waste of screen area.

    Having said all this, we are actually just starting to use Tabular in addition to Multidimensional.
    Even for the same data, like data from accounting systems:
    – reporting -> Multidimensional
    – analysis -> Tabular

    We will see whether this view changes as we get more experience with Tabular.

    Regards
    Michael

  30. We rely on slowly changing dimensions over big data (150M rows). We’re finding it hard to research and identify the best solution for this in SSAS Tabular, as much as we want to use Tabular.

  31. Hello Chris,

    We have a cube and are building a new cube integrated with Power BI reporting, but at an early stage we are getting the following issue:
    the cube is only 200 MB in size,
    yet it fails to import into Power BI. How can we avoid this situation?

    Regards,
    Sanjeev

  32. The key/name issue is the biggest roadblock. Multidimensional has both a key and a name for a column, and we have reports, Excel workbooks and Power BI files using them. After moving to Tabular, a column can only hold either the key or the name, so all the reports and dashboards that use these columns will be broken. There is no solution to this issue.

  33. I know this was posted a while ago but I haven’t been on your blog in a while.

    So my biggest reasons for not changing to Tabular are:

    1. SCOPE statements – We use these a lot to add extra functionality for end users. Some of this we could get around by using variables in DAX, but that means duplicating code across all measures, and some of it is impossible.

    2. Default members – We have hundreds of fields which use these, and again they improve the design experience and the user experience.

    3. Partitioning – While Tabular does have partitioning, it doesn’t help query performance. We have over 2.2bn rows by about 50 columns in our biggest table; when we tried to put this into Tabular, as soon as 2 or more users started using it, it ground to a halt because the formula engine is single-threaded. In Multidimensional all the data is partitioned to match how users query it, which handles concurrent users with ease.

    4. Cost – The cost of Azure SSAS is too high vs a SQL licence. We renew our licence every 5-7 years; putting the model in Azure would cost the same in 2 years, and that’s even switching it off during out-of-office hours!

    There are other issues but these are my main points. I agree with a lot of the comments that Tabular isn’t the only future and MS should definitely improve Multidimensional as well (improve distinct counts, for example). There is a place for both, I think.
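    To illustrate the SCOPE point in item 1 above: one assignment in an MDX script can rewrite a whole subcube for every measure at once, while the rough DAX equivalent (an IF test on the filter context) has to be repeated inside each affected measure. A minimal sketch with hypothetical dimension and member names:

```
-- One SCOPE assignment covers all measures for this slice of the cube
SCOPE ( [Product].[Category].&[Internal] );
    THIS = NULL;   -- blank everything out for the 'Internal' category
END SCOPE;
```

    In DAX the same null-out would be an IF ( SELECTEDVALUE ( 'Product'[Category] ) = "Internal", BLANK (), … ) wrapper pasted into every measure it should affect; calculation groups reduce, but do not eliminate, that duplication.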

  34. The current direction of Tabular technology seems to focus on Power BI users, not on AAS. The AAS product still requires lots of TLC, but is not getting it.

    It should be a higher priority to build great developer tools for AAS. The developer tooling in Visual Studio is truly terrible, and I’ve worked in VS my whole life, in the context of lots of other technologies including C#/.NET and multidimensional modeling. I dread every moment I have to work with Tabular models in Visual Studio. I could give you a very long list of problems but there is no point because everyone is already aware of them. And I’ve been told that Microsoft has lots of unkept promises related to making improvements in VS for AAS.

    On the Azure side of things, I am shocked by the inefficiency and the high cost of Tabular technology. Microsoft seems committed to squeezing money out of the AAS platform without making it a very good value. Simple features like perspectives require you to upgrade your subscription. And AAS is extremely inefficient where memory is concerned. It is also unable to keep up with some of the newer features that are standard with other types of PaaS offerings, like support for private VNets.

    If Multidimensional cubes were available as PaaS, it would solve a lot of my problems. I have financial cubes with parent-child accounting hierarchies which still need to be migrated to Azure, and it pains me to contemplate modeling them using the AAS techniques (with fixed/flattened levels). These financial cubes also have very sophisticated MDX scripts which will take a tremendous amount of work to rebuild, without introducing any real advantages at the end of it.

    Now that we’ve endured a long transition away from Multidimensional, things are actually starting to look bleak for developers on the Tabular side as well. The managers of this technology are now promoting “Power BI Premium”. This will probably mean that AAS tooling gets even less attention than it did in the past, and the Visual Studio tooling problems that we have today will remain unresolved forever. I think Microsoft made a deliberate decision to abandon their professional BI developers and instead cater to their “power users”. These are people who have less demanding needs, yet this set of customers is more appealing and doing business with them is probably a lot more lucrative.

    When Amir Netz claims that he is listening to customers, it is not to people like me. Perhaps Steve Ballmer needs to make a visit to him and to the other managers of the BI products at Microsoft and yell at them in their offices – “developers developers developers”.

    Since BI developer tooling is not improving much at the moment it might be a good time to take a break from Microsoft BI products, and evaluate some competing options. Perhaps eventually some of the current managers of the Microsoft BI tools will retire and things will then start to change for the better (… how much longer until Amir Netz retires anyway?) People have been asking for multidimensional in Azure for many, many years. How is Microsoft listening to these customers?

  35. Hello Chris/Fellow Experts,

    I have used Multidimensional cubes a lot in the past, and I am about to implement a new analytics database for a new product that is growing. Looking at the shortcomings of Tabular (no parent-child support, no named sets/calculations and so on), I am not sure whether I should implement it in SSAS Tabular. On the other hand, I have had people say that very few architects are implementing analysis databases in MD mode now, after the release of SSAS Tabular. I do have a strong feeling that Tabular might not give the expected performance/output on the following points:
    1. Lack of features available in Multidimensional (like parent-child relationships, named sets, calculations).
    2. Since it mostly runs in-memory, the RAM requirement keeps increasing as concurrent users and data grow.
    3. Not sure whether DirectQuery mode would return acceptable performance if the data in a table grows beyond 50 million rows.
    4. I haven’t heard of any large-scale implementations of SSAS Tabular with a DW sized greater than 2 TB.

    I have done a good amount of research on case studies online, but I would still like to hear opinions from you guys. What do you suggest: should we proceed with MD or SSAS Tabular?

    -Rudra
