AS2005 Calculation Engine Performance

I’ve been talking about measure expressions (and Mosha has been plugging the fact you can do the same thing in MDX Scripts) for a while now, but up until now I’ve not got round to running any tests which compare their relative performance. However, I’ve been doing a PoC over the last week or so which has given me just this opportunity and I thought I’d share my results…
The scenario I had to implement was as follows: I had a sales fact table which showed the number of Units for a particular SKU of a Product sold per month (plus various other dimensions), and I had a prices fact table which showed the Price of each SKU per Month, and I wanted to be able to analyse sales values. Fairly straightforward so far: I just created a view which joined the two fact tables on SKU and Month and then multiplied Units by Price to get the Value. However, I also had a requirement for the user to be able to pick any given Month and then look at Values for a range of Months calculated as if the Price for each SKU was frozen at the Month they had picked – this would allow them to remove the effect of price changes from any changes in sales Value. You can probably see that you might want to do something similar when using exchange rates.
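The view itself might look something like this – a minimal sketch, assuming hypothetical table and column names since the original schema isn't given:

```sql
-- Hypothetical schema: FactSales (SKU, MonthKey, Units, ...) and
-- FactPrices (SKU, MonthKey, Price). Names are assumptions for
-- illustration, not taken from the original post.
CREATE VIEW SalesValue AS
SELECT
    s.SKU,
    s.MonthKey,
    s.Units,
    p.Price,
    s.Units * p.Price AS Value  -- sales value at the actual monthly price
FROM FactSales AS s
INNER JOIN FactPrices AS p
    ON p.SKU = s.SKU
   AND p.MonthKey = s.MonthKey;
```

An inner join is fine here as long as every SKU/Month combination in the sales fact table has a corresponding price row; if not, a left join would be needed to avoid losing sales rows.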
Anyway, I originally tried to use measure expressions to implement this ‘fixed price’ requirement but after a lot of thought (and discussion with Jon) came to the conclusion that it wasn’t actually possible. I had one physical Time dimension but this was present in the cube twice as two role-playing dimensions, one linked to the Units measure group and one to the Price measure group, and the fact that these two dimensions weren’t common to both measure groups made the numbers aggregate up incorrectly. Even when I tried linking the Units Time dimension to the Prices measure group and the Prices Time dimension to the Units measure group using many-to-many dimensions, which I thought should work in theory, I still didn’t get the correct numbers. It obviously needs a regular relationship in order to work properly. So I went ahead and used an MDX Script calculation instead and was pleasantly surprised with the results.
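For what it's worth, the MDX Script version of the fixed-price calculation might be sketched along these lines – the measure names, hierarchy names and the hard-coded month are all assumptions for illustration, not the exact code from the PoC:

```mdx
// Hypothetical sketch: [Measures].[FIXED PRICE VALUE] starts life as a
// copy of Units, and is overwritten at the leaves with Units multiplied
// by the Price frozen at a chosen month (here June 2005).
SCOPE([Measures].[FIXED PRICE VALUE], LEAVES([Product]), LEAVES([Time]));
    THIS = [Measures].[FIXED PRICE VALUE] *
           ([Measures].[PRICE], [Time].[Month].&[200506]);
END SCOPE;
```

In a real implementation the frozen month would probably be picked up dynamically, for example from a utility dimension, rather than hard-coded as it is here.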
In order to compare the performance of ordinary measures, measures which have measure expressions and measures whose values are calculated in the MDX Script, I created a test cube based on a subset of my data. I took six months of sales data, which resulted in a fact table of approx 7 million rows, the whole of my prices fact table (approx 6 million rows) and the whole of my Product dimension (170,000 SKUs), and built a cube with no aggregations or partitions and four measures: Units; Value, calculated within the fact table and stored as a regular measure; Value calculated using a measure expression of the form [Measures].[Units]*[Measures].[Price]; and a regular measure exactly the same as Units but whose value was overwritten in the MDX Script with a calculation (based on Mosha’s recommendations here) similar to:

([Measures].[VALUE - SCRIPT], LEAVES([Product]), LEAVES([Time])) =
    [Measures].[VALUE - SCRIPT] * [Measures].[PRICE];

I checked that all three Value measures returned the same result and then ran a very big query (26,000 rows and 4 columns) to see how they compared, stopping and starting the Analysis Services service before each query to try to reduce the effect of caching. On my fairly ordinary 2-way Windows 2003 box running the June CTP this mega-query ran for 13 seconds for both the ordinary Value measure and for the measure expression version, which I thought was pretty amazing. The MDX Script version ran for 18 seconds, which I thought was still very good. Subsequent runs of each query on a warm cache all ran in 9 seconds. Obviously the next step would be to install AS2K on the same box and see how performance was on the same cube when using a calculated member to do this calculation, but unfortunately I haven’t got the time to do this and in any case I can’t imagine it would perform anywhere near as well.
So, what can we take away from this? It looks like the claims for big performance improvements in the AS2005 calculation engine might be justified. Where you can use measure expressions – and in most classic currency exchange rate scenarios you will be able to use them, even though I wasn’t able to – they seem to perform almost as well as if the value was precalculated in the fact table. Of course I didn’t test what impact writeback has on query performance, and this is the other essential factor in most financial applications, but in AS2K you always had to design your cube to account for the possibility that values might have been written back to the cube even if they hadn’t been, and even then performance could be bad. And even if you can’t use measure expressions, it looks like a pure MDX Script approach can still perform really well.

Office 12 BI

Marco Russo, lucky guy, was at the PDC and saw the session on Office 12 BI. I’ve had a Google and can’t find anyone else with more details – if any of you did see this presentation, leave a comment and tell me what you thought!
It sounds like the Office guys have finally turned Excel into a usable BI platform, and this will be an additional squeeze on all those small Excel addin vendors out there. The integration with the new Excel server bits that Marco hints at would be the real killer – I have often had BI users wanting to take their worksheet full of OLAP data and publish it to the web as a live report.

PASS 2006 European Conference

I got an email today advertising the PASS 2006 European Conference in Barcelona next February. I went to (and spoke at) the last one in Munich and found it very useful – certainly the level of technical detail was way ahead of your average conference, especially as far as Analysis Services went, and I’d recommend it. I’m going to submit an abstract for a presentation too; I guess it will be on something like ‘Introduction to MDX Scripting’.


I was having a look (for no good reason) at the Free Patents Online site and, while I was there, I remembered the photo on the OLAP Report website (visible if you have a subscription) of the big pile of cube knick-knacks, each one representing a patent, that is to be found in Amir Netz’s office. So I thought I’d have a look and see just what the content of these patents was and whether they were interesting. Who better to start with than our man Mosha? Here are the results a search on his surname brings up. Hmm, if you can wade through the reams of legalese there are some insights into how AS2005 might work (assuming that it does work according to the patent) but a lot of it is statin’ the bleedin’ obvious.
I can’t mention OLAP patents without the infamous Timeline – read this thread for an idea of what they’re about. Memo to self: remember to patent the wheel at some point over the weekend.

Scorecard and Office BI webcasts

There’s a whole bunch of new webcasts available which look like they were recorded at a partner airlift for Office Business Scorecard Manager. They are:
I’ve only watched the last one (hey, I have work to do you know!) but it’s very interesting – not just from a Scorecards point of view, but as the title suggests it’s more about the general direction of Microsoft’s Office BI strategy. The speakers spend a lot of time talking about how BI will be one of the big themes of Office 12 (Excel and SharePoint) although they don’t go into too much detail – it sounds like the functionality that was in the Excel addin and maybe some of what was in Data Analyzer (RIP) will become native to Excel 12. To be honest the fact that there is a stated Office BI strategy is heartening news to those of us who have watched this area over the last few years. It does seem like there will be some half-decent functionality in Excel at last, and maybe by the time Office 13 comes along a good – if basic – 100% Microsoft BI platform will be in place.
Even more webcasts are now available: