Intelligencia OLAP Controls

Andrew Wiles announced yesterday the availability of a beta version of his new MDX query builder component, ‘Intelligencia OLAP Controls’:
I had a quick demo of it this morning and I was very impressed. It has some really quite clever ideas in it such as the spreadsheet-based calculation functionality, and while it doesn’t do absolutely everything I’d like (it is still a beta, after all) it does an awful lot and Andrew is very open to feedback for what needs to be added. If you’re interested in checking it out you can download it here:
The product is aimed at ISVs, in-house developers and consultants who want to incorporate MDX querying functionality in their own products. I’ll be blunt: the company that really should be looking at this, and perhaps licensing it for use in Katmai and/or future versions of Office, is Microsoft. The control’s Office 2007 look-and-feel gives a tantalising glimpse of what a power user would want to see in Excel when connecting to AS and it puts the Reporting Services MDX query builder to shame.

Want to work for Microsoft?

The AS user education team are having a hard time trying to recruit someone who knows MDX. Is this something you, dear reader, would be interested in? Here’s the job description I got from Neil Orint:

Do you have a background in Analysis Services and MDX? Are you looking to put your mark on the next version of SQL Server content deliverables? Want to work on a writing team where your development team counterparts are as passionate about your content as you are? If so, the Analysis Services User Education team is looking for an experienced technical writer to assist us in delivering top-notch customer facing technical documentation.

The successful candidate will have a strong background in technical writing and a working knowledge of MDX and OLAP; possess solid project management and planning skills; and have a passion for learning new technologies. Strong communication skills are a must. As a member of our team you can expect opportunities to:
· Help define and execute upon content strategies and priorities
· Listen to, analyze and respond to customer feedback
· Learn the entire spectrum of Microsoft’s business intelligence offering – Integration Services, Analysis Services, Reporting Services, Office and more.
We’re also looking for:
· A practical understanding of MDX and OLAP
· A history of developing assistance content for end users in a variety of delivery formats
· The ability to learn new tools, technologies, and processes quickly and independently

Please contact:

Please email Neil on the address given if you want to discuss it further.

Batch Reporting With SSIS and OfficeWriter

I’m a long-standing fan of OfficeWriter, the tool that’s just been licensed by Microsoft for possible inclusion in Katmai Reporting Services, and recently I was engaged by Softartisans to write a few articles for their website (yes, that means I was paid). Here’s the first of them, on how to create a batch reporting solution using their components and Integration Services:
Seems a bit of a weird thing to want to do when you can do the same thing in Reporting Services, but as I say in the article there are some advantages to using Integration Services for this task. I’m going to write another article next week on using OfficeWriter with Excel 2007 and Analysis Services, which I’ve got some fun ideas for…


Dataupia

Following on from the Teradata post, I was just wondering how well the newly announced Dataupia would work with Analysis Services if you were using Dataupia underneath SQL Server and using AS in ROLAP or HOLAP mode. Their website is rubbish at explaining what the product is – and I can’t even pronounce the name – but see this article for background:
Could make for an interesting story for scalability if it does what it says on the box.

Teradata/Analysis Services White Paper

(Via BI/BPM – The SeeQuel) Here’s the first fruit of the partnership between Microsoft and Teradata that was announced earlier this year: a paper describing how to use Analysis Services in ROLAP mode with Teradata:
I’ve heard of people trying to do this for years, usually experiencing a lot of pain along the way, and by all accounts the situation still isn’t ideal but it sounds like it’s getting there.

AS2005 MDX Course Now Available

<Shameless Advertising Plug>
As I’ve mentioned before, I do all of my training activities through Solid Quality Mentors (also known as Solid Quality Learning) and earlier this year they persuaded me to write an MDX course for them. You can see the outline on their website here:

If you’re interested in booking this as a private course please contact Solid Quality through the address on the site. If you’re in the UK you’d definitely get me teaching it and I’d probably cover certain other European countries too, but the good thing about Solid Quality is that they have a network of top-notch BI people not only in the US but also in Central America and continental Europe, many of whom will be teaching this class as well.
I think there’s a real need for in-depth MDX training out there – even people who know Analysis Services really well sometimes struggle with it, and you can only get so far without understanding the fundamentals. With PerformancePoint on the way the market is only going to grow and I’ve deliberately kept the first day or so as platform-agnostic as possible with a view to adapting the material for SAP BW, Essbase, TM1 and Mondrian MDX at some point in the future.
One last thing to mention: Solid Quality have also got a lot of other Microsoft BI courses if you’re not just interested in MDX. Take a look:
They’ve even got an Analysis Services data mining course written by Dejan Sarka:

UPDATE: I no longer work with Solid Quality, so if you want MDX or Analysis Services training then come straight to me! You can find out more at
</Shameless Advertising Plug>

Live on stage at the BI Conference

OK, last BI Conference-related post… you’ve heard about the technical stuff, what else did I get up to while I was in Seattle? Erm…

That’s Christian Wade on guitar and me doing the whole Phil Collins singing drummer thing. Thank goodness the DVD hasn’t made it to YouTube yet… If you’re ever in Seattle I can definitely recommend a visit to the Experience Music Project!

Using Linear Regression to Calculate Growth

A few blog entries back I showed the MDX I used to calculate a seasonally-adjusted total in my chalk talk at the BI conference. This is useful but if we’re looking for a calculation that we can use for the Trend property of a KPI it’s not the whole story – we still need to find a way of expressing how much a value is growing or shrinking over time. Although previous period growth calculations are a lot more useful with seasonally-adjusted values, we can use simple linear regression (and it has to be simple because, as I said, I’m no statistician) to do a better job.

The starting point for understanding how to use linear regression in MDX is (surprise, surprise) Mosha’s blog entry on the subject:

However, the function that’s going to be most useful here is the linregslope function. If we’re looking at the values in our time series and trying to find a line of best fit for those values with the equation y=ax+b, linregslope returns the value of a in that equation, ie the gradient – when the value of x increases by 1, y increases by the value of a. Here’s an example of how to use it:

with member measures.gradient as
linregslope(
    lastperiods(3, [Date].[Calendar].currentmember) as last3
    , [Measures].[Internet Sales Amount]
    , rank([Date].[Calendar].currentmember, last3)
)
select {[Measures].[Internet Sales Amount], measures.gradient} on 0,
[Date].[Calendar].[Month].members on 1
from [Adventure Works]

The trick with using this function in MDX with a time series is to be able to work out what values you want to pass in for the x axis. Here I’ve used the lastperiods function to get a set containing the current member on the Calendar hierarchy, the previous member on the Calendar hierarchy and the member before that, in the first parameter of the function; at the same time I’ve declared a named set and then used that with a rank function in the third parameter to return the values 1, 2 and 3 for each of these three members.

This gets us the slope, then, but I was thinking it would be better to express this value as a percentage – but of what? The current period’s value? Or one of the preceding two periods’ values? I have to admit I don’t know which would be correct. Can someone help me out here? Please leave a comment!
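For what it’s worth, here’s a sketch of one option: expressing the slope as a percentage of the current period’s value. To be clear, the choice of denominator here is just an assumption on my part, not a statistically justified answer to the question above:

```
with member measures.gradient as
linregslope(
    lastperiods(3, [Date].[Calendar].currentmember) as last3
    , [Measures].[Internet Sales Amount]
    , rank([Date].[Calendar].currentmember, last3)
)
// Hypothetical: slope expressed relative to the current period's value
member measures.[gradient pct] as
    measures.gradient / [Measures].[Internet Sales Amount]
    , format_string = 'Percent'
select {measures.gradient, measures.[gradient pct]} on 0,
[Date].[Calendar].[Month].members on 1
from [Adventure Works]
```

Dividing by the first member of the window (the rank 1 member) instead would make it read more like a baseline growth rate; either way it’s only a presentation choice layered on top of the same slope.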

Modelling Goals and Thresholds in Measure Groups

Before I carry on with my chalk talk series, I have to own up to something: I didn’t actually want to present on the topic of KPIs, and when I found out that I was going to have to talk on the subject I fired off a few emails to people who spend more time with KPIs than I do to ask them if they could suggest some interesting things to talk about. One of these people was Nick Barclay, co-author of ‘The Rational Guide to Business Scorecard Manager 2005’ (which I shall be reviewing very soon – it’s a good book), and he pointed out that, while all the examples of KPIs he’d seen hard-coded goals and thresholds into the MDX code, this was not a good thing – users want to change these values all the time and ideally you’d want to let them do this themselves. Why not store these values in a measure group, allow users to change the values using writeback, and then use these values within the KPI definition somehow?

Actually modelling how the values should be stored in measure groups was very straightforward. In my demo I showed two fact tables, one for the Goals and one for the Thresholds, with one measure each. I also created a KPI dimension for both of them to allow multiple goals and thresholds for different KPIs to be stored in the same measure group; for the Goal measure group I added the Date dimension at the granularity of Calendar Year (so there was a column in the fact table containing year names) and for the Threshold fact table I also created a Threshold dimension. This Threshold dimension contained one member for each threshold to be used: Very Bad, Quite Bad, OK, Quite Good and Very Good; there was also a numeric column containing the values -1, -0.5, 0, 0.5 and 1 which represented the numeric values each threshold gets normalised to and which I assigned to the ValueColumn property of my sole attribute when I built the dimension.

Once this was done and the measure groups were added to the Adventure Works cube, I showed some ways to allocate the Goal values down from the Year granularity at which they were stored. Here’s the scoped assignment for the simple allocation, which splits the values equally across the number of time periods in the year, so for example each month shows 1/12 of the year total:

SCOPE([Date].[Calendar Semester].[Calendar Semester].MEMBERS, [Date].[Date].MEMBERS);
    [Measures].[Goal] = [Measures].[Goal] /
        COUNT(
            DESCENDANTS(
                ANCESTOR([Date].[Calendar].CURRENTMEMBER,[Date].[Calendar].[Calendar Year]),
                [Date].[Calendar].CURRENTMEMBER.LEVEL));
END SCOPE;

Here’s the code for doing the weighted allocation by the previous year’s Internet Sales Amount measure values:

SCOPE([Date].[Calendar Semester].[Calendar Semester].MEMBERS, [Date].[Date].MEMBERS);
    [Measures].[Goal] = [Measures].[Goal] *
        (PARALLELPERIOD([Date].[Calendar].[Calendar Year],1,[Date].[Calendar].CURRENTMEMBER), [Measures].[Internet Sales Amount])
        /
        (ANCESTOR([Date].[Calendar].CURRENTMEMBER,[Date].[Calendar].[Calendar Year]).PREVMEMBER, [Measures].[Internet Sales Amount]);
END SCOPE;

A few things to note here:

  • In both cases, because I’ve set IgnoreUnrelatedDimensions to false on the measure group, to get the year’s Goal measure value I can simply reference [Measures].[Goal] – the values for the year are copied down automatically to the attributes below the granularity attribute.  
  • Although normally when you assign a value to a regular measure with an additive aggregation function the assigned value gets aggregated up, when you assign to a regular measure below the granularity attribute of a dimension no aggregation happens, similar to what you get with a calculated measure.
  • The assignment SCOPE([Date].[Calendar Semester].[Calendar Semester].MEMBERS, [Date].[Date].MEMBERS) means ‘scope on everything from the Date attribute up to and including the Calendar Semester attribute, but no higher’. I’ve included the whole Date attribute here – the All Member as well as the leaf level – because everything on a dimension exists with either the All Member or the leaf members of the key attribute.
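As a quick sanity check on the equal-split allocation, a query along these lines (assuming you’ve built the Goal measure group as described above – the year key &[2003] is just an example) should show each month of a year getting 1/12 of that year’s Goal value:

```
select {[Measures].[Goal]} on 0,
descendants(
    [Date].[Calendar].[Calendar Year].&[2003],
    [Date].[Calendar].[Month]) on 1
from [Adventure Works]
```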

Moving onto the thresholds, we need to find a way to apply the threshold values we’ve got in our measure group to the measure we’re interested in. Here’s a calculated member definition that does this:

CREATE MEMBER CurrentCube.[Measures].[Internet Sales To Goal Status] AS
TAIL(
    HEAD([Threshold].[Threshold].[Threshold].MEMBERS, 1)
    +
    FILTER(
        EXCEPT(
            [Threshold].[Threshold].[Threshold].MEMBERS,
            HEAD([Threshold].[Threshold].[Threshold].MEMBERS, 1)),
        ([Measures].[Threshold],[KPI].[KPI].&[1])
        <
        ([Measures].[Internet Sales Amount]/[Measures].[Goal])),
    1).ITEM(0).MEMBERVALUE;

What I’m doing here is creating a set that always contains the first member of the Threshold dimension, the ‘Very Bad’ member, and then filtering the set that contains every other threshold to return the members for whom the threshold measure is less than the value of Internet Sales Amount divided by Goal. I then get the last member in that set, which represents the highest threshold below that value, and use the MemberValue function to get the normalised value (the value between -1 and 1) that I assigned to that member.
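To see the calculation in action, you could browse the status measure alongside the raw values with a query something like this (assuming the Goal and Threshold measure groups and the calculated member above are all in place in your version of Adventure Works) – the status column should only ever return one of the normalised threshold values between -1 and 1:

```
select {[Measures].[Internet Sales Amount],
        [Measures].[Goal],
        [Measures].[Internet Sales To Goal Status]} on 0,
[Date].[Calendar].[Month].members on 1
from [Adventure Works]
```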
