SQL Server Best Practices Analyzer CTP

I’ve just downloaded the CTP of the SQL Server Best Practices Analyzer (see Paul Mestemaker’s post here http://blogs.msdn.com/sqlrem/archive/2007/02/21/SQL-2005-BPA-Feb-CTP-released.aspx for more details) and, shock horror, it analyses Analysis Services databases and Integration Services packages too! The advice it gives is quite sensible – things like ‘set up attribute relationships’ and ‘put distinct count measures in their own measure group’ – so it’s definitely worth checking out.

You can download it here:

SP2 is released

You can download it here:
And from the "What’s new in SP2 readme":
For more information about SP2 in the Microsoft OLAP blog by Mosha Pasumansky, see the Additional Resources section on the SQL Server 2005 – Analysis Services page on the Microsoft TechNet Web site.
Mosha, it’s official, you are the new BOL!

Anyway, seriously, as I said on Friday there’s some new stuff in the samples:


The Aggregation Manager is a tool which lets you view, design, edit, and add aggregations based on the information collected in the query log.


The ASTrace utility provides you with the ability to capture an Analysis Services trace and log it into a SQL Server table. The table can then be queried later or read using SQL Server Profiler.


The Analysis Services Upgrade Verification tool lets you compare Multidimensional Expressions (MDX) query results and performance between a Microsoft SQL Server 2000 Analysis Services database and a Microsoft SQL Server 2005 Analysis Services database. This tool helps you verify that your upgrade from SQL Server 2000 to SQL Server 2005 was successful.

… it looks like there are several new AMO samples. But ASUV – I hadn’t heard about this, but it sounds quite interesting. I’ll have to take a closer look.

Flatland: A Romance of Many Dimensions

After a long, hard day struggling with your cube, what better way to relax than with some multidimensional entertainment? Over the years I’ve had lots of conversations with people in the BI industry about the problems people have with learning to think multidimensionally, and something that has cropped up several times is the book "Flatland: A Romance of Many Dimensions", written over a century ago by Edwin Abbott. It’s a book I’ve been meaning to buy for a while, and a quick browse on Amazon reveals that several other people have written similar books; indeed, as Christian Wade pointed out to me last year, there seems to be a whole internet subculture to do with imagining what life with more or fewer dimensions would be like, for example:
Anyway, the reason I’m blogging about this now is that Flatland, with a little bit of updating (as is always the way), has been turned into a film:
Seems to have some good reviews too… I wonder if it’ll ever get a release in the UK? They should organise a gala showing at the Microsoft BI Conference!

SP2 Coming Monday?

I’ve heard the rumours it was almost ready, and now according to Russell Christopher, SP2 could be ready for download on Monday:
But then again, I’ve been told that SP2 was ‘coming soon’ since about October… From an AS point of view the thing to look out for will be the new version of the samples, which will include a tool for designing aggregations manually (discussed in the AS2005 Performance Guide) and also a bit of code showing how to run traces.

Microsoft Dynamics CRM Analytics Foundation

Earlier this week Microsoft released something called ‘Microsoft Dynamics CRM Analytics Foundation’:
The name’s a bit of a mouthful, but basically it’s a shared-source solution (including AS2005 databases) for building BI solutions on top of Microsoft Dynamics CRM. It’s available through Codeplex, and you can find out more about it on the project page:
I guess that there are going to be other similar packs available for other Microsoft products, Dynamics and otherwise, soon. Making it shared source is a nice idea too, although free stuff never quite has the same credibility as something you’ve paid for – not that customers won’t be paying partners to implement and customise all this, I suppose.

Lose those single quotes!

Although they’re no longer necessary in AS2005 MDX and I no longer bother with them, I’ve seen a lot of examples of people still using single quotes in MDX in calculated member and set definitions. Up until recently I thought this was just a matter of taste and that it did no harm to leave them in, but last week I realised that if you do leave them in it makes debugging MDX queries much harder. To illustrate, run the following two MDX queries:

WITH MEMBER MEASURES.TEST AS [BLAH]
SELECT MEASURES.TEST ON 0
FROM [Adventure Works]


WITH MEMBER MEASURES.TEST AS '[BLAH]'
SELECT MEASURES.TEST ON 0
FROM [Adventure Works]

Both return errors. The first gives this error message in the single cell returned:
VALUE #Error Query (1, 30) The dimension ‘[BLAH]’ was not found in the cube when the string, [BLAH], was parsed.

but the second gives this error message in the same place:
VALUE #Error The dimension ‘[BLAH]’ was not found in the cube when the string, [BLAH], was parsed.

So you can see that if you don’t use single quotes and there’s an error somewhere in your calculated member definitions, you get the row and column where the error was found (the ‘Query (1, 30)’ part of the first message); if you do use single quotes you don’t get this useful information. For some of the three or four page queries that I sometimes have to debug this can save a lot of time…

Twelve ‘Next Big Things’ that didn’t happen

On one of my periodic visits to the OLAP Report website I noticed this commentary, which struck such a chord I had to link to it:

I don’t think I’ve seen so many marketing concepts skewered so accurately in one place. Several of the dozen are relevant to current Microsoft product strategy, and while you could argue Nigel & co are unduly cynical I think a little cynicism is necessary to counter the overwhelming amount of hype that the BI software industry generates.

Committing Writeback Transactions in MDX

Today I answered a question on the MSDN Forum about writeback by saying that it was only possible to commit a writeback transaction using code. Turns out I was wrong:
If you execute a BEGIN TRAN statement in MDX before you do your UPDATE CUBE, and then execute a COMMIT TRAN statement, the writeback changes get committed to the cube; alternatively, you can execute a ROLLBACK statement and any changes you made with UPDATE CUBE are lost. This works for AS2K as well as AS2005.
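Here’s a minimal sketch of that statement sequence – the cube name, tuple and value are purely hypothetical, any write-enabled cube will do:

```
-- start a transaction before writing
BEGIN TRAN

-- write a value back to a single cell (hypothetical cube and tuple)
UPDATE CUBE [Budget]
SET ([Account].[Account].&[4130], [Date].[Fiscal Year].&[2007]) = 100000

-- make the change permanent...
COMMIT TRAN
-- ...or execute ROLLBACK instead to throw the changes away
```

Until the COMMIT TRAN runs, other sessions won’t see the written-back values.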

PerformancePoint First Impressions

I was meant to be at the third day of the UK PerformancePoint airlift today, but because of the snow we’ve been having here in the south of England I decided against the trip to Reading and stayed at home to build a snowman instead. However, I thought I’d go ahead and blog about what I learnt on the first two days; Charlie Maitland is also there and has been blogging about it, here and here. It seems like the PerformancePoint hype cycle is getting up to speed – there’s certainly a lot about it in the press, eg at TDWI http://www.tdwi.org/News/display.aspx?ID=8313 – so I might as well add my thoughts…
First of all, what is PerformancePoint anyway? It’s really two separate things: first of all there’s the monitoring and analytics stuff, which is partly the next version of Business Scorecard Manager, partly the server-based dashboard, analytics etc components that came with the Proclarity acquisition; there’s also the budgeting and planning, ‘business performance management’, application which was codenamed Biz#. Initially I was a bit confused about why all this was being bundled into one product: I can see that the people using the bpm functionality will want to use the monitoring and analytics functionality, but the vast majority of the people using the latter will be people upgrading from the old Proclarity components (or moving away from equivalents like Panorama) and won’t have much direct interest in the bpm side of things. However I suspect that this is another case of Microsoft doing some clever bundling. The companies who are getting their dashboards/scorecards for a reasonable price (and the price is very reasonable, $20000 per server plus $195 per CAL) will suddenly find they’ve got this bpm application thrown in and decide to give it a try, in the same way that many companies came to Analysis Services because they got it free with SQL Server.
The course itself was almost all to do with the bpm functionality, with only a few demos of the dashboards and scorecards, so for the rest of this post I’ll just be talking about the bpm functionality. So, then, the key question… is it any good? Do we like it? Of course it’s impossible to say how good or bad a product is going to be based on an early CTP, but in general I was quite impressed. A lot of thought has obviously gone into it and it will definitely be a better option than trying to build your own financial application using Analysis Services. I wish I had some experience with other similar products, especially some of the better-established AS-based competitors like Outlooksoft, to be able to make a comparison, but I don’t; I doubt that most people who evaluate or go on to use PerformancePoint will be judging it against its competitors though, and I’m sure that as a V1.0 it’s missing a lot of the features that they have.
Down sides? At the moment the UI is a bit of a disaster, more VB4 than Office 2007 in terms of look and feel and usability, and a lot of work is going to have to be done in this area before the September release. There’s also, in my opinion, a tendency to use parent/child hierarchies rather too much (though admittedly many hierarchies in financial applications have to be modelled as parent/child hierarchies) and I would be worried about the performance implications of this, although the program managers present at the course did say that the AS dev team had implemented some changes in SP2 which will mitigate this. Kristian Wedberg also made some valid points about the lack of support for integrating existing cubes and dimensions into a new PerformancePoint application, which I fully agree with: at the moment you have to build every dimension and measure group from scratch, but realistically if you already have an AS-based application you’re going to want to reuse dimensions and measures from it, probably using linked objects.
There’s also something of a tension apparent in the design, stemming from the intent that non-technical users should be able to do as much of the work of creating and maintaining a PerformancePoint application as possible. If I recall correctly, this was always what Nigel Pendse liked about Applix TM1 in his reviews in the OLAP Report, so it’s a valid goal, but as always there are some things that are best left to technical people. The parent/child hierarchy issue is a case in point: after I had raised the question I had a good discussion with one of the guys there from Solver, a partner who has been doing a lot of work with PerformancePoint, and he said that to get around the problems associated with large parent/child hierarchies they’d found they needed to break up the dimensions into smaller pieces and only add the relevant pieces to models/cubes. Now this is a perfectly valid answer, and indeed PerformancePoint makes it easy to do just this, but it’s something that a business user should never be expected to know about or understand, despite the fact that dimension design is a task the UI wants to make easy for business users. Similarly, there’s the option to populate your dimensions either manually by typing in member names or by loading data from a csv file, both of which can be done by a non-technical user through the UI, or by pushing the data directly into the staging db using something like SSIS, clearly a technical person’s task and by far the most robust option. My feeling is that the division of tasks between those that can be done by a non-technical user and those that can’t could have been drawn better, and that if, say, the model design tasks had been implemented in Visual Studio à la BIDS for the techies and just the maintenance and day-to-day management had been left for the non-techies in a separate UI, the product would have been stronger.
But I understand that’s a contentious opinion and that this feeling is based on a very limited knowledge of the product and financial applications in general, so I might change my mind…
I don’t want to dwell on the negatives too much anyway, which I know I’m inclined to do, because as I say my overall impression was favourable and we’re still about six months from RTM. Before going on the training I thought that the bpm side of PerformancePoint would be something I might steer clear of in my day-to-day work, but now I feel enthused to go out there and do something with it. There’s a healthy amount of MDX used in its calculations and reporting so that has to represent a business opportunity for me!

Microcharts 1.1

I blogged about Microcharts last year, but I’ve just heard that version 1.1 has been released and one of the new features is the ability to use them in server-based reporting tools like Reporting Services, as well as in Excel. Take a look at the following two videos which showcase the products:
There’s also a good article on building a dashboard on Adventure Works in Excel 2007 that has some good examples of how to use the new AS cube spreadsheet functions like CubeMember:
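To give a flavour of how these functions fit together (the connection name "Adventure Works" and the member unique names here are just illustrative – yours will depend on your workbook connection and cube), a typical CubeMember/CubeValue pair looks something like this:

```
A1: =CUBEMEMBER("Adventure Works","[Date].[Calendar Year].&[2004]")
A2: =CUBEVALUE("Adventure Works",A1,"[Measures].[Internet Sales Amount]")
```

The CubeMember call retrieves a member from the cube into a cell, and CubeValue can then reference that cell alongside other member expressions to return a single cell value – which is what makes free-form dashboard layouts like the one in the article possible.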
I see Nick Barclay has already worked out how to use them in PerformancePoint scorecards:
Before we get all excited about Excel 2007 as a client tool, I thought I’d also point out this recent post from Marco Russo highlighting a massive drawback with using it on cubes that have time utility dimensions:
What an absolute howler, especially given that I remember Excel 2000 had the same problem and it got fixed in a later release (either XP or 2003) after a lot of complaining. Top priority for SP1 please! 