Proclarity Migration Roadmap (or lack thereof)

For those of you who commented on my recent post asking what the future held for existing Proclarity users, some interesting news. My fellow SQL BI MVP Thomas Ivarsson asked whether there were any plans for helping Proclarity users migrate to PerformancePoint and got this reply from Alyson Powell Erwin:

http://social.technet.microsoft.com/Forums/en-US/ppsmonitoringandanalytics/thread/b4e9bd35-62ce-4ca5-bd1f-05133b30bcc9

Here’s the text:

There will not yet be a migration from ProClarity 6.3 to PerformancePoint Services for SharePoint 2010.  Customers can continue to use ProClarity throughout its current supported lifecycle date of July 2012 for mainstream and July 2017 for extended.  We are still working on the roadmap for ProClarity but it is likely that you will not see a migration path until the O15 timeframe. 

So, in effect, three and a half years after Microsoft first announced they were buying Proclarity, they still have no roadmap for migrating existing Proclarity customers onto a new platform. I’m sorry, but this is just not good enough; I don’t think they could have come up with a strategy more damaging to Microsoft BI if they had called up Larry Ellison and asked him to contribute some ideas. Development on Proclarity finished almost three years ago, and they’re saying that there probably won’t be a migration story until Office 15 – which is likely to be three or four years in the future! That’s effectively telling some of the most serious, committed Microsoft BI customers to bin their existing solutions and start again from scratch, and I can’t tell you how angry that makes me feel. It seems to me that Microsoft don’t have a BI strategy any more; they have a sell-more-Office (and especially MOSS) strategy. That’s fair enough, Microsoft have to make money somehow, but there’s no point expecting SQL Server BI to drive sales of Office in the future if they’re busily driving away the existing customer and partner base. It’s a classic case of killing the goose that laid the golden egg.

Here’s what Microsoft should do:

  • Round up whatever members of the Proclarity dev team are still at Microsoft and get them to work on a new stopgap release of Proclarity. It doesn’t need to add much new functionality, but it does need to update the UI and make it look a bit less like a VB6 app circa 1998.
  • Either stop pretending that Excel will meet the needs of power users and let the Proclarity fat client live on for a few years longer, or add functionality to Excel that will bring it up to the required standard. Richard Lees has just published a good list of what needs to be done here (I can think of a few more items myself, such as support for ragged hierarchies that use HideMemberIf), and while some of these issues are addressed in Excel 2010, not all of them are. Excel 2010 is really just bringing Excel up to the levels of functionality that most third-party SSAS clients had in 2005. And again, we shouldn’t have to wait until Office 15 for this.
  • Publish – and commit to – a clear roadmap showing how existing Proclarity customers can be migrated to the new Office BI platform. At the moment most Proclarity customers feel completely abandoned and have no idea what to do (as the comments in my recent blog post demonstrate).

In the meantime, if I were one of the remaining third-party SSAS tool vendors I would be wondering whether it was possible to create a wizard that would migrate existing Proclarity briefing books onto my own platform. I would imagine it might generate a few leads…

Farewell to the Excel 2003 addin and the BI Accelerator

Reading the SQL Server technical rollup mail I get sent as an MVP (the same information’s also available at http://blogs.technet.com/trm/archive/2009/10/01/october-2009-technology-rollup-mail-sql-server.aspx) I noticed that two old products have just been retired: the Excel 2003 Analysis Services addin, and the BI Accelerator. A little more information on this is available on the download pages here:
http://www.microsoft.com/downloads/details.aspx?displaylang=en&familyid=dae82128-9f21-475d-88a4-4b6e6c069ff0
http://www.microsoft.com/downloads/details.aspx?displaylang=en&familyid=a370fbc9-98b1-4f5c-b09b-6f1bf08e9292

I quote from the Excel addin page:
“The Excel Add-in for SQL Server Analysis Services has been removed to avoid customer confusion about support for this component. As noted in the details that accompanied the release of this product, Microsoft does not provide any support for this add-in and has no plans to release future versions. Newer versions of Excel include most of the functionality that is provided by this add-in; these newer versions are supported according to the Microsoft Product Lifecycle.”

To be honest I’ve not even looked at either of these products for years, but at least in the case of the Excel addin I wonder how many people are still using it. If you have no choice but to use Excel 2003 (and I’m sure that’s still true for a fair proportion of Excel users) then it was an invaluable upgrade over Excel 2003’s built-in SSAS support. More to the point, the BI Survey 8 (which collected data in mid 2008) had 21.8% of Analysis Services users claiming to use it – more than double the number using Panorama Novaview, and only 5% less than the number using Proclarity. At first that seemed an improbably high number to me, but on reflection I think it could be more or less accurate: as BI consultants and developers we tend only to see ‘new’ BI projects, but what about all those projects we delivered 4+ years ago and haven’t seen since? They’re chugging along happily, ‘just working’ with no obvious need to upgrade, and their users are exactly the people who are likely to be using the Excel addin. They won’t stop using it because of this announcement, but it might start them thinking about what they should upgrade to – probably Excel 2007, but maybe something else.

And Proclarity users are in the same situation: they have an ageing tool that is no longer being developed, and they need to think about upgrading to something. But what? At least with the Excel addin there’s Excel 2007, but in the case of Proclarity there’s no obvious answer – it’s not just that PerformancePoint/Excel Services/SSRS don’t have the same functionality; if you’ve got several hundred briefing books your users aren’t going to be happy about rebuilding them in some new tool. I don’t want to go off on yet another rant about Microsoft’s idiotic client tool strategy, but I’m worried that we’ll start to see a series of migrations away from the Microsoft BI platform as a result of this issue.

DataWarehouse Explorer

Continuing my occasional series of SSAS client tool reviews, here’s another contender in the post-Proclarity power-user market: DataWarehouse Explorer, from Dutch company CNS International.

DWE is a standalone, ‘rich client’ application that gives you a lot more functionality than you get in Excel pivot tables and as such is competing in the same market that Proclarity Desktop Professional used to dominate and which is still pretty crowded. There’s also a web-based portal that you can publish reports to (see here for full details on the architecture) but if you want to build queries you need to do it on your desktop.

So what’s it like? I liked it: it doesn’t have any flashy features that particularly mark it out, but it does everything it needs to do and does it well. Probably the best thing about it is the UI – a nice Office 2007 look-and-feel and, most importantly, very clear and easy to use. As someone who has spent plenty of time working with Analysis Services over the last ten years or so, when I start using a new client tool I expect to be able to do what I want to do very easily: I know all the basic concepts of cubes, I know the Adventure Works cube, and I know the queries I want to run, so if I can’t work out how to do something then I lay the blame on the UI design. And if I can’t do something, there’s no point expecting an end user to be able to do it. In the case of DWE I had no problems at all, and in many respects it’s much easier to use than something like Proclarity or Excel. Here’s a screenshot:

[Screenshot: DWE]

The filter dialog provides a good example of how they’ve got the UI right. Filtering is something that every worthwhile client tool needs to do, but it’s easy to make it confusing for the user especially when you’re applying multiple conditions. The DWE filter dialog is uncluttered, shows all the filters you’ve already set up, makes it easy to add new ones or delete existing ones, and has a number of nice touches like the way it automatically formats any numeric conditions you enter to match the format string of the measure you’re filtering on.

[Screenshot: DWE filter dialog]
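
To make that concrete, a numeric filter like the one in the dialog above typically boils down to a FILTER over a set when it reaches MDX. Here’s a minimal sketch against the Adventure Works sample cube (my own illustration of the idea, not the query DWE itself generates; the hierarchy and the 50000 threshold are just examples):

    SELECT
      { [Measures].[Internet Sales Amount] } ON COLUMNS,
      -- keep only the Subcategories whose sales exceed the threshold
      FILTER(
        [Product].[Subcategory].[Subcategory].MEMBERS,
        [Measures].[Internet Sales Amount] > 50000
      ) ON ROWS
    FROM [Adventure Works]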

Other features worth mentioning include:

  • It mimics Excel 2007’s in-cell data bars and conditional formatting very closely. I like those features in Excel and things like this make DWE very easy to pick up for Excel users.
  • There’s a ‘Notes’ pane where you can add text commenting on the query you’ve built.
  • In the slicer pane, you can search for hierarchies by name – useful when you’ve got a lot of hierarchies and dimensions:
    [screenshot]
  • Similarly, the slicer pane can organise the hierarchies according to which ones you’ve explicitly selected something on, which ones have an implicit selection (for example because there’s no All Member or because a specific Default Member has been set), and which ones have no selection:
    [screenshot]
  • There’s a ‘Cube Dictionary’ feature that allows you to look at the metadata of objects on the server, for example to check the aggregation method that a measure uses:
    [screenshot]
  • The UI can be switched between English, Dutch, Portuguese and Spanish.
  • You can hide the more difficult functionality by setting the ‘User Level’ option to ‘Basic’ or ‘Intermediate’ rather than the default of ‘Advanced’. Fewer buttons and options improve ease-of-use for new or less experienced users.

Overall, then, a good product and one worth evaluating if you’re looking for a desktop-based SSAS client tool.

Intelligencia Desktop Client

DISCLAIMER: since I licensed my SSRS custom data extension for SSAS to iT-Workplace, and since this technology is used in Intelligencia Desktop Client, I benefit financially from sales of this tool!

If you’re a regular reader of this blog you’ll know that about a year ago I came up with an idea for a custom data extension for SSRS that makes it much easier to work with SSAS data sources, and that this subsequently became part of the Intelligencia Query product (which I blogged about here and which has since gone through several releases). iT-Workplace, the company that sells Intelligencia Query, also produces a .NET MDX query-generation component suite called Intelligencia OLAP Controls (used in Intelligencia Query itself), aimed at third parties who want to add MDX query capabilities to their own apps. Midway through last year I suggested to Andrew Wiles of iT-Workplace that he wrap these components in an exe and create his own standalone desktop client tool; this became Intelligencia Desktop Client (IDC from here on), which I thought I’d review here in my continuing series on SSAS client tools.

IDC is distinctive because it deliberately doesn’t compete with most other Analysis Services ad hoc query tools – it’s aimed very much at the planning and budgeting market. At present the only version available is the Standard Edition, which gives you query building and reporting functionality; at first glance it does much the same as other advanced ad hoc query tools like Proclarity, but it has a lot of functionality that matters to financial users and that many such tools lack, such as the ability to construct complex asymmetric sets on axes. In fact it’s as much about creating forms for budget data entry via writeback as it is about querying and reporting; the closest comparison is with the PerformancePoint Excel addin, although IDC is aimed at people who have built their own financial applications from scratch in Analysis Services rather than used PerformancePoint. The Enterprise Edition, which is still a CTP, will I believe offer yet more data entry and modelling functionality – I think Andrew wants to move towards incorporating cube-building capabilities too.
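
To give a flavour of what ‘complex asymmetric sets’ means in practice, here’s a rough MDX sketch against the Adventure Works sample cube (my own illustration, not anything IDC generates): a column set where one measure is shown for two years but a second measure for only one of them – the kind of shape a plain crossjoin can’t express.

    SELECT
      UNION(
        -- Internet Sales Amount for both years...
        { [Date].[Calendar Year].&[2003], [Date].[Calendar Year].&[2004] }
          * { [Measures].[Internet Sales Amount] },
        -- ...but Internet Order Quantity for 2004 only
        { [Date].[Calendar Year].&[2004] }
          * { [Measures].[Internet Order Quantity] }
      ) ON COLUMNS,
      [Product].[Category].[Category].MEMBERS ON ROWS
    FROM [Adventure Works]
    -- (member keys may vary slightly between versions of the sample database)

Building and maintaining selections like this through a UI, rather than by hand, is exactly what a financial reporting tool needs to make easy.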

Some features to note:

  • Creating query-based calculations is very easy, and it has an innovative spreadsheet-formula-like approach to doing so that financial users will feel very at home with:
    [Screenshot: IDC calculation formulas]
    Unfortunately you can’t yet copy a calculation from a single cell to a whole range, but I’ve asked for that for a future release…
  • It has a lot of options for formatting the resulting table for printing or inclusion in a document:
    [screenshot]
    This ties in with its more mature sister product Intelligencia for Office 2007 which takes the form of Word and Excel addins, and is aimed at producing print-quality documents which incorporate live links to OLAP data.
  • This formatting functionality is also useful because IDC can publish queries to Reporting Services:
    [screenshot]
    Depending on what your requirements are this could be a very easy way of generating SSRS reports based on SSAS data. I wouldn’t go as far as to say that it makes IDC a proper SSRS report design tool since it doesn’t support the creation of any of the more advanced SSRS features; in fact IDC doesn’t have any charting capabilities (although I know this might be in the pipeline) so you can’t create reports with charts.
  • It has an ‘MDX Mode’ where you can turn off the navigation pane and enter whatever MDX you want, with the query results being displayed in the grid; very useful for those times when you have to write the MDX for a query yourself. It even has Intellisense!
    [screenshot]

SiSense Prism

A few months ago I announced I was going to do a major series of client tool reviews here… well, that fell rather flat (probably because it takes a bit too much effort to install and test each one), but at least here’s one more review: Prism, from SiSense. Here’s their website:
http://www.sisense.com/

Strictly speaking it’s not just an Analysis Services client tool, because it can work with data from a number of different sources such as relational tables, Excel and even Google spreadsheets and Amazon S3. I don’t know too much about the internal architecture, but it seems to be based on storing the data retrieved from all these different sources in some kind of in-memory store, so I suppose in that way it’s similar to what will be coming in Excel with Gemini. They do take Analysis Services seriously as a data source, though – in fact one of the guys behind the company is Elad Israeli, who was behind a tool called MDXBuilder that those of you with very long memories might recall – so for the rest of this review I’ll concentrate on the AS client tool side of things.

First impressions are very good: the UI is modern, uncluttered and easy to use. There are a few wrinkles, though: there’s no explicit support for AS2008 yet, and I had to jump through a few hoops to get it to connect on my laptop, which only has AS2008 installed; also, they don’t show hierarchies grouped into dimensions, just a flat list of hierarchies from all dimensions, which is a pain when you have a lot of dimensions and hierarchies – they really should support folders etc. Since I’ve already mentioned this to SiSense, hopefully it will be changed soon.

The tool itself is focused on creating dashboards: the starting point is a blank sheet onto which you drag ‘widgets’, which in turn can be hooked up to various data sources to display data. Examples of widgets are pivot controls, various types of charts, images and textboxes, gauges, calendar controls, dropdown boxes and so on; in this respect it’s reminiscent of Reporting Services (though it concentrates more on application building than on pixel-perfect formatting) and PerformancePoint. I have to say I found the process of building a dashboard exceptionally easy and intuitive, and I was very impressed – I was able to put together something that worked very quickly, and it handled layout and formatting in such a way that even someone like me, who is generally rubbish at report design, could create a dashboard that looked professional. Here’s a screenshot of one I put together quite quickly:

[Screenshot: Prism dashboard]

One other very cool feature is the way that complex selections can be generated using a visual workflow, called ‘Questions’ in the product. You can read more about it on their blog here:
http://community.sisense.com/blogs/siblog/archive/2008/11/19/250.aspx
…but the easiest way of thinking about it is as something like the SSIS dataflow, but for MDX sets (similar to something I blogged about a while ago). Here’s an example that returns the top 10 Dates by Internet Sales Amount unioned with the bottom 10 Dates by Internet Sales Amount where Internet Sales Amount is greater than $500:

[Screenshot: Prism ‘Question’ workflow]
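
For comparison, the same selection written directly in MDX might look something like this – a sketch against the Adventure Works sample cube, not what Prism actually generates under the covers:

    SELECT
      { [Measures].[Internet Sales Amount] } ON COLUMNS,
      UNION(
        -- the top 10 Dates by Internet Sales Amount...
        TOPCOUNT(
          [Date].[Date].[Date].MEMBERS, 10,
          [Measures].[Internet Sales Amount] ),
        -- ...unioned with the bottom 10 Dates that still have more than $500 of sales
        BOTTOMCOUNT(
          FILTER(
            [Date].[Date].[Date].MEMBERS,
            [Measures].[Internet Sales Amount] > 500 ),
          10,
          [Measures].[Internet Sales Amount] )
      ) ON ROWS
    FROM [Adventure Works]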

I think this is the best way I’ve seen of letting users set up complex filters, although it probably is still only something a power user could understand.

At the moment Prism is just a fat client, so with no web-based version (yet) sharing dashboards is a matter of emailing .psm files or putting them on a network share; this will be a deal-breaker for some people. In my opinion, though, SiSense have made the right decision in implementing the functionality they do have very well, rather than rushing off to tick all the boxes on potential buyers’ checklists and doing so badly. Overall, if you’re in the market for a desktop BI tool that supports Analysis Services as well as other data sources, I can recommend taking a look at Prism.

OLAP PivotTable Extensions new release

For some reason I’ve not blogged about this before, but anyway the ever-industrious Greg Galloway has just released a new version of his OLAP PivotTable Extensions:
http://www.codeplex.com/OlapPivotTableExtend

It’s an Excel addin that gives you useful new functionality in Excel 2007 pivot tables connected to Analysis Services, such as the ability to add private calculations, view the MDX behind the pivot table, and (in the new release) search for members and other things. Definitely worth a look, and useful too if you’ve ever wondered how to work with the Excel pivot table in code.

I’m hosting a webinar for Panorama

As I’ve mentioned, my recent thoughts on client tools (see here and here) have prompted a lot of interest around the Analysis Services community. One result is that Panorama have asked me to host a webinar for them where I get to sound off about the state of the client tool market before they show you their latest stuff. Yes, I’ll be paid for it but I’m not going to be promoting their products directly (I feel like I need to justify myself!), just repeating my standard line that if you want to do anything serious with Analysis Services you should at least check out the range of third-party client tools available rather than stick blindly with what MS gives you – and given that Panorama are the largest vendor of third-party client tools for Analysis Services, they deserve to be on the list of tools to check out. Here’s the link to sign up for the webinar:

http://www.panorama.com/webcasts/archives/2008/webinar-with-cwebb-oct-21.html

Softpro Cubeplayer

While I was at SQLBits I had the pleasure of meeting Tomislav Piasevoli (who had come all the way from Croatia especially), someone who has been very active on the Analysis Services MSDN Forum recently and who knows a lot about MDX. His company, Softpro, sells an Analysis Services client tool called Cubeplayer and he very kindly gave up his lunchtime to give me a detailed demo. As I said recently, the general feeling of frustration surrounding Microsoft’s client tool strategy has made me look again at the third-party client tool market and decide to review some of these tools here, and this look at Cubeplayer is the first in the series. Remember, if you’ve got a client tool you’d like me to look at, please drop me a line…

The first thing to say about Cubeplayer is that it’s a tool for power users and consultants, not the average user who might want to browse a cube. As such it’s going to appeal to the fans of the old Proclarity desktop client, which it vaguely resembles in that it’s a fat client with a lot of very advanced query and analysis functionality. It’s not part of a suite – there’s no web client etc – but it includes dashboarding functionality that’s only available through the tool itself, and also has the ability to publish queries up to Reporting Services.

What can it do? Well, the web site has a good section showing video demos of the main functionality, but here are some main points:

  • It can certainly do all the obvious stuff like drag/drop hierarchies to build your query, as well as more advanced selection operations such as the ability to isolate individual members in a query, drill up and down on individual members or all members displayed. It also has a number of innovative features like the ability to click on a cell and drill down on it, which means that you drill down on every member on every axis associated with this cell.
    [Screenshot: Cubeplayer query grid]
  • It also supports all the more advanced types of filtering and topcounts that you’d expect from a tool like this; in fact, it seems to be able to do pretty much anything you can do in MDX. This gives you an immense amount of power and flexibility, but sometimes at the expense of ease-of-use. Take the old nested topcount problem, solved in MDX using the Generate function (see the sketch after this list): any advanced client tool has to handle this scenario, and Cubeplayer certainly does in its Generate functionality, but would you really expect a power user to understand what’s going on here and remember they have to click some extra buttons to make it happen?
  • There’s a nice feature where you can choose to display the data in your grid either as actual values, ranks, percentages or other types of calculation. This makes it really easy to make sense of large tables of data.
    [Screenshot: ‘show as’ options]
  • It has a whole load of built-in guided analyses such as ‘How Many?’, ‘Show Me’, ABC analysis (for segmentation) and Range analysis. This for me is a real selling point – I’ve been asked several times, for example, about doing ABC analysis with Analysis Services and I’m not sure I’ve seen another tool that does it.
    [Screenshot: ABC analysis]
  • Another cool thing I’ve not seen before is the ability to put two queries side-by-side and, if they have selections on the same hierarchy, do operations like unions, intersects and differences on the selections.
  • There’s an MDX editor (with intellisense and other useful stuff) where you can write your own queries. Again, not something most power users will want to do, but if you’re a consultant who knows MDX it’s a useful feature for those times when you know you can write the query yourself but can’t get the query builder to do exactly what you want.
  • You can generate Reporting Services reports from a query view. You probably already know my opinions on the native support for Analysis Services within Reporting Services, and this certainly does make it much easier to create cube-based reports.
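
For anyone who hasn’t met it, here’s the nested topcount problem referred to in the list above expressed in MDX with the Generate function – a sketch against the Adventure Works sample cube (the choice of hierarchies and the 5/3 counts are just examples), not anything Cubeplayer itself generates:

    SELECT
      { [Measures].[Internet Sales Amount] } ON COLUMNS,
      GENERATE(
        -- for each of the top 5 Subcategories by Internet Sales Amount...
        TOPCOUNT(
          [Product].[Subcategory].[Subcategory].MEMBERS, 5,
          [Measures].[Internet Sales Amount] ),
        -- ...return that Subcategory's own top 3 Products
        TOPCOUNT(
          { [Product].[Subcategory].CURRENTMEMBER }
            * [Product].[Product].[Product].MEMBERS,
          3,
          [Measures].[Internet Sales Amount] )
      ) ON ROWS
    FROM [Adventure Works]

It’s exactly this kind of thing – knowing that the inner set has to be evaluated in the context of each member of the outer set – that’s hard to surface in a point-and-click UI.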

Overall, definitely worth checking out if you’re in the market for this type of tool. There are a few criticisms to be made: as I said, I’m not sure it’s as easy to use as it could be, although this is partly the price you pay for the richness of the functionality; there are some strange lapses in UI design, such as the way all dialogs have ‘Accept’ and ‘Cancel’ buttons with icons on them instead of the usual plain ‘OK’ and ‘Cancel’; and charting is competent but not up to the standards of the best visualisation tools (I think many vendors would do well to look at their tools and ask themselves "What would Stephen Few think?" – the answer might not be very complimentary). In my opinion, though, it’s a very strong and mature tool and the positives far outweigh the negatives.

One final point: Tomislav mentioned he was looking for reseller partners outside Croatia. If you’re interested in this I can put you in touch with him.

Google, Panorama and the BI of the Future

The blog entry I posted a month or so ago about XLCubed, where I had a pop at Microsoft for their client tool strategy, certainly seemed to strike a chord with a lot of people (see the comments, and also Marco’s blog entry here). It also made me think it would be worth spending a few blog entries looking at some of the new third-party client tools that are out there… I’ve already lined up a few reviews, but if you’ve got an interesting, innovative client tool for Analysis Services that I could blog about, why not drop me an email?

So anyway, the big news last week was of course Google’s announcement of Chrome. As several of the more informed bloggers have pointed out (eg Nick Carr, Tim McCoy), the point of Chrome is to be not so much a browser as a platform for online applications, leading to a world where there is no obvious distinction between online and offline applications. And naturally when I think about applications I think about BI applications, and thinking about online BI applications and Google I thought of Panorama – who, incidentally, this week released the latest version of their gadget for Google Docs:
http://www.panorama.com/newsletter/2008/sept/new-gadget.html

Now, I’ll be honest and say that I’ve had a play with it and it is very slow, and there are still a few bugs around. But it’s a beta, I’m told it’s running on a test server and that performance will be better once it’s released, and anyway it’s only part of a wider client tool story (outlined and analysed nicely by Nigel Pendse here) which starts with the full Novaview client and involves the ability to publish views into Google Docs for a wider audience and for collaboration. I guess it’s a step towards the long-promised future where the desktop PC will have withered away into nothing more than a machine to run a browser on, and all our BI apps and all our data will be accessible over the web.

This all makes me wonder what BI will be like in the future… Time for some wild, half-formed speculation:

  • Starting at the back, the first objection raised to a purely ‘BI in the cloud’ architecture is that you’ve got to upload your data to it somehow. Do you fancy trying to push everything you load into your data warehouse every day up to some kind of web service? I thought not. So I think a ‘BI in the cloud’ architecture is only going to be feasible when most of your source data lives in the cloud already, possibly in something like SQL Server Data Services, Amazon SimpleDB or Google BigTable, or possibly in a hosted app like Salesforce.com. This requirement puts us a long way into the future already, although for smaller data volumes and one-off analyses perhaps it’s not so much of an issue.
  • You also need your organisation to accept the idea of storing its most valuable data in someone else’s data centre. I’m not saying this as a "why don’t those luddites hurry up and accept this cool new thing"-type comment, because there are some very valid objections to cloud computing at the moment: can I guarantee good service levels? Will the vendor I choose go bust, or get bought, or otherwise disappear in a year or two? What are the legal implications of moving data to the cloud and possibly across borders? It will be a while before there are good answers to these questions, and even when there are, there’s going to be a lot of inertia to overcome.
    The analogy most commonly used to describe the brave new world of cloud computing is with the utility industry: you should be able to treat IT like electricity or water, a service you can plug into whenever you want and assume will be there when you need it (see, for example, "The Big Switch"). As far as data goes, though, I think a better analogy is with the development of the banking industry. At the moment we treat data in the same way that a medieval lord treated his money: everyone has their own equivalent of a big strong wooden box in the castle where the gold is kept, in the form of their own data centre. Nowadays the advantages of keeping money in the bank are clear – why worry about thieves breaking in and stealing your gold in the night, and why go to the effort of moving all those heavy bags of gold around yourself, when it’s much safer and easier to manage and move money about when it’s in the bank? We may never physically see the money we possess but we know where it is and we can get at it when we need it. I think the same attitude will eventually be taken towards data, but it needs a leap of faith to get there (how many people still keep money hidden in a jam jar in a kitchen cupboard?).
  • Once your data’s in the cloud, you’re going to want to load it into a hosted data warehouse of some kind, and I don’t think that’s too much to imagine given the cloud databases already mentioned. But how to load and transform it? Not so much of an issue if you’re doing ELT, but for ETL you’d need a whole bunch of new hosted ETL services to do this. I see Informatica has one in Informatica On Demand; I’m sure there are others.
  • You’re also going to want some kind of analytical engine on top – Analysis Services in the cloud, anyone? Maybe not quite yet, but companies like Vertica (http://www.vertica.com/company/news_and_events/20080513) and Kognitio (http://www.kognitio.com/services/businessintelligence/daas.php) are pushing into this area already; the architecture of this new generation of shared-nothing MPP databases surely lends itself well to the cloud model: if you need better performance you just reach for your credit card and buy a new node.
  • You then want to expose it to applications which can consume this data, and in my opinion the best way of doing this is through an OLAP/XMLA layer. In the case of Vertica you can already put Mondrian on top of it (http://www.vertica.com/company/news_and_events/20080212), so you can have this today if you want it, but I suspect you’d have to invest as much time and money in making the OLAP layer scale as you had invested in making the underlying database scale, otherwise it would end up being a bottleneck. What’s the use of a high-performance database if your OLAP tool can’t turn an MDX query, especially one with lots of calculations, into an efficient set of SQL queries and perform the calculations as fast as possible? Think of all the work that has gone into AS2008 to improve the performance of MDX calculations – the improvements compared to AS2005 are massive in some cases, and the AS team haven’t even tackled the problem of parallelism in the formula engine yet (and I’m not sure they even want to, or whether it’s a good idea). Again, there’s been a lot of buzz recently about the implementation of MapReduce by Aster and Greenplum to perform parallel processing within the data warehouse; although it aims to solve a slightly different set of problems, it nonetheless shows that the problem is being thought about.
  • Then it’s onto the client itself. Let’s not talk about great improvements in usability and functionality, because I’m sure badly designed software will be as common in the future as it is today. It’s going to be delivered over the web via whatever the browser has evolved into, and will certainly use whatever modish technologies are the equivalent of today’s Silverlight, Flash, AJAX etc. But will it be a stand-alone, specialised BI client tool, or will there just be BI features in online spreadsheets (or whatever online spreadsheets have evolved into)? Undoubtedly there will be good examples of both, but I think the latter will prevail. It’s true even today that users prefer to get their data into Excel, the place where they eventually want to work with it; the trend would move even faster if MS pulled their finger out and put some serious BI features in Excel…
    In the short term this raises an interesting question though: do you release a product which, like Panorama’s gadget, works with the current generation of clunky online apps in the hope that you can grow with them? Or do you, like Good Data and Birst (which I just heard about yesterday, and will be taking a closer look at soon), create your own complete, self-contained BI environment which gives a much better experience now but which could end up being an online dead-end? It all depends on how quickly the likes of Google and Microsoft (which is supposedly going to reveal more about its online services platform soon) can deliver usable online apps; they have the deep pockets to finance these apps for a few releases while they grow into something people want to use, but can smaller companies like Panorama survive long enough to reap the rewards? Panorama has a traditional BI business that could certainly keep it afloat, although one wonders whether they are angling to be acquired by Google.

So there we go, just a few thoughts I had. Anyone got any comments? I like a good discussion!

UPDATE: some more details on Panorama’s future direction can be found here:
http://www.panorama.com/blog/?p=118

In the months to come, Panorama plans to release more capabilities for its new Software as a Service (SaaS) offering and its solution for Google Apps.  Some of the new functionality will include RSS support, advanced exception and alerting, new visualization capabilities, support for data from Salesforce, SAP and Microsoft Dynamics, as well as new social capabilities.

XLCubed (and a rant about Microsoft’s client tool strategy)

The other week I stopped off in Maidenhead to see the guys at XLCubed and to take a look at their latest stuff. XLCubed have been around a long time and their Excel addin AS client has always been one of the best out there, but the improved Analysis Services support in Excel 2007 (especially the introduction of ‘convert to formulas’) and the Proclarity acquisition have put a squeeze on the client tools sector. A lot of the third-party client tools out there, XLCubed included, are better in many ways than the equivalent Microsoft offerings, but it’s often hard to explain to someone who isn’t very experienced with Analysis Services what the advantages are and why they represent a good reason to buy a non-Microsoft product. So, in order to survive, you need a clear, unique selling point, and XLCubed now have one in the form of Microcharts, after they bought Bonavista Systems last year (I blogged about the Microcharts product in its original form here). Microcharts gives you the ability to create sparklines, bullet graphs and other in-cell charts, which is not only impressive when used in conjunction with regular Excel and Reporting Services (with or without AS as a data source) but enters the realm of extreme coolness when you see how it’s been integrated with XLCubed.

Here’s just one example of the kind of dashboard you can build with XLCubed:

[Screenshot: CIO dashboard built with XLCubed]

You can see a whole page of sample dashboards here:
http://www.xlcubed.com/en/Demo_Overview.html

Nice, eh? I should also mention they have an excellent data visualisation blog that’s well worth a read:
http://blog.xlcubed.com/

While on the subject of client tools, can I veer off on a tangent here and criticise Microsoft’s strategy in this area? In my opinion (and just about everyone I’ve met agrees with me, not least disgruntled ex-Proclarity employees) what they’ve done has actually harmed the core Microsoft BI market over the last two years. Before the Proclarity acquisition it wasn’t an ideal situation, for sure, since telling customers that they had to buy their client tools from a third party looked bad. But what Microsoft have done is bought the leading third-party client tool and effectively chucked it in the bin, saying people should use Excel and PerformancePoint instead. Excel 2007 is a good client tool but a) a lot of companies are still on Excel 2003 and before, and are not going to upgrade just for the sake of a BI project, b) it has nowhere near the kind of advanced functionality that the Proclarity desktop tool had and never will, and c) it still has a few glaring problems (see here for example); PerformancePoint too is encouraging but very much a version 1.0. Microsoft’s long release cycles for both mean that we have to wait way too long for any upgrade in functionality, and in the meantime we’re left with a vacuum: the third party client tool market has been weakened because now all customers will want to use Microsoft client tools as a first choice, but these client tools are not yet up to scratch. Why on earth didn’t they carry on developing the Proclarity product line for a few more years until a smoother transition could be made? Why the prejudice against standalone client tools? Once again I’m left with the feeling that senior people in Redmond have little idea what’s going on in the real world and more importantly are insulated from the impact that their decisions have on the bottom line. On the positive side, though, Microsoft’s actions have given companies like XLCubed the breathing space they needed to innovate and survive.
