31 Oct 2008 08:22
When your users walk in to a cold cache in the morning, their first queries run slowly. Cache warming takes care of a lot of that by executing typical queries against the newly processed cube, thereby building the cache so it is ready for your users when they get in. There are a number of ways to warm the cache.
Chris Webb has already blogged about how he has done it, and we thought we would show you a variation on that theme.
UPDATE: The Trace File Source Adapter is now available.
The package is relatively simple.
Let’s take a look inside the Data Flow task to start with.
The TFS_MDX component (Trace File Source) reads a trace file saved from Profiler and puts the data into the pipeline. The CSPLIT_MDX component (Conditional Split) separates out the MDX queries by looking for an EventSubclass value of 0.
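The condition in the Conditional Split is a simple SSIS expression on the EventSubclass column (in SSAS Profiler traces, subclass 0 on the query events denotes an MDX query):

```
EventSubclass == 0
```

Rows matching this condition are routed down the output we use for the rest of the pipeline; everything else is discarded.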
Once we have separated out our MDX queries, we convert the TextData column to DT_WSTR so that we can sort on it and remove duplicates (you cannot sort on TextData directly because it is a DT_NTEXT data type).
After sorting the data, we terminate the pipeline in a Script Component acting as a destination. On the component's property pages we reference the OLE DB connection manager that points to our SSAS server, because we are going to use its ConnectionString property inside the component. In the Script Component we add a reference to the Microsoft.AnalysisServices.AdomdClient library. Our script then looks like this:
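The original script is shown as an image in the source post, so here is a minimal sketch of what such a Script Component destination might look like in C#. It assumes the OLE DB connection manager added on the property pages is named `Connection`, and that the converted TextData column arrives on the input buffer as `TextData`; these names, and the exact method bodies, are our assumptions rather than the author's original code:

```csharp
using System;
using Microsoft.AnalysisServices.AdomdClient;

public class ScriptMain : UserComponent
{
    private AdomdConnection conn;

    public override void PreExecute()
    {
        base.PreExecute();
        // Reuse the connection string from the OLE DB connection manager
        // that points at the SSAS server ("Connection" is an assumed name).
        conn = new AdomdConnection(Connections.Connection.ConnectionString);
        conn.Open();
    }

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // Execute each captured MDX statement against the cube and discard
        // the results; the side effect we want is a warmed cache.
        using (AdomdCommand cmd = new AdomdCommand(Row.TextData, conn))
        {
            cmd.ExecuteCellSet();
        }
    }

    public override void PostExecute()
    {
        if (conn != null) conn.Close();
        base.PostExecute();
    }
}
```

Opening the connection once in PreExecute and closing it in PostExecute avoids paying the connection cost on every row; in a production package you would probably also wrap the execute call in a try/catch so that one malformed query from the trace does not fail the whole warm-up run.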
We use ADOMD.NET to execute our MDX queries against our SSAS server.
This method should prove useful in preventing users from hitting a cold cache and experiencing slow-running queries first thing in the morning.