The SSAS processing engine builds an execution plan that triggers a sequence of processing tasks based on each object's current processed state and its dependencies on other objects. This is what happens when an SSAS cube is processed from SQL Server Management Studio (SSMS). It is possible to replicate this execution plan closely enough in an SSIS package so that the cube is always processed and refreshed with the latest data.
The package shown in Screen Capture 1 does just that. Using the SSIS Analysis Services Processing Task, the dimensions are processed first, followed by the partitions, measure groups and cube. Finally, the indexes of all objects are rebuilt.
Screen Capture 1 – SSIS Package to Process SSAS Cube
All dimensions related to the cube are first processed using the Process Default option. If the cube has been deployed for the first time or re-deployed, it might be in an unprocessed state depending on the structural changes. Process Default brings the dimensions to a bare minimum processed state. On the other hand, if a dimension is already in a processed state and only requires a data refresh, this task has no impact on the processing of the cube.
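Under the covers, the Analysis Services Processing Task submits XMLA to the server. A Process Default command for a single dimension might look like the following sketch; the database and dimension IDs here are placeholders, not names from this post:

```xml
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>SalesDB</DatabaseID>
    <DimensionID>Dim Customer</DimensionID>
  </Object>
  <Type>ProcessDefault</Type>
</Process>
```

The server decides what work (if any) is needed to bring the dimension to a fully processed state, which is exactly why this option is safe to run regardless of the object's current state.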
The next step is to refresh the dimension data using the Process Update option. It is important to choose Process Update instead of Process Full for dimensions: Process Update refreshes the dimension data without taking other cubes that depend on the shared dimensions to an unprocessed state. Process Update works only if the dimension is already in a processed state, which is why Process Default should precede Process Update. Dimensions that have rigid attribute relationships need the Process Full option instead, and that is a design consideration.
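The corresponding XMLA differs only in the Type element. Again, a sketch with placeholder IDs:

```xml
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>SalesDB</DatabaseID>
    <DimensionID>Dim Customer</DimensionID>
  </Object>
  <Type>ProcessUpdate</Type>
</Process>
```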
Similar to the dimensions, the partitions, measure groups and cube are first processed using the Process Default option to bring the facts to a processed state. As with dimensions, if the cube is already in a processed state and only requires a data refresh, this task has no impact on the processing of the cube.
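A Process Default command can be issued at the cube level, and the server will cascade it to the measure groups and partitions that need it. A sketch with placeholder IDs:

```xml
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>SalesDB</DatabaseID>
    <CubeID>Sales</CubeID>
  </Object>
  <Type>ProcessDefault</Type>
</Process>
```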
The fact data in the cube is refreshed using the Process Data option. Process Data works only if the object is in a processed state, which is why Process Default should precede Process Data. Unlike Process Update for dimensions, Process Data drops and rebuilds entire partitions from scratch. If the partitions are date based and grow over time, only the affected partitions need to be processed. In that case, replace the Analysis Services Processing Task with AMO code in a Script Task to process the partitions. You will also need some logic to determine which partitions to process, as described in the last section of this post from Vidas Matelis. For cubes with static partitions, such as those based on geography or lines of business, this method works fine.
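A minimal AMO sketch of that Script Task idea is shown below. It assumes a reference to the Microsoft.AnalysisServices assembly (AMO); the server, database, cube, measure group and partition names are placeholders, and the partition-selection rule here (match on a name suffix) is a stand-in for whatever logic fits your partitioning scheme:

```csharp
// Sketch only: requires a reference to Microsoft.AnalysisServices.dll (AMO).
// All object names below are placeholders for illustration.
using Microsoft.AnalysisServices;

public static class PartitionProcessor
{
    public static void ProcessRecentPartitions()
    {
        Server server = new Server();
        server.Connect("Data Source=localhost");
        try
        {
            Database db = server.Databases.FindByName("SalesDB");
            MeasureGroup mg = db.Cubes.FindByName("Sales")
                                .MeasureGroups.FindByName("Internet Sales");

            foreach (Partition p in mg.Partitions)
            {
                // Placeholder rule: only process partitions whose name marks
                // them as belonging to the current year, e.g. "Sales 2013".
                if (p.Name.EndsWith("2013"))
                    p.Process(ProcessType.ProcessData);
            }
        }
        finally
        {
            server.Disconnect();
        }
    }
}
```

The same loop also gives you a natural place to plug in the "which partitions changed" logic referenced above, for example by comparing partition names or slices against a date range passed in through package variables.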
The last step is to rebuild the indexes of all dimensions, partitions, measure groups and the cube using the Process Index option. This improves query performance and requires, as a prerequisite, that the facts and dimensions are in a processed state.
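In XMLA the corresponding process type is ProcessIndexes. Issued at the cube level (placeholder IDs again), it rebuilds indexes and aggregations without touching the data:

```xml
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>SalesDB</DatabaseID>
    <CubeID>Sales</CubeID>
  </Object>
  <Type>ProcessIndexes</Type>
</Process>
```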
Set the Processing Order to Parallel in the Analysis Services Processing Tasks to speed up processing (Screen Capture 2).
Screen Capture 2 – Processing Order – Parallel
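With the Parallel setting, the task wraps the individual Process commands in a single XMLA Batch inside a Parallel element, letting the server process independent objects concurrently. A sketch with placeholder IDs:

```xml
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Parallel>
    <Process>
      <Object>
        <DatabaseID>SalesDB</DatabaseID>
        <DimensionID>Dim Customer</DimensionID>
      </Object>
      <Type>ProcessUpdate</Type>
    </Process>
    <Process>
      <Object>
        <DatabaseID>SalesDB</DatabaseID>
        <DimensionID>Dim Product</DimensionID>
      </Object>
      <Type>ProcessUpdate</Type>
    </Process>
  </Parallel>
</Batch>
```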
The above processing sequence is a robust technique to bring any cube from any state to a processed state, refreshed with the latest data, without bringing down other cubes that might depend on the shared dimensions. It is important to note that Process Update causes the physical files of the dimension attribute stores to grow. Older versions of SQL Server have a 4GB limit on the file size, beyond which processing throws exceptions. It is a good practice to occasionally process the SSAS database using the Process Full option to shrink the physical file size.
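That occasional full process can be issued at the database level, which drops and rebuilds everything (and therefore also takes dependent objects offline during processing, so schedule it in a maintenance window). A sketch with a placeholder ID:

```xml
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>SalesDB</DatabaseID>
  </Object>
  <Type>ProcessFull</Type>
</Process>
```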
It is extremely useful to handle the OnWarning event in the package's event handler. SSAS logs useful information about processing errors as warnings rather than exceptions. For example, processing errors due to missing attribute key records ("The attribute key cannot be found") are logged as warnings.