Business Intelligence – Oracle


Hyperion Essbase 9.3.1 – Clearing regions in ASO cubes – Using rule files and Report Scripts – Part 2

Posted by Venkatakrishnan J on March 8, 2009

In the last blog entry here, we saw how the new feature introduced in the 11.1.0 release helps us clear regions of an ASO cube easily. But the majority of customers/users are still on the 9.3 or earlier releases of Essbase, and unfortunately those releases offer no out-of-the-box way of clearing portions of an ASO cube. If you are on one of the earlier releases, you might find this and the next blog entry useful. Depending on the type of data clearance required, portions of an ASO cube can be cleared using 3 approaches:

1. Subtract the existing data with a new dataset (an exact replica of what was loaded before) using rule files – for this to work, a backup of the data that was loaded earlier must be available.
2. Extract the relevant portions of the ASO cube using a report script or MDX and reload the extract into the same cube with a subtract load.
3. Use incremental slices to maintain volatile data; clear the volatile slices and reload the data – this is applicable only if the most recent data load needs to be cleared, and it does not allow clearing data for an arbitrary point in time.
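For completeness, option 1 boils down to a single subtract load in MaxL. A minimal sketch is shown below – the backup file name 'JanBkup.txt' and the rules file name 'SubLoad' are illustrative assumptions, and the rules file must have its data load settings set to subtract from existing values:

```
/* Option 1 - subtract the backed-up dataset from the cube.
   'JanBkup.txt' and 'SubLoad' are hypothetical names; the
   'SubLoad' rules file must be set to "Subtract from existing
   values" in its data load settings. */
import database 'ASOsamp'.'Sample' data
    from data_file 'JanBkup.txt'
    using server rules_file 'SubLoad'
    on error abort;
```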

We shall discuss only options 2 and 3, since option 1 is pretty straightforward. Option 1 is also rarely used in practice, since we seldom have a copy of the already loaded data lying around readily.

So, let's start with the second option. For illustration purposes, let's assume that we have loaded 3 months of data (Jan to Mar) into the ASOsamp->Sample cube. In the month of April, it comes to the notice of the Essbase admin that a portion of the January data loaded earlier was incorrect, and hence a reload for that month alone is desired. Furthermore, it was noted that the erroneous data originated from some of the “Electronics Online” stores, and that it affected the Sales-related quantity (Units) for all types of Digital Cameras in the Western Region. So, the first step in the data clearance activity is to extract the previously loaded level-0 erroneous data using a report script. An example report script is given below (MDX can also be used, but right now extracting data directly through MDX is available only from ODI or through the JAPI).

<COLUMN ("Measures")
<ROW ("Years","Time","Transaction Type","Payment Type","Promotions","Age","Income Level","Products","Stores","Geography")
"Curr Year",
<DIMBOTTOM "Promotions"
<DIMBOTTOM "Income Level"
<DIMBOTTOM "Digital Cameras"
<DIMBOTTOM "Electronics Online"
!

The output of the report script is a tab-delimited text file containing all the erroneous data.


The idea is to use this output text file to load the data back into the cube using a simple rule file. So, let's fire a simple MaxL statement that runs the above report script and exports the data into a text file.

export database 'ASOsamp'.'Sample' using server report_file 'Export' to data_file 'Export.txt';

Now, create a simple rule file using this text file as the input.


Also ensure that this rule file's data load settings are set to subtract from existing values.


Now, import this file back into the database; the load will automatically subtract the values from the existing data, thereby clearing them. The cube is then ready for a reload of the corrected January data. In the next blog entry, we shall see the third and final approach: handling the data clearance activity through multiple slices.
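The import step can be sketched in MaxL as shown below. The rules file name 'SubLoad' is an assumption – substitute whatever name was given to the subtract-load rule file created above:

```
/* Load the exported data back with the subtract-load rules file.
   'SubLoad' is an illustrative name for the rule file whose data
   load settings are set to "Subtract from existing values";
   'Export.txt' is the file produced by the export statement above. */
import database 'ASOsamp'.'Sample' data
    from data_file 'Export.txt'
    using server rules_file 'SubLoad'
    on error abort;
```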

Posted in All Posts, EPM, Hyperion Essbase | 3 Comments »