Business Intelligence – Oracle

Archive for January, 2009

Oracle Data Integrator 10.1.3.5 and Oracle OLAP – RKM and IKM – Using Oracle OLAP as a Target – Part 2

Posted by Venkatakrishnan J on January 6, 2009

In the blog entry here, we had seen how the new RKM for Oracle OLAP works. Today, we shall look at the other KM, the IKM, which is used for loading data directly into Analytic Workspaces. I had noted earlier that the Oracle OLAP RKM just reverse engineers the OLAP cube to its underlying source tables. So, basically, it is not Oracle OLAP aware per se, and it does not support all kinds of data extract and load. This is the major difference between the Oracle OLAP RKM and the Essbase RKMs. The Essbase RKMs treat Essbase itself as a multidimensional data store, so data extracts through MDX, Report Scripts etc. are all supported. We shall look at how ODI uses the Essbase RKMs in a later blog entry. Let's proceed with our test of the Oracle OLAP IKM, which is used to load an Oracle OLAP AW cube. To test this out, we shall start with a new cube in the Global Analytic Workspace. The cube mapping is provided below as a reference.

In our case, UNITS_FACT_TGT is the table that is used as the source for the cube. We shall create another table, UNITS_FACT_SRC, which will be used as the source to populate data into this target table. The idea is to simulate an environment wherein the target table is populated temporarily before loading into Oracle OLAP; this target table would contain only the incremental data.

CREATE TABLE UNITS_FACT_SRC AS SELECT * FROM UNITS_FACT;
CREATE TABLE UNITS_FACT_TGT AS SELECT * FROM UNITS_FACT WHERE 1=0;

Now, let's create an interface that uses the above 2 tables (one as the source and the other as the target).

In the flow control, ensure that you choose the “IKM Oracle AW Incremental Update”.

This is very similar to a normal Oracle to Oracle data load. The only additions are the 4 options which are specific to AW loading (sample values for our test follow the list). The 4 options are

1. REFRESH_CUBE – When enabled, this automatically starts the data load into Oracle OLAP once the load into the target table completes.
2. AW_OWNER – The owner of the Oracle OLAP AW.
3. AW_NAME – The Analytic Workspace name.
4. CUBE_NAME – The cube whose data needs to be refreshed.
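
For our test, and matching the refresh code shown later in this entry, the options would be set along these lines (the exact REFRESH_CUBE value depends on how the option is typed in your IKM, so treat this purely as an illustration):

REFRESH_CUBE : Yes
AW_OWNER     : GLOBAL
AW_NAME      : GLOBAL
CUBE_NAME    : UNITS_FROM_ODI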

This IKM assumes a straight data load into Oracle OLAP; the data load cannot be customized. The only way to do that is to call custom OLAP-specific PL/SQL packages, though I was expecting that to be available in the IKM itself. A rough sketch of what such a custom step could look like is given below.
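
As a sketch, such a custom step could be an extra ODI procedure step that runs OLAP DML through DBMS_AW after the IKM load. The program name here is purely hypothetical:

-- A minimal sketch, assuming a custom OLAP DML program named MY_CUSTOM_LOAD
-- already exists inside the AW; DBMS_AW.EXECUTE simply runs OLAP DML commands.
BEGIN
  sys.dbms_aw.execute('AW ATTACH GLOBAL.GLOBAL RW');
  sys.dbms_aw.execute('CALL MY_CUSTOM_LOAD');  -- hypothetical OLAP DML program
  sys.dbms_aw.execute('AW DETACH GLOBAL.GLOBAL');
END;
/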

Now let's execute the interface and see what it does under the covers. Let us take a look at the OLAP Refresh step to see the exact code.

declare
  xml_clob    clob;
  xml_str     varchar2(4000);
  isAW        number;
  v_awname    varchar2(50);
  v_owner     varchar2(50);
  v_cube      varchar2(50);
  v_dimension varchar2(50);
  v_line      varchar2(250);

  -- All non-TIME dimensions of the cube, to be listed in the build document
  cursor c1 is
    select DIMENSION_AW_NAME
      from olapsys.all$olap2_aw_cube_dim_uses
     where aw_owner = v_owner
       and aw_name = v_awname
       and aw_logical_name = v_cube
       and DIMENSION_AW_NAME <> 'TIME';
begin
  v_awname := 'GLOBAL';
  v_owner  := 'GLOBAL';
  v_cube   := 'UNITS_FROM_ODI';

  -- Attach the AW in read/write mode
  sys.dbms_aw.execute('AW ATTACH '||upper(v_owner)||'.'||upper(v_awname)||' RW');

  -- Assemble the AWXML BuildDatabase document in a temporary CLOB
  DBMS_LOB.CREATETEMPORARY(xml_clob, TRUE);
  dbms_lob.open(xml_clob, DBMS_LOB.LOB_READWRITE);
  v_line := '  <BuildDatabase  Id="Action7" AWName="'||upper(v_owner)||'.'||upper(v_awname)||'" BuildType="EXECUTE" RunSolve="true" CleanMeasures="false" CleanAttrs="true" CleanDim="false" TrackStatus="false" MaxJobQueues="0">';
  dbms_lob.writeappend(xml_clob, length(v_line), v_line);
  dbms_output.put_line(v_line);

  -- One BuildList entry per dimension
  open c1;
  loop
    fetch c1 into v_dimension;
    exit when c1%notfound;
    v_line := '    <BuildList XMLIDref="'||v_dimension||'.DIMENSION" />';
    dbms_output.put_line(v_line);
    dbms_lob.writeappend(xml_clob, length(v_line), v_line);
  end loop;
  close c1;

  -- Then the cube itself, and close the document
  v_line := '    <BuildList XMLIDref="'||v_cube||'.CUBE" />';
  dbms_output.put_line(v_line);
  dbms_lob.writeappend(xml_clob, length(v_line), v_line);
  dbms_lob.writeappend(xml_clob, 18, '  </BuildDatabase>');
  dbms_lob.close(xml_clob);

  -- Hand the assembled XML to the AWXML API for execution
  xml_str := sys.interactionExecute(xml_clob);
  dbms_output.put_line(xml_str);
  dbms_output.put_line('finish');

  sys.dbms_aw.execute('AW DETACH '||upper(v_owner)||'.'||upper(v_awname));
end;

The above is a PL/SQL anonymous block, and you can see the calls to the DBMS_AW package (for attaching and detaching the AW). The actual cube refresh is driven by an AWXML BuildDatabase document that the block assembles and executes. This can be customized according to our needs, but again this is not something that can be done from the front end; it would have to be changed in the backend class files of ODI.
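
For reference, the XML document that the block above assembles and hands to sys.interactionExecute would look roughly like this. The dimension names are illustrative; they come from the cursor over olapsys.all$olap2_aw_cube_dim_uses, which excludes TIME:

<BuildDatabase Id="Action7" AWName="GLOBAL.GLOBAL" BuildType="EXECUTE" RunSolve="true" CleanMeasures="false" CleanAttrs="true" CleanDim="false" TrackStatus="false" MaxJobQueues="0">
    <BuildList XMLIDref="CHANNEL.DIMENSION" />
    <BuildList XMLIDref="CUSTOMER.DIMENSION" />
    <BuildList XMLIDref="PRODUCT.DIMENSION" />
    <BuildList XMLIDref="UNITS_FROM_ODI.CUBE" />
</BuildDatabase>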

After I saw how the RKM worked, I was pretty sure how the IKM would work, and the above confirmed my guess. In a future blog entry I will compare this with the Essbase-specific RKMs and IKMs.

Update: John Goodwin has an excellent series of articles on ODI and Essbase/Planning connectivity. I would recommend everyone to check that out. Thanks, John, for letting me know.


Posted in All Posts, OLAP, Oracle Data Integrator | 2 Comments »

Hyperion Essbase 9.3.1 – Currency Conversion in ASO Cubes – Using Level0 Export from BSO & ASO Aggregation – Approach 2

Posted by Venkatakrishnan J on January 5, 2009

In the previous blog entry here, I had covered an approach for doing currency conversion in ASO cubes. In most cases that would suffice, unless the currency rate dimension has 1000 or more dependent members. In such cases, where there are a lot of interdependent members, we can use another approach wherein the conversions happen in the BSO cube but ASO is used for reporting and aggregation. The basic idea behind this approach is provided below.

The above approach works only if you have data coming in at the lowest level in your BSO cube. If there is any data at the top levels (either as a user input through the Excel Add-in or through the data load), those values would be ignored during the import into ASO. To understand this better, we shall use the same ASO & BSO outlines which we built out in the articles here and here. To recap, our BSO outline is the same as the outline in the Sample->Interntl database. The only addition is the new member called “Rate Product” in the Product dimension to accommodate rate loads.

The assumption is that we would be loading local currency data only at the lowest level. We would be using the same rule files that we built in the blog entries above to load the data and rates. To give a perspective on the input data and rates, their corresponding screenshots are provided here.

Now use the below calculation script to convert the local currency to USD.

SET CACHE HIGH;
SET CALCPARALLEL 2;

/* Convert local currency to USD for Sales at level-0 markets;
   the rates are stored against the "Rate Product" member */
FIX("Sales", @LEVMBRS("Market", 0))
    "Actual" = "Act" / "Act"->"Rate Product";
    "Budget" = "Bud" / "Bud"->"Rate Product";
    "Actual @ Bud XChg" = "Act" / "Bud"->"Rate Product";
ENDFIX;

The above calculation script converts the local currency to USD, and the USD numbers are stored under the Actual, Budget and Actual @ Bud XChg members. Once this is done, the next step is to take a level0 export of the BSO cube using the below MaxL script.

export database 'Sample'.'Interntl' level0 data to data_file 'D:\BSOExport.exp';

The above step completes the first phase of the currency conversion. In the second phase, we have to prepare the ASO outline in such a way that the import of data from the BSO cube happens correctly. We shall start with the same ASO outline that we migrated from BSO in approach 1. There are a couple of changes needed on this ASO outline. They are

1. Remove the formulas from the Actual, Budget and Actual @ Bud XChg members.
2. Convert the dynamic hierarchies in the Scenario dimension to stored hierarchies.

The converted ASO outline is shown below

Once the ASO outline is ready, the next step is to import the exported level0 BSO data. This is achieved through the MaxL script shown below.

import database 'CurrASO'.'CurrASO' data from data_file 'D:\BSOExport.exp' on error abort;

Once the data is loaded into the ASO cube, aggregations need to be performed; a sample MaxL statement for this is given below. Once the aggregations are done, let's verify the results in HVE.
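
As a sketch, a default aggregation can be triggered through MaxL along the following lines; the stopping condition and the size ratio are illustrative and should be tuned per your requirements:

execute aggregate process on database 'CurrASO'.'CurrASO' stopping when total_size exceeds 1.5;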

The entire process described above can be automated through a combination of batch (or shell scripts in Unix) and MaxL scripts, as sketched below. Of course, the scripts given above are not final and binding, as there can be multiple variations depending on the requirements. But this should give an idea of how to go about doing currency conversions in ASO as well as BSO cubes.
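
For illustration, a minimal MaxL script tying the steps together (run via essmsh from a batch or shell script) might look like the following; the credentials, the server name and the calculation script name CurrConv are assumptions:

login 'admin' 'password' on 'localhost';
/* Phase 1: convert currency in BSO and export level0 data */
execute calculation 'Sample'.'Interntl'.'CurrConv';
export database 'Sample'.'Interntl' level0 data to data_file 'D:\BSOExport.exp';
/* Phase 2: load into ASO and aggregate */
import database 'CurrASO'.'CurrASO' data from data_file 'D:\BSOExport.exp' on error abort;
execute aggregate process on database 'CurrASO'.'CurrASO' stopping when total_size exceeds 1.5;
logout;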

Posted in All Posts, EPM, Hyperion Essbase | Leave a Comment »

Oracle Data Integrator 10.1.3.5 and Oracle OLAP – RKM and IKM – Using Oracle OLAP as a Source – Part 1

Posted by Venkatakrishnan J on January 4, 2009

As you are probably aware, the latest version of ODI, 10.1.3.5 (and also the prior release), supports a new set of KMs which can extract from and load data into Oracle OLAP cubes. We shall try to understand how this connectivity works in this release. There are 2 aspects to this connectivity. They are

1. Using Oracle OLAP as a data source – This is supported by a Reverse-engineering Knowledge Module called “RKM Oracle Olap (Jython)”.
2. Incremental data load into Oracle OLAP – This is supported by an Integration Knowledge Module called “IKM Oracle AW Incremental Update”.

Both the above KMs ship with the default install of ODI, but they are not imported automatically; one would have to manually import them from the {ODI_HOME}\oracledi\impexp directory. We shall be using the Global Analytic Workspace for all our testing. This Global AW contains 4 dimensions

1. Channels Dimension – Containing one normal Level Based Hierarchy
2. Customer Dimension – Containing one normal Level Based Hierarchy
3. Product Dimension – Containing a value based hierarchy
4. Time Dimension – Containing the normal level based fiscal time hierarchy

Also, this AW contains 2 cubes, each having one measure. The below screenshots should give you an understanding of how my AW is structured.

To use this AW as our data source in ODI, we need to create the physical connection in the Topology Manager of ODI.

Once the physical connection is created, let's create a logical connection to denote the usage of this connection as a source.

Also, tie the physical and logical connections together by means of the Global context as shown above. This ensures that the logical connection will use the physical connection defined above. Once this is done, let us log in to Designer and import all the out of the box KMs provided by ODI.

As a next step, let's create a model and import this AW using the RKM imported above. While creating the model, we have to use Oracle as the Technology and then assign the logical connection that was created above.

And in the Reverse tab, choose the context and use the customized option as shown below.

Ensure that you choose the RKM properly in the dropdown. Also, enter all the Analytic Workspace related connection details as shown above. Now, for the reversing operation to work, there are 2 jar files that are needed. They are

1. olap_api.jar
2. awxml.jar

These 2 files need to be copied from the Oracle database home, i.e. copy the 2 files from {ORACLE_HOME}\olap\api\lib to {ODI_HOME}\oracledi\drivers (a sample copy command is shown below). The documentation states that awxml.jar alone is sufficient, but I was not able to get the reverse functionality to work with just that single jar; only after I copied the olap_api.jar file over to the drivers directory did the reverse start to work. Once the files are copied over, a restart of the Designer is required. After the Designer restart, just click on Reverse from the model that we created. The reversing operation is itself treated as an execution and can be tracked using the Operator.
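
For reference, on Windows the copy would look something like this (assuming ORACLE_HOME and ODI_HOME point to your actual homes):

copy "%ORACLE_HOME%\olap\api\lib\awxml.jar" "%ODI_HOME%\oracledi\drivers\"
copy "%ORACLE_HOME%\olap\api\lib\olap_api.jar" "%ODI_HOME%\oracledi\drivers\"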

Once the reverse operation is successful, we should have the imported objects in the model as shown below.

If you observe closely, what we have obtained is not actually the OLAP object itself. Instead, the RKM has used the XML API (via awxml.jar) of the Global AW to reverse engineer all the dimension and fact tables from the mappings defined in the AW. If you look, we have obtained 3 dimension tables and 2 fact tables. One dimension table is missing: the value based Product dimension. I believe the current version of the RKM does not support reverse engineering value based hierarchies. So, the question now is: if we incorporate these imported dimension/fact tables into our interface, will we be extracting data out of the cube or from the fact tables? Unfortunately, I believe the current version does not support direct extract of data from OLAP using these 2 KMs (if data is required directly from the cubes, one would have to create custom SQL views on top of the AW and then use them in the interface). To test this out, let's create a sample dummy fact table with no rows as shown below.

CREATE TABLE UNITS_FACT_TEST AS SELECT * FROM UNITS_FACT WHERE 1=0;

We shall be using the above table as our target in the interface. For the source, we shall use the imported UNITS_FACT table. The screenshots of the interface are shown below.

As you can see, we are treating this as a normal Oracle table in our interface. So, when we execute this interface, the data extract happens only from the underlying fact tables instead of the cube AGGMAPs. I was expecting a slightly more advanced form of the KMs that supports Oracle OLAP natively. But considering the fact that the cube level data can be exposed through SQL views, this is not much of a hindrance, as it is possible to use them to extract data directly from the cube; a rough sketch of such a view is given below. Nevertheless, it would be interesting to see what the Incremental Update IKM does. My hunch is that it will support data load into the fact tables and in turn will kick-start the cube aggregation. We shall get into more details on the AW data load using ODI in a future blog entry.
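
To illustrate the SQL view workaround, such a view can be built with the OLAP_TABLE table function. This is only a rough sketch: the measure, dimension and parent-relation names in the limit map below are hypothetical placeholders and must be replaced with the actual objects inside the GLOBAL AW.

-- Rough sketch only: UNITS_CUBE_UNITS, CHANNEL, TIME and the *_PARENTREL
-- names are hypothetical; substitute the real objects from your AW.
CREATE OR REPLACE VIEW units_cube_v AS
SELECT *
  FROM TABLE(OLAP_TABLE(
         'GLOBAL.GLOBAL DURATION SESSION',  -- owner.aw_name
         '', '',
         'MEASURE units AS NUMBER FROM units_cube_units
          DIMENSION channel AS VARCHAR2(60) FROM channel WITH HIERARCHY channel_parentrel
          DIMENSION time AS VARCHAR2(60) FROM time WITH HIERARCHY time_parentrel'));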

Posted in All Posts, OLAP, Oracle Data Integrator | 2 Comments »