Business Intelligence – Oracle

Archive for January, 2009

Oracle Data Integrator Enterprise Edition – New bundling – ODI and OWB merged

Posted by Venkatakrishnan J on January 31, 2009

Well, this news has been floating around for some time and now it has been officially announced: Oracle Warehouse Builder and Oracle Data Integrator have been merged into a single product called Oracle Data Integrator Enterprise Edition. The primary impact is from a product pricing/purchasing standpoint. Going forward it is not possible to purchase OWB (Enterprise ETL) and ODI separately (I believe this is the case, though I am not completely sure). The official details are available below

1. Oracle Data Integrator Enterprise Edition

2. Oracle Data Integrator – Statement of Direction

So, now with a single product (from a licensing standpoint) it is possible to deploy both OWB and ODI, which gives customers more flexibility in their deployments. But effectively both components will still have their separate releases (according to the SOD above); the actual product merge will happen only after one more individual release of each product. More details are available in the statement of direction.

Posted in All Posts, Oracle Data Integrator, OWB | 1 Comment »

Oracle BI EE 10.1.3.4 – Integration between BI EE & Hyperion Financial Reporting(HFR) – Drilling from HFR to BI EE by passing parameters – Part 2

Posted by Venkatakrishnan J on January 30, 2009

In my earlier blog entry here, I showed how to drill from BI EE to HFR by passing parameters from BI EE to HFR. I covered that integration first since it is a bit easier than what we will be doing today. Today, our aim is to drill from HFR to BI EE by passing parameters from HFR to BI EE. We shall use the same reports that we used in the last blog entry. The HFR and BI EE reports over which we will be setting up the drills are shown in the screenshots below.


So, our aim is that when we drill from HFR to BI EE, the target BI EE report should automatically get the "Feb" filter, since we are looking at Feb data in HFR. The major challenge in getting this integration to work is that HFR does not allow direct HTML-formatted dynamic links in the report layout; one can only pass dynamic parameters. For HFR, the URL parameters look like the one shown below

http://localhost:19000/workspace/index.jsp?module=tools.relatedcontent&repository_path=/Drill%20Through%20Report&elementType=2&repository_name=Drill%20Through%20Report&repository_format_id=html&run=1&sso_token=$SSO_TOKEN$&attribute=Product.id.Product&attribute=Year.id.Qtr2&attribute=Measures.id.Measures&attribute=Market.id.Market&attribute=Scenario.id.Scenario

For BI EE the URL parameters(GO URL) would look as shown below

http://localhost:9704/analytics/saw.dll?Go&Path=/shared/Paint%20Demo/HFR%20BI%20EE%20Integration/Drill%20Report&Action=Navigate&P0=1&P1=eq&P2=TIMES.MONTH_NAME&P3=Feb
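The Go URL filter parameters follow a fixed pattern: P0 carries the number of filters, and each filter is an operator/column/value triple in P1/P2/P3 (then P4/P5/P6, and so on). As a rough sketch of how such a URL can be assembled programmatically (the class and method names below are mine, purely illustrative):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

// Illustrative builder for a BI EE Go URL with a single filter.
// The P0/P1/P2/P3 pattern matches the Go URL shown above.
public class GoUrlBuilder {

    private static String enc(String s) {
        try {
            return URLEncoder.encode(s, "UTF-8");
        } catch (UnsupportedEncodingException e) {
            throw new RuntimeException(e); // UTF-8 is always available
        }
    }

    // P0 = number of filters; P1 = operator, P2 = column, P3 = value.
    public static String filterBlock(String operator, String column, String value) {
        return "&P0=1&P1=" + enc(operator) + "&P2=" + enc(column) + "&P3=" + enc(value);
    }

    public static String goUrl(String server, String catalogPath,
                               String operator, String column, String value) {
        return server + "/analytics/saw.dll?Go&Path=" + enc(catalogPath)
             + "&Action=Navigate" + filterBlock(operator, column, value);
    }
}
```

Note that URLEncoder encodes spaces as + rather than the %20 seen in the URLs above; both are standard query-string encodings.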

As you can see, the two sets of parameters are very different. In the previous case we could hardcode the other parts of the workspace URL within the report itself using HTML formatters, but this is not available in HFR. So, in order to achieve the integration, we need an external JSP page which accepts the CONTEXT parameters from HFR and dynamically generates a BI EE URL. Once the URL is generated, the JSP page automatically does the redirection. The integration can be summarized as shown below


So, let's start by creating the JSP page in JDeveloper. Below is a sample JSP that I created for passing the month from HFR to the BI EE report.

<%@ page contentType="text/html;charset=windows-1252"%>
<%@ page import="java.util.Map" %>
<%@ page import="java.util.Iterator" %>
<%
    // Base BI EE Go URL; the month extracted from the HFR CONTEXT
    // parameter is appended as the P3 filter value.
    String BIEEGoURL = "http://localhost:9704/analytics/saw.dll?Go&nquser=Administrator&nqpassword=Administrator&Path=/shared/Paint%20Demo/HFR%20BI%20EE%20Integration/Drill%20Report&Action=Navigate&P0=1&P1=eq&P2=TIMES.MONTH_NAME&P3=";

    // HFR passes each POV attribute as attribute=<dimension>.id.<member>.
    // Find the Year attribute, take the member name after "Year.id."
    // (8 characters) and redirect to the completed Go URL.
    Map map = request.getParameterMap();
    Iterator iter = map.entrySet().iterator();
    while (iter.hasNext()) {
        Map.Entry n = (Map.Entry) iter.next();
        String[] values = (String[]) n.getValue();
        for (int i = 0; i < values.length; i++) {
            if (values[i].indexOf("Year.id.") == 0) {
                BIEEGoURL = BIEEGoURL + values[i].substring(8);
                response.sendRedirect(BIEEGoURL);
                return;
            }
        }
    }
%>


The JSP above does the following:

1. Accepts the CONTEXT parameters from HFR, which are of the form &attribute=<dimension name>.id.<dimension value>
2. Parses the CONTEXT parameters and extracts the dimension value for the Year dimension.
3. Appends the extracted dimension value to a string which is a BI EE Go URL.
4. Once the Go URL is completely generated, redirects the page to the BI EE Go URL.
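Steps 1 and 2 above can be sketched as a small standalone helper, assuming the attribute values arrive exactly in the <dimension>.id.<member> form (the class and method names are mine, for illustration only):

```java
// Sketch of the HFR CONTEXT attribute parsing done by the JSP:
// each value has the form <dimension>.id.<member>.
public class ContextParser {

    // Returns the member for the given dimension, or null if absent.
    public static String memberFor(String dimension, String[] attributeValues) {
        String prefix = dimension + ".id.";
        for (String v : attributeValues) {
            if (v.startsWith(prefix)) {
                return v.substring(prefix.length());
            }
        }
        return null;
    }
}
```

Using the prefix length instead of a hardcoded 8 keeps the same logic working for any dimension name, not just Year.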

Now, copy the above JSP to the {OracleBI}\oc4j_bi\j2ee\home\applications\analytics\analytics directory so that we can access it over the OC4J BI EE URL (the JSP can in fact be deployed on any Java app server). Test the JSP using the below URL

http://localhost:9704/analytics/HFR%20-%20BIEE%20Integration.jsp?&attribute=Product.id.Product&attribute=Year.id.Feb&attribute=Measures.id.Measures&attribute=Market.id.Market&attribute=Scenario.id.Scenario

Now, the above URL should automatically filter our target BI EE report for the month of Feb (it is assumed that the BI EE report has the proper 'is prompted' filters).


Once the above check is completed, the next step is to include the below link in the related content link in HFR.

http://localhost:9704/analytics/HFR%20-%20BIEE%20Integration.jsp?$CONTEXT$


Now, once you click on an HFR measure, you will notice that the filters get passed from the HFR report to the target BI EE report.

Posted in All Posts, EPM, Hyperion Financial Reporting, Hyperion Workspace, OBI EE Plus | 2 Comments »

Oracle BI EE 10.1.3.4 – Integration between BI EE & Hyperion Financial Reporting(HFR) – Drilling from BI EE to HFR by passing parameters – Part 1

Posted by Venkatakrishnan J on January 28, 2009

In the prior blog entry, I gave an overview of how to go about integrating any two reporting toolsets. As a follow-up to that, through a series of articles I will show the integration between the following toolsets

1. BI EE and Hyperion Financial Reporting
2. Hyperion Financial Reporting and Hyperion Web Analysis
3. BI EE and Hyperion Web Analysis

Today we shall see how to integrate BI EE and HFR. The integration between BI EE 10.1.3.4 and HFR 11.1.1 is seamless in terms of security, as SSO is now available between these 2 products. So, our aim today is to drill from a BI EE report to an HFR report by passing parameters from BI EE to HFR. So, let's start with a simple HFR report as shown below.

As you see, this is a very simple report showing the Total Sales for a Year and the corresponding products. Now, assume that the sales transaction details are reported through BI EE. For example, consider the BI EE report shown below.

Now, our aim is to enable drill on the Month column of the BI EE report, i.e. when anyone clicks on, say, the month of January, the HFR report should also show the data for January. In order to achieve this, we need to understand how HFR accepts parameters in the URL. To find out, go to any data cell in HFR and click on Related Content.

In the related content, just include the same report as the related content report.

Then click on properties. You would get the below URL.

http://localhost:19000/workspace/index.jsp?module=tools.relatedcontent&repository_path=/Drill%20Through%20Report&elementType=2&repository_name=Drill%20Through%20Report&repository_format_id=html&run=1&sso_token=$SSO_TOKEN$&$CONTEXT$&rcp_version=$RCP_VERSION$&rcp_alias=$RCP_ALIAS$&mimetype=application/hyperion-reports-report

As you can see, the above URL contains certain dollar-delimited names that pass values from the base report to the target report. In this case, $CONTEXT$ is the keyword which passes the actual POV parameter values to the target report. In order to see the actual value of the CONTEXT parameter, let's go back to Related Content and enter the below link instead of the above report

http://localhost?$CONTEXT$

Now, let's click on the URL from the HFR report and see what happens to the target URL.

http://localhost/?attribute=Product.id.Product&attribute=Year.id.Year&attribute=Market.id.Market&attribute=Accounts.id.Accounts&attribute=Scenario.id.Scenario

As you see, all the POV values are passed to the target report in the form of URL parameters. The parameter is of the form

attribute=Market.id.Market

i.e. we can pass dynamic parameters to an HFR report by passing the parameters in the form shown below

attribute=<dimension>.id.<dimension value>

Now, let's test this out using the below URL.

http://localhost:19000/workspace/index.jsp?module=tools.relatedcontent&repository_path=/Drill%20Through%20Report&elementType=2&repository_name=Drill%20Through%20Report&repository_format_id=html&attribute=Product.id.Product&attribute=Year.id.Jan&attribute=Market.id.Market&attribute=Accounts.id.Accounts&attribute=Scenario.id.Scenario

As you see, in the above URL we are passing Jan as a POV to the Year dimension.
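Composing such a drill URL for an arbitrary member can be sketched as below (class and method names are mine; the base URL would be the workspace related-content URL shown earlier):

```java
// Sketch of composing an HFR drill URL from POV attributes of the
// form attribute=<dimension>.id.<member>. Names are illustrative.
public class HfrUrlBuilder {

    public static String attribute(String dimension, String member) {
        return "attribute=" + dimension + ".id." + member;
    }

    // Appends one attribute per POV pair {dimension, member} to the
    // base workspace related-content URL.
    public static String withPov(String baseUrl, String[][] pov) {
        StringBuilder sb = new StringBuilder(baseUrl);
        for (String[] p : pov) {
            sb.append('&').append(attribute(p[0], p[1]));
        }
        return sb.toString();
    }
}
```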

Now, once we are sure that the parameters are getting passed to the HFR report, let's go back to the BI EE report that we created earlier. In the column formula, enter the below formula

'<a href="http://localhost:19000/workspace/index.jsp?module=tools.relatedcontent&repository_path=/Drill%20Through%20Report&elementType=2&repository_name=Drill%20Through%20Report&repository_format_id=html&attribute=Product.id.Product&attribute=Year.id.'||LEFT(TIMES.FISCAL_MONTH_NAME, 3)||'&attribute=Market.id.Market&attribute=Accounts.id.Accounts&attribute=Scenario.id.Scenario">'||TIMES.FISCAL_MONTH_NAME||'</a>'

Now, change this column's data format to HTML. We get the below report with the drills enabled for all the months.

Now, if we drill on any month, it takes us to the HFR report, and you should see the HFR report for the corresponding month.

The above uses the first approach that I blogged about here. Next, let's look at how we can pass dynamic parameters from HFR to BI EE.

Posted in All Posts, EPM, Hyperion Financial Reporting, OBI EE Plus | 7 Comments »

Integration of all Oracle and Non-Oracle Reporting Toolsets – URL Parameters – An introduction

Posted by Venkatakrishnan J on January 26, 2009

Last week I was interacting with a customer who had 2 different reporting toolsets across 2 different departments (they had 2 HR departments, as the company had recently gone through a merger). Both departments had a good data warehouse, one residing on Oracle and the other on SQL Server. Due to the current global economic climate, and for operational efficiency, they wanted to merge both departments into a single one (the departments had been functioning separately due to the recent acquisition). The client was also under very stringent budgetary constraints and hence was not in a position to spend more on consolidating the warehouses. Hence, the customer wanted to take a two-phased approach to the integration.

1. Achieve management level reporting by integrating the reporting toolsets of both the departments.
2. Achieve actual data consolidation by merging both the data warehouses once budget becomes available in 2010.

Though they would have preferred the 2nd approach, as that would meet their long-term goal, the budgetary constraints meant they could not proceed with it. They wanted to know whether it was possible to integrate their existing reporting toolsets to meet their immediate reporting requirements. Though this was a unique request, I wanted to understand what exactly they meant by integration, as there could be multiple possibilities and approaches. Their scenario was that they had 2 different reporting toolsets, both providing more or less the same set of reports. As part of their consolidation effort, they wanted to merge both reporting tools into one, or at least provide a seamless drill-down from one tool to the other. For example, one of their key reports was Employee Count breakdown by Year. Since they did not have a consolidated warehouse, they were fine with having 2 reports with the same employee breakdown by year, one for each department. But they wanted the ability to drill down from one report to the other. Basically, what they wanted was the capability to pass parameters from one report to another, and the major challenge was that the reports were built on 2 different toolsets.

Though the requirements listed above are pretty reasonable, one of the reasons I am blogging about this is to give an overview of how such integrations can be achieved. More importantly, the effort required is usually on the lower side, considering that significant investments have already been made in the toolsets. This blog entry gives a brief overview of how integrations like the one above can generally be achieved; we will go into the details in the coming articles.

In general, all reporting tools that are web based accept parameters in 2 forms.

1. Through the URL
2. Through backend javascript post method parameters
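The two styles above can be sketched generically in Java; the host, endpoint, and parameter names here are placeholders, not from any specific tool:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Generic sketch of the two parameter-passing styles: GET (parameters
// in the URL query string) vs POST (parameters in the request body).
public class ParamStyles {

    // GET style: everything rides in the URL itself.
    public static URL getStyle() {
        try {
            return new URL("http://localhost:9704/analytics/saw.dll?Go&P0=1&P1=eq&P2=COL&P3=Val");
        } catch (MalformedURLException e) {
            throw new RuntimeException(e);
        }
    }

    // POST style: parameters travel in a form-encoded request body.
    public static void postStyle(String endpoint, String formBody) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        try (OutputStream os = conn.getOutputStream()) {
            os.write(formBody.getBytes(StandardCharsets.UTF_8));
        }
        conn.getResponseCode(); // fire the request
    }
}
```

The POST style is what login forms (the 2nd method) typically use, which is why tools that block credentials in the URL often still accept them via POST.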

Parameters are nothing but certain report/tool-specific values which apply filters, set configuration settings, etc. through the URL itself. Below I list some of the major vendors/reporting toolsets that support URL parameters

1. Oracle BI EE – BI EE has excellent parameter support through the URL. There are quite a few examples that I have covered in this blog before. Supports use of username/password in the URL itself.
2. BI Publisher – BI Publisher has good URL parameter support. One issue is that one cannot pass username/passwords through the URL, as BIP does not support it as of now.
3. Discoverer – Both Discoverer Plus and Viewer support URL parameters. The 9i release of Discoverer supported passing usernames and passwords through the URL; in 10g this was removed for security reasons, but the POST method of login (the 2nd method above) is supported.
4. Hyperion Financial Reporting – Has excellent support for URL parameters. Passing of username/password through the URL is not supported.
5. Hyperion Web Analysis – Has excellent support for URL parameters. Passing of username/password through the URL is not supported.
6. Microstrategy – Supports an extensive API for parameter calls through the URL.
7. Cognos – Supports parameter passing through the URL.
8. Business Objects – I could not find a URL API for BOBJ. Not sure whether it is supported (I believe it should be), but I am pretty sure the 2nd method above is supported.

As you see, almost all the tools above support passing parameters through the URL. What varies is the extensiveness of this API across toolsets and across releases of each toolset. For example, almost all features of BI EE are exposed through the Go URL and Dashboard URL, but in BI Publisher or Discoverer not every piece of functionality is exposed through the URL. So, to do the integration, it is necessary to understand the toolsets involved in detail. The integration I will be covering does not involve SSO across the products; achieving SSO would require extensive coding effort, as it is necessary to understand how each tool handles security.

Integration of reporting toolsets can be achieved by 2 methods. They are shown below


1. The first approach assumes that the source reporting tool allows dynamic URLs to be built within a report itself. For example, in BI EE one can pass dynamic values from a report column to any other URL.
2. The second approach assumes that the source reporting tool can only pass parameters dynamically; the tool itself cannot build dynamic URLs. For example, Hyperion Financial Reporting and Web Analysis do not allow building dynamic URLs. In such cases, the parameters are passed to a custom JSP/ASP page which in turn generates the target URL and redirects accordingly.

This should give you an idea of how the integration works. In the coming blog entries, I will elucidate this with examples of integration between HFR & BI EE, Web Analysis & BI EE, HFR & Web Analysis, etc.

Posted in All Posts, EPM, Hyperion Financial Reporting, OBI EE Plus | 3 Comments »

Hyperion Business Rules 9.3.1 & Essbase – An introduction

Posted by Venkatakrishnan J on January 22, 2009

In one of the recent Essbase implementations that I did for a customer, there was a need to kick off Essbase data loads, aggregations, etc. through a web-based console. Since the customer had Oracle E-Business Suite and was most familiar with the concurrent manager, I created a set of concurrent programs to submit the various Essbase data loads (through SSH, MaxL and shell scripts). Essentially, end users could kick off an Essbase load process by submitting a concurrent request. But one of the questions the customer asked was whether the Hyperion product stack itself had anything similar. That is when I explained to the customer that Hyperion Business Rules does more or less the same thing. The advantage of this component is that it provides a good graphical interface (built on the Java API) to design different business rules. For example, a typical requirement is to run various financial calculations at the end of a quarter to ensure that all the numbers are correct, so business users need the capability to dynamically pass month values from the front end. Hyperion Business Rules was built to address these kinds of requirements. One shortcoming I found, though, was the lack of a capability to run data load/dimension build rules. Right now it has a sophisticated UI for defining business rules that can do calculations and allocations on Essbase as well as Planning.

Hyperion Business Rules, or HBR, comes along with the Essbase Administration Services installation. It was a separate product before, but it has now been clubbed into EAS.


There are 4 main options in HBR. They are

1. Rules – A rule typically runs a specific calculation/allocation on a single Essbase database. It can be extended quite a bit to include advanced calculation scripts.
2. Sequences – A sequence orders a list of rules based on business requirements; it provides an execution order for the rules.
3. Macros – Macros are like mini rules which can be reused across multiple rules.
4. Global Variables – Variables whose values are shared globally across rules.

Since HBR comes along with EAS, its security is derived from Shared Services. Rule/sequence execution and validation rights for each user can be controlled through the Shared Services console. HBR is relevant only to BSO cubes, as ASO cubes do not support calculation scripts. To start with, let's create a simple rule on the Demo->Basic cube. The first step in creating a rule is to choose the outline.


Now, right-click in the open area of the Graphical tab and choose either Actions or Functions. Actions provide the capability to run calculation scripts, so let's choose Action and then Aggregate Data.


The HBR UI provides the capability to calculate the entire database or portions of it. For now, let's choose the entire database.


There are also calculation optimization options like Aggregate Missing, the Calc Cache setting, etc., though not every setting is included. In order to run more complex scripts, just go to the Source tab and edit accordingly.


Then choose a location (the outline itself) and then assign privileges for users who can execute this rule.


Variables can be added if needed so that the calculations can be customized based on user input. Save and validate the above rule. These rules can be launched from a web browser; the default URL for HBR is provided below

http://hostname:10080/hbrlauncher/


So, if you are doing an Essbase implementation and have requirements to provide end users with the capability to run complex allocations/business rules from a thin client, HBR would probably be the best fit. Typically it is used more with Hyperion Planning.

Posted in All Posts, EPM, Hyperion Business Rules, Hyperion Essbase | 4 Comments »

Oracle BI EE 10.1.3.4 & Essbase Connectivity – Conforming Dimensions and Federation – Drill through to a Relational Database

Posted by Venkatakrishnan J on January 21, 2009

I was helping a customer this week to set up drill-through from an Essbase BI EE report to a relational database. Though it is straightforward to set up, I thought it was worth blogging about, as there are multiple variations of the integration between an Essbase data source and a relational data source. There are basically 2 types of integration possible as of now.

1. Creating a report containing a relational measure (for example forecast) and an Essbase measure (for example sales) through a conforming dimension
2. Creating a report wherein drills are enabled(on a conforming dimension) from an Essbase data source to drill down to a relational data source

To set up the above 2 integrations, there are certain prerequisites. I list them below

1. There should be a conforming dimension between the Essbase cube and the relational database. For example, if Essbase has the Sales measure and an Oracle DB has the Forecast measure, then to establish integration between the 2 we need a common dimension having the same values in all the levels, or at least in one of the levels.
2. The conforming dimension in Essbase should not be ragged in nature; in such cases the drill-through would not work. The main reason for this is that BI EE converts each level into a specific generation.

Let's understand this further through an example. We shall be using the Demo->Basic cube in Essbase. This cube stores Actual, Budget and various other measures across multiple dimensions at a monthly level, as shown below.


Assume that we have a database table containing Monthly Week level data as shown below.


Now, a typical requirement is to have a report at the month level and drill down to the week level. It is generally desirable to add a drill-through from the parent Essbase report to the relational database so that analysis at a weekly level is possible. To achieve this, let's first start by importing the Demo->Basic cube into the repository. Also, import the database table and create an alias for it; our intention is to use the same table for the dimension as well as the fact. The table and the alias will be joined at the week level.


In the list of prerequisites above, I stated that we need a conforming dimension with at least one level of values matching exactly between the Essbase outline and the relational database. In our case, the conforming dimension is the Year dimension at the Month level. As a first step, we need to establish federation at the dimension level by including the TIME_DRILL table in the sources of the Year dimension. Also, change the logical key to the Week column instead of all 3 generations from Essbase.


Also, create another level below the Gen3, Year level.


Ensure that the Gen3, Year and Week columns are properly mapped in the LTS of the TIME_DRILL source. Also, change the level of this source to Gen4, Year.


This will ensure the linkage between the Week and Gen3, Year columns. Include the alias table of TIME_DRILL as a separate fact table in the BMM.


Create a logical join between TIME_DRILL_FACT and the Year dimension through the Week column, and set the content level of this LTS to the Gen4, Year level.


Now, create 2 reports as shown below. One report is on Gen3, Year and the Amount measure from Essbase; the other is on the WEEK column and the Amount Sold measure from the database. The latter is the target drill-through report, with the Gen3, Year column prompted. Enable the Navigate option on the Gen3, Year column of the parent report to navigate to the child report. This gives a seamless drill from the Essbase report to the relational report.


Posted in All Posts, EPM, Hyperion Essbase, OBI EE Plus | 2 Comments »

Oracle BI EE 10.1.3.4 – Delete using Writebacks – An alternative approach using EVALUATE

Posted by Venkatakrishnan J on January 15, 2009

I saw this question floating around in one of our internal forums: a user wanted to know whether it is possible to delete records using writebacks. Unfortunately, writebacks in BI EE support only inserts and updates (deletes can be made to work, but are not supported). In this blog entry I will walk through an approach where one can delete records using the EVALUATE function in BI EE. Be warned that this approach uses a potentially dangerous construct, PRAGMA AUTONOMOUS_TRANSACTION, which can lead to deadlocks if multiple users use it at the same time. But if this is something needed by an admin user and is a non-negotiable requirement, this approach can be used. To demonstrate it, we shall use the simple EMP table from the SCOTT schema. Our parent report is shown below. Our intention is to be able to delete any employee from the report and from the database.


So, each record should have a URL, clicking on which deletes that record from the database. To do this, let's first create a database function as shown below

create or replace FUNCTION WRITEBACK_DELETE_EMP(P_ENAME IN VARCHAR2, TO_DELETE IN VARCHAR2)
RETURN VARCHAR2
IS
  -- Run the delete in its own transaction so the calling (BI EE)
  -- query transaction is not affected.
  PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
  IF TO_DELETE = 'Yes' THEN
    DELETE FROM EMP WHERE ENAME = P_ENAME;
    COMMIT;
    RETURN 'Success';
  ELSE
    RETURN 'No Delete';
  END IF;
EXCEPTION
  WHEN OTHERS THEN
    RETURN 'Failure';
END;

As you see, the function accepts 2 parameters: one is the employee name and the other is a boolean-like indicator to ensure that we do not delete records while merely rendering the report. To enable deletion, let's create another report which displays the employee name and the return flag from the above function. While creating this report, ensure that you pass only 'No' as the second parameter; else it will delete all the records.


EVALUATE('WRITEBACK_DELETE_EMP(%1,''No'')',EMP.ENAME)

This report should also have a filter on the employee name set to 'is prompted'.


Now, create a narrative view on the report as shown below


Ensure that your compound layout contains only this narrative view. Also, before saving the report, change the EVALUATE function to the one shown below

EVALUATE('WRITEBACK_DELETE_EMP(%1,''Yes'')',EMP.ENAME)

Once this is done, go back to the parent report, and in the delete column just add a navigate link to the child report that we created above.


Now, if you click on the delete link, this would automatically delete that particular record from the database.


We still get the Success message along with that particular record (typically, after a delete we should not even see the record in the child report) because of the AUTONOMOUS_TRANSACTION: the delete runs and commits in its own transaction, so the parent query's transaction does not see the change and hence still shows the row. The other drawback is that one has to manually refresh the report. This can be overcome by using a Go URL instead of Navigate and adding a custom refresh after the click on the delete URL.
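Were one to go the Go URL route, the per-record delete link could be assembled roughly as below; the server, catalog path and column name here are placeholders, not from a real environment:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

// Sketch of a per-record delete link: a Go URL that opens the child
// (narrative) report filtered to one employee.
public class DeleteLinkBuilder {

    private static String enc(String s) {
        try {
            return URLEncoder.encode(s, "UTF-8");
        } catch (UnsupportedEncodingException e) {
            throw new RuntimeException(e); // UTF-8 is always available
        }
    }

    public static String deleteUrl(String ename) {
        return "http://localhost:9704/analytics/saw.dll?Go"
             + "&Path=" + enc("/shared/Writeback/Delete Child")
             + "&Action=Navigate"
             + "&P0=1&P1=eq&P2=" + enc("EMP.ENAME")
             + "&P3=" + enc(ename);
    }
}
```

Encoding the employee name matters here, since ENAME values could contain characters that are not URL-safe.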

Posted in All Posts, OBI EE Plus | 3 Comments »

Hyperion Essbase 11.1.1.0 – Using Java APIs – Part 2 – XMLA and Normal MDX Outputs

Posted by Venkatakrishnan J on January 13, 2009

Yesterday I gave an example of using the Java API to display the dimension members of an Essbase outline. Today, let's test another example which fires MDX queries at the Essbase server. One of the main reasons for me to try this example was to figure out how MDX is handled by Essbase and, in turn, how it is handled by BI EE. For this we will be testing 2 different scenarios, i.e. 2 different MDX queries

1. An MDX query which produces same results using Essbase specifications as well as XMLA specifications.
2. An MDX query which does not produce the same results using Essbase specifications and XMLA specifications.

First, let us try to understand the difference between the above 2 types of queries. As an example, we shall start with the MDX query given below

With
set [Evaluate0] as '{[Product].Generations(2).DIMENSION.MEMBERS }'
set [Evaluate1] as '{[Market].Generations(1).DIMENSION.MEMBERS }'
select { [Scenario].[Actual] } on columns,
NON EMPTY {crossjoin ({[Evaluate0]},{[Evaluate1]})}
properties ANCESTOR_NAMES, GEN_NUMBER on rows
from [Demo.Basic]


The above MDX query produces the same results as per the XMLA spec as well as the Essbase spec, because the generation standards for XMLA and Essbase are the same. But Essbase level specifications are slightly different from the XMLA specification: in the XMLA spec, Levels(0) denotes the dimension itself, whereas in Essbase, Levels(0) denotes the leaf nodes. This is the key differentiator which clearly tells us that BI EE actually uses the Essbase specification instead of the XMLA specification. Another example is provided below: an MDX query that produces different results under the XMLA spec and the Essbase spec.

SELECT
CrossJoin([Measures].CHILDREN , [Market].CHILDREN) on columns,
[Product].Levels(0).Members
DIMENSION PROPERTIES [Product].[Intro Date] on rows
from Sample.Basic


Now, as part of our JAPI testing, let's use the XMLA/Essbase-compliant MDX query. The Java class code to test this out is given below

package SampleBIEE;

import com.essbase.api.base.*;
import com.essbase.api.metadata.IEssMemberSelection;
import com.essbase.api.session.*;
//import com.essbase.api.datasource.*;
import com.essbase.api.dataquery.*;
//import com.essbase.api.metadata.*;
import com.essbase.api.domain.*;

public class MDXQueryJAPI {

private static String s_userName = “admin”;

private static String s_password = “password”;

private static String s_olapSvrName = “localhost”;

private static String s_provider = “http://localhost:13080/aps/JAPI&#8221;;

private static String s_appName = “Demo”;

private static String s_cubeName = “Basic”;

private static final int FAILURE_CODE = 1;

public static void main(String[] args) {
int statusCode = 0;
IEssbase ess = null;
IEssCubeView cv = null;
byte ch[] = new byte[2];

try {

acceptArgs(args);
// Create JAPI instance.
ess = IEssbase.Home.create(IEssbase.JAPI_VERSION);

// Sign On to the Provider
IEssDomain dom = ess.signOn(s_userName, s_password, false, null,
s_provider);

try {
cv = dom.openCubeView("Mdx Query Example", s_olapSvrName,
s_appName, s_cubeName);
performMdxQuery(cv);
} catch (Exception x) {
System.out.println("Error: " + x.getMessage());
x.printStackTrace();
statusCode = FAILURE_CODE;
} finally {
// Close cube view.
try {
if (cv != null)
cv.close();
} catch (EssException x) {
System.err.println("Error: " + x.getMessage());
}
}

if (ess != null && ess.isSignedOn() == true)
ess.signOff();
} catch (EssException x) {
System.err.println("Error: " + x.getMessage());
}

if (statusCode == FAILURE_CODE) System.exit(FAILURE_CODE);
}

private static void performMdxQuery(IEssCubeView cv) throws Exception {
boolean bDataLess = false;
boolean bNeedCellAttributes = false;
boolean bHideData = true;

String mdxquery = null;

mdxquery = "With " +
"set [Evaluate0] as '{[Product].Generations(2).DIMENSION.MEMBERS }' " +
"set [Evaluate1] as '{[Market].Generations(1).DIMENSION.MEMBERS }' " +
"select " +
"{ [Scenario].[Actual] } on columns, " +
"NON EMPTY {crossjoin ({[Evaluate0]},{[Evaluate1]})} properties ANCESTOR_NAMES, GEN_NUMBER on rows " +
"from " +
"[Demo.Basic] ";

IEssOpMdxQuery op = cv.createIEssOpMdxQuery();

op.setQuery(bDataLess, bHideData, mdxquery, bNeedCellAttributes,
IEssOpMdxQuery.EEssMemberIdentifierType.NAME);
op.setXMLAMode(false);

op.setNeedFormattedCellValue(true);
op.setNeedSmartlistName(true);
op.setNeedFormatString(false);
op.setNeedFormattedMissingCells(true);

op.setNeedMeaninglessCells(false);

cv.performOperation(op);

IEssMdDataSet mddata = cv.getMdDataSet();

printMdDataSet(mddata, op);
System.out
.println("\n------------------ MDX Result in Grid Format ----------------");
printMdDataSetInGridForm(mddata);
System.out
.println("------------------ ------------------------- ----------------");
}

private static void printMdDataSet(IEssMdDataSet mddata, IEssOpMdxQuery op) throws Exception {

IEssMdAxis[] axes = mddata.getAllAxes();

int nAxes = axes.length;

System.out.println("Number of axes: " + nAxes);

for (int axisInd = 0; axisInd < nAxes; axisInd++) {
if (axes[axisInd].isSlicerAxis())
System.out.print("Slicer ");
System.out.println("Axis " + axisInd + " Number Of Tuples : "
+ axes[axisInd].getTupleCount() + " Number Of Dimensions : "
+ axes[axisInd].getDimensionCount());
}

for (int axisInd = 0; axisInd < nAxes; axisInd++) {
IEssMdAxis axis = axes[axisInd];
int nTuples = axis.getTupleCount();

// Get all the dimensions and their info in this axis
System.out.println("\nGetting dimensions in axis : " + axisInd);
IEssMdMember[] dims = axis.getAllDimensions();
for (int axisDimInd = 0; axisDimInd < dims.length; axisDimInd++) {
IEssMdMember dim = dims[axisDimInd];
int propscnt = dim.getCountProperties();
System.out.println("Dim " + axisDimInd + " dim name : " + dim.getName()
+ " #props " + propscnt);
for (int propInd = 0; propInd < propscnt; propInd++) {
System.out.println("Property " + propInd + " Name : "
+ dim.getPropertyName(propInd) + ", Type : "
+ dim.getPropertyDataType(propInd));
}
}

System.out.println("\nGetting members in all the tuples of axis : " + axisInd);
for (int tupleInd = 0; tupleInd < nTuples; tupleInd++) {
System.out.println("\nTuple : " + tupleInd);
IEssMdMember[] mbrs = axis.getAllTupleMembers(tupleInd);
printMemberInfo(mbrs);
}
}

System.out.println("\n\n\nPrinting results through clusters .....");
for (int axisInd = 0; axisInd < nAxes; axisInd++) {
IEssMdAxis axis = axes[axisInd];

// Get all the clusters in this axis
IEssMdCluster[] clusters = axis.getAllClusters();
long nClusters = axis.getClusterCount();
System.out.println("\nAxis " + axisInd + " Num clusters " + nClusters);
for (int clusterInd = 0; clusterInd < nClusters; clusterInd++) {
IEssMdCluster cluster = clusters[clusterInd];
int clusterTupleCount = cluster.getTupleCount();

// Get the members based on the dimension index
System.out.println("\nCluster " + clusterInd + " Size " + clusterTupleCount);
for (int dimInd = 0; dimInd < cluster.getDimensionCount(); dimInd++) {
IEssMdMember[] mbrs = cluster.getAllDimensionMembers(dimInd);
System.out.println("Cluster Dim " + dimInd + " Number Of Members : " + mbrs.length);
}

// Get the members based on tuple index
System.out.println("\nGetting members in all the tuples of this cluster...");
for (int tupleInd = 0; tupleInd < clusterTupleCount; tupleInd++) {
System.out.println("\nTuple : " + tupleInd);
printMemberInfo(cluster.getAllTupleMembers(tupleInd));
}
}
}

System.out.println("\n\n\nGetting all the cells...");
System.out.println("Number of cells : " + mddata.getCellCount());
for (int ulCellOrdinal = 0; ulCellOrdinal < mddata.getCellCount(); ulCellOrdinal++) {
if (!op.isNeedFormattedMissingCells() && mddata.isMissingCell(ulCellOrdinal)) {
System.out.println("Cell[" + ulCellOrdinal + "] = #Missing");
} else if (mddata.isNoAccessCell(ulCellOrdinal)) {
System.out.println("Cell[" + ulCellOrdinal + "] = #NoAccess");
} else {
printCellValue(mddata, ulCellOrdinal, op);
}
}
}

private static void printCellValue(IEssMdDataSet mddata, int cellOrdinal,
IEssOpMdxQuery op) throws EssException {
String actualVal = (mddata.isMissingCell(cellOrdinal)) ? "#Missing"
: "" + mddata.getCellValue(cellOrdinal);
String fmtdCellTxtVal = mddata.getFormattedValue(cellOrdinal);
int cellType = mddata.getCellType(cellOrdinal);
if (fmtdCellTxtVal != null && fmtdCellTxtVal.length() > 0) {
String printSmlistName = (cellType == IEssMdDataSet.CELLTYPE_SMARTLIST)
? (", SmartlistName = " + mddata.getSmartListName(cellOrdinal))
: "";
String printNumericValue = (!mddata.isMissingCell(cellOrdinal))
? (", NumericValue = " + actualVal)
: "";
System.out.print("Cell[" + cellOrdinal + "] = " + fmtdCellTxtVal
+ " {CellType = " + cellType + printNumericValue
+ printSmlistName
+ "}");
} else {
System.out.print("Cell[" + cellOrdinal + "] = " + actualVal);
}

if (op.isNeedFormatString()) {
if (cellType == IEssMdDataSet.CELLTYPE_DATE
|| cellType == IEssMdDataSet.CELLTYPE_DOUBLE) {
System.out.println(" [FormatString = "
+ mddata.getFormatString(cellOrdinal) + "]");
} else {
System.out.println();
}
}
}

private static void printMdDataSetInGridForm(IEssMdDataSet mddata)
throws Exception {
IEssMdAxis[] axis = mddata.getAllAxes();

int cols = axis[1].getTupleCount();
int rows = axis[2].getTupleCount();
IEssMdMember[] mem = axis[2].getAllTupleMembers(0);
for (int i = 0; i < mem.length; i++)
System.out.print("\t");
for (int j = 0; j < cols; j++) {
mem = axis[1].getAllTupleMembers(j);
printTuple(mem);
}
System.out.println();
int k = 0;
for (int i = 0; i < rows; i++) {
mem = axis[2].getAllTupleMembers(i);
printTuple(mem);
for (int l = 0; l < mem.length; l++) {
int propcnt = mem[l].getCountProperties();
for (int propInd = 0; propInd < propcnt; propInd++) {
IEssValueAny propval = mem[l].getPropertyValueAny(propInd);
System.out.print(mem[l].getPropertyName(propInd) + ", " + propval + " ");
}
}
for (int j = 0; j < cols; j++) {
if (mddata.isMissingCell(k)) {
System.out.print("#Missing\t");
} else if (mddata.isNoAccessCell(k)) {
System.out.print("#NoAccess\t");
} else {
String fmtdCellTxtVal = mddata.getFormattedValue(k);
if (fmtdCellTxtVal != null && fmtdCellTxtVal.length() > 0) {
System.out.print(fmtdCellTxtVal + "\t");
} else {
double val = mddata.getCellValue(k);
System.out.print(val + "\t");
}
}
k++;
}
System.out.println();
}
}

private static void printTuple(IEssMdMember[] mem) throws EssException {
System.out.print("(");
for (int i = 0; i < mem.length - 1; i++)
System.out.print(mem[i].getName() + ", ");
System.out.print(mem[mem.length - 1].getName() + ")\t");
}

private static void printMemberInfo(IEssMdMember[] members) throws EssException {
for (int mbrInd = 0; mbrInd < members.length; mbrInd++) {
IEssMdMember member = members[mbrInd];
System.out.println("Mbr " + mbrInd + " identifier " + member.getName());
int propcnt = member.getCountProperties();
for (int propInd = 0; propInd < propcnt; propInd++) {
IEssValueAny propval = member.getPropertyValueAny(propInd);
System.out.println("Property " + propInd + " Name : "
+ member.getPropertyName(propInd) + ", Type : "
+ member.getPropertyDataType(propInd) + ", Value : " + propval);
}
}
}

static void acceptArgs(String[] args) throws EssException {
s_userName = "admin";
s_password = "password";
s_olapSvrName = "localhost";
s_provider = "http://localhost:13080/aps/JAPI";
}
}

If you look at the code above, there is one specific setting of interest:

op.setXMLAMode(false);

This setting controls whether the data is returned in the XMLA-compliant format or in the default Essbase format. Now, let's compile and run this from JDeveloper with the XMLA mode turned off.

As you can see, since this query itself is XMLA-compliant, the output returns the data in the same format as well. The only difference is that certain properties are not returned as part of the output. Next, we shall see the output of the class with the XMLA setting turned on.

       

Not much of a difference here, apart from the properties. Of course, the printing of properties is controlled through a separate method, but the output data itself is dictated by the MDX and the XMLA settings. Now, let us test the non-XMLA-compliant MDX query above from the same JAPI with the XMLA setting turned on.

       

As you can see, since the XMLA mode is turned on, Levels(0) in the query is treated as the topmost node (the Product member itself) and data is displayed only for that member. Now, let's switch the XMLA setting off and execute the same class.

       

The above clearly demonstrates the fact that BI EE does not follow the XMLA specification. Of course, this can change in the coming releases, but this should serve as a testing mechanism for understanding how various reporting tools handle their Essbase/MSAS connectivity.

Posted in All Posts, EPM, Hyperion Essbase | 1 Comment »

Hyperion Essbase 11.1.1.0 – Using Java APIs – Part 1 – Viewing Outline Members

Posted by Venkatakrishnan J on January 12, 2009

In a prior entry here, I had briefly walked through my observations on the BI EE and Essbase connectivity. There I had noted that BI EE uses the C-API of Essbase instead of the XMLA API through Provider Services. I wanted to further validate this by testing the APIs directly. To begin with, I would be starting with the Java APIs. The Java APIs can be executed in 2 modes:

1. Embedded Mode – In this mode, the API jar files & the client application should reside on the same machine as the Essbase Server.
2. Using Provider Services – In this mode, the API jar files can reside on any machine and connect to the Essbase Server through Provider Services. The client machine does not require Provider Services to be installed.

In our testing, I would be using the Provider Services option. Also, we shall be using JDeveloper 11.1 (production release) to test the Java API. The entire Java API is divided into 6 main packages:

1. com.essbase.api.base – This package contains the classes/interfaces that form the fundamental core of the APIs. They provide the necessary object types to load/query data into/from Essbase.
2. com.essbase.api.dataquery – This package contains the classes/interfaces used for querying data from Essbase.
3. com.essbase.api.datasource – This package contains the classes/interfaces that provide information on calculation scripts, security filters etc.
4. com.essbase.api.domain – This package contains information on the Essbase server, users, groups etc.
5. com.essbase.api.metadata – This package can be used for querying/modifying the outline of any database within a specific Essbase server.
6. com.essbase.api.session – This package is used for maintaining stateful and stateless sessions. It is typically used for logging in and out of Essbase.

In our example today, we shall start with a simple Java class which would be used for displaying all the outline members within the Sample->Basic cube. This is based on a sample class provided as an example when Provider Services is installed. The entire Java class file would be divided into 4 different methods.

1. Login Details Retrieval Method – This would contain the login credentials for the Essbase Server.
2. Member Retrieval Method – This would be used for retrieving all the members within the outline
3. Member Display Helper Method – This would be used primarily for display purposes (with proper indentation).
4. Main method – Used for execution

The entire source code for this is provided below.

package EssbaseJAPI;

import com.essbase.api.base.*;
import com.essbase.api.session.*;
import com.essbase.api.datasource.*;
//import com.essbase.api.dataquery.*;
import com.essbase.api.domain.*;
import com.essbase.api.metadata.*;

public class OutlineListing {
private static String s_userName = "admin";
private static String s_password = "password";

private static String s_olapSvrName = "localhost";
private static String s_provider = "http://localhost:13080/aps/JAPI";

private static final int FAILURE_CODE = 1;

public static void main(String[] args) {
int statusCode = 0;
IEssbase ess = null;
IEssOlapServer olapSvr = null;
try {
acceptArgs(args);

ess = IEssbase.Home.create(IEssbase.JAPI_VERSION);

IEssDomain dom
= ess.signOn(s_userName, s_password, false, null, s_provider);

olapSvr = (IEssOlapServer)dom.getOlapServer(s_olapSvrName);
olapSvr.connect();
IEssCube cube = olapSvr.getApplication("Sample").getCube("Basic");

listOutlineMembers(cube);
System.out.println("\nOutline Viewing sample complete.");
} catch (EssException x) {
System.err.println("Error: " + x.getMessage());
statusCode = FAILURE_CODE;
} finally {
try {
if (olapSvr != null && olapSvr.isConnected() == true)
olapSvr.disconnect();
} catch (EssException x) {
System.err.println("Error: " + x.getMessage());
}

try {
if (ess != null && ess.isSignedOn() == true)
ess.signOff();
} catch (EssException x) {
System.err.println("Error: " + x.getMessage());
}
}

if (statusCode == FAILURE_CODE) System.exit(FAILURE_CODE);
}

static void listOutlineMembers(IEssCube cube) throws EssException {
IEssCubeOutline otl = null;
try {
otl = cube.openOutline();
System.out.println("Outline information\n-------------------");
System.out.println("Count of enabled DTS members: " + otl.getCountEnabledDTSMembers());

System.out.println("\nListing all outline members in cube " +
cube.getApplication().getName() + "/" + cube.getName());
System.out.println("-----------------------------------------------");
IEssIterator dims = otl.getDimensions();
for (int i = 0, cntTabs = 0; i < dims.getCount(); i++) {
IEssDimension dim = (IEssDimension)dims.getAt(i);
listOutlineMembers_helper(dim.getDimensionRootMember(), cntTabs);
}
otl.close();
otl = null;
} catch (EssException x) {
System.err.println("Error: " + x.getMessage());
} finally {
if (otl != null) {
try {
otl.close();
} catch (EssException x) {
System.err.println("Error: " + x.getMessage());
}
}
}
}

static void listOutlineMembers_helper(IEssMember mbr, int cntTabs) throws EssException {
for (int i = 0; i < cntTabs; i++)
System.out.print("\t");
System.out.println(mbr.getName());
boolean fetchAllProps = false;
IEssIterator mbrs = mbr.getChildMembers(fetchAllProps);
++cntTabs;
for (int i = 0; i < mbrs.getCount(); i++)
listOutlineMembers_helper((IEssMember)mbrs.getAt(i), cntTabs);
}

static void acceptArgs(String[] args) throws EssException {
s_userName = "admin";
s_password = "password";
s_olapSvrName = "localhost";
s_provider = "http://localhost:13080/aps/JAPI";
}
}

Now, let's execute the above class. This should give us all the outline members.

The above is pretty straightforward, as all we are doing is extracting each member in the outline and displaying it. In the next blog entry we shall see how, I believe, the C-API is being used by BI EE. I would demonstrate that through the Java API: we shall fire an MDX query through the Java API and compare the output of the query with the output of BI EE. This should hopefully throw more light on the connectivity. I would also address how BI Publisher currently uses this Java API (yes, BI Publisher uses the Java API instead of the C-API like BI EE) to return its datasets.
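The indentation trick inside listOutlineMembers_helper (print cntTabs tabs, print the member, recurse into the children with cntTabs incremented) can be sketched without an Essbase server. In this standalone toy, a parent-to-children map stands in for the real IEssMember tree, and the Year/Qtr names merely mimic Sample.Basic:

```java
import java.util.*;

public class IndentedOutline {
    // Mirrors the helper's scheme: cntTabs tabs, the member name, then the children one tab deeper.
    static String render(String name, Map<String, List<String>> children, int cntTabs) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < cntTabs; i++) sb.append('\t');
        sb.append(name).append('\n');
        for (String child : children.getOrDefault(name, Collections.<String>emptyList()))
            sb.append(render(child, children, cntTabs + 1));
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, List<String>> kids = new HashMap<>();
        kids.put("Year", Arrays.asList("Qtr1", "Qtr2"));
        kids.put("Qtr1", Arrays.asList("Jan", "Feb", "Mar"));
        System.out.print(render("Year", kids, 0));
        // Prints Year, then Qtr1 one tab in, then Jan/Feb/Mar two tabs in, then Qtr2 one tab in.
    }
}
```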

Posted in All Posts, EPM, Hyperion Essbase, OBI EE Plus | 1 Comment »

Oracle BI EE 10.1.3.4 & Hyperion Essbase 11.1.1.0 – MDX & C API or MDX & XMLA – A brief look at XMLA API – Part 1

Posted by Venkatakrishnan J on January 8, 2009

One of the things I wanted to do after writing this series of Essbase and BI EE articles was to understand in detail the protocol that BI EE uses to connect to Essbase. As you probably know, MDX is the query standard (initially defined by Microsoft) that is used universally (by all major OLAP vendors) for accessing multi-dimensional data sources. One major exception to this is Oracle OLAP. To use MDX for analysis on MOLAP data sources, one would need an API. Currently Hyperion Essbase supports the following APIs:

1. C API
2. Visual Basic API
3. Java API
4. XMLA API

Out of these 4 APIs, the C API is the one most widely used in client applications. This API has a wide range of functions to query and modify data within Essbase, and it supports MDX for querying. XMLA also supports MDX for querying. I was always under the impression that BI EE uses the XMLA API to fire the MDX queries back to Essbase. But having done some investigation over the last couple of days, this no longer seems to be the case: it looks like BI EE uses the C-API to fire the MDX queries. Of course, the basic requirement for the BI EE Administrator to connect to Essbase is to install the Essbase Client (which also installs the C-API). But it is strange to see that XMLA is not used for reporting. In order to use the XMLA API, one would have to install Provider Services, and this is not a pre-requisite for BI EE & Essbase connectivity. This is all the more strange after using the connectivity in 10.1.3.3.1, where it was a hidden feature.

XMLA, or XML for Analysis, is a standard defined to enable client applications to communicate with the Essbase server over the network through the standard SOAP protocol. That is one of the reasons why one has to install Provider Services: the provider service accepts input requests through SOAP, and the MDX query embedded within the SOAP request is then fired on to the Essbase server. The Essbase server returns the data to the provider service, which in turn embeds it in a SOAP response and sends it back to the client. This is represented by means of the diagram shown below.
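As a rough illustration of the request side of this flow, here is a small standalone sketch that assembles an XMLA Execute envelope around an MDX statement. The envelope shape follows the XMLA 1.1 Execute method; the DataSourceInfo and Catalog values are placeholder assumptions for a local Essbase setup, and no network call is made:

```java
// Builds the kind of SOAP payload a provider service would accept (a sketch,
// not a client): an XMLA Execute request carrying an MDX <Statement>.
public class XmlaRequestSketch {

    static String buildExecuteEnvelope(String mdx, String dataSource, String catalog) {
        return
            "<SOAP-ENV:Envelope xmlns:SOAP-ENV=\"http://schemas.xmlsoap.org/soap/envelope/\">\n" +
            "  <SOAP-ENV:Body>\n" +
            "    <Execute xmlns=\"urn:schemas-microsoft-com:xml-analysis\">\n" +
            "      <Command>\n" +
            "        <Statement>" + mdx + "</Statement>\n" +
            "      </Command>\n" +
            "      <Properties>\n" +
            "        <PropertyList>\n" +
            "          <DataSourceInfo>" + dataSource + "</DataSourceInfo>\n" +
            "          <Catalog>" + catalog + "</Catalog>\n" +
            "          <Format>Multidimensional</Format>\n" +
            "        </PropertyList>\n" +
            "      </Properties>\n" +
            "    </Execute>\n" +
            "  </SOAP-ENV:Body>\n" +
            "</SOAP-ENV:Envelope>\n";
    }

    public static void main(String[] args) {
        String mdx = "SELECT {[Scenario].[Actual]} ON COLUMNS FROM Demo.Basic";
        // The data-source string and catalog name below are illustrative only.
        System.out.println(buildExecuteEnvelope(mdx,
                "Provider=Essbase;Data Source=localhost", "Demo"));
    }
}
```

An actual client would POST this envelope to the Provider Services endpoint and parse the MDDataSet out of the SOAP response, which is exactly the round trip described above.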

I studied the XMLA specification in detail for a couple of days. Though I still have some open-ended questions, which I will be blogging about in the coming days, I believe this API has excellent features and tremendous potential for further uptake. Having said that, it would be interesting to see how this standard develops (or is endorsed by Oracle), as currently BI EE does not use this API. Though BI EE uses MDX, which is at the heart of XMLA, the standard itself is not used. XMLA does not support quite a few of the reporting formats that the C-API does, but I am not sure whether that was the reason to go with the C-API rather than XMLA in BI EE. The main reason, I believe, was to facilitate writebacks to Essbase (which BI EE currently offers for relational sources): MDX over XMLA cannot write data back to Essbase, as writeback is not supported as of now, whereas the C-API has an extensive range of data manipulation functions. Something has surely changed between what was available in 10.1.3.3.1 as a hidden feature and 10.1.3.3.2 (the actual release). I will go into the details of how XMLA works, and how it can be used by generic relational reporting tools other than BI EE, next week. The above is just my observation; I have not validated this with Product Management, so feel free to leave your comments/observations. I would be covering all the 4 APIs above (it has been some time since I used C, so time to refurbish my C skills) in the coming months.

Posted in All Posts, Hyperion Essbase, OBI EE Plus | 1 Comment »