Today I will be looking at extracting data from essbase and the different methods that can be used. At first I thought, great, the knowledge module combined with the API would extract directly from essbase, but as with some of the other integration methods it is not as perfect as it could be.
There are three methods to extract data from essbase: using either a Calc script (9.3 onwards), a Report script or an MDX query. If your essbase db is BSO then you will be able to use all of the methods; if it is ASO then you will only be able to use the Report script or MDX query.
I will first tackle the Calc script method; this method is probably my least favourite due to the way it operates. You first have to create a calc script using the DATAEXPORT function that will extract to a text file.
The DATAEXPORT function writes a dense dimension as the columns, so you have to decide how you want your data to look, as it will need to be matched against how you reverse the essbase database in ODI. I am using the Sample.Basic db and have decided to use Scenario as the dense dimension for the columns.
It is important to use :- DataExportDimHeader On;
as ODI will consider the first two records to be header information. The extract file will also have to be written to a location that is accessible by ODI; this is not so bad if ODI is on the same server as essbase but becomes more of an issue if they are separate, which is one of the reasons I am not so keen on this method.
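To give an idea of the kind of script required, here is a rough sketch (the output path and FIX members are purely for illustration, not the actual script used); a DATAEXPORT calc script along these lines writes a comma delimited extract with Scenario across the columns, and the output file is the one the EXTRACT_DATA_FILE_IN_CALC_SCRIPT option will later point to :-

/* illustrative DATAEXPORT against Sample.Basic with Scenario as the column dimension */
SET DATAEXPORTOPTIONS
{
DataExportLevel "ALL";
DataExportColHeader "Scenario";
DataExportDimHeader On;
DataExportOverwriteFile On;
};
FIX (@IDESCENDANTS("Product"), @IDESCENDANTS("Market"))
DATAEXPORT "File" "," "E:\FileStore\sampledata.txt";
ENDFIX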
Running the calc script produces the following output :-
First a new model needs to be inserted in the ODI designer; I am assuming you have set up the connection to essbase in the topology manager, if you have not then have a read of a previous blog.
In the reverse section, select the context and the logical agent, the KM to use is “RKM Hyperion Essbase..”
To be able to return the correct data columns the KM options have to be updated.
MULTIPLE_DATA_COLUMNS – Set to “Yes” as our exported data has multiple columns.
DATA_COLUMN_DIMENSION – Set to “Scenario” as this is the data column dimension in the exported data file.
DATA_COLUMN_MEMBERS – Set to “Actual,Budget,Forecast” as these are the members in the exported data file.
Reversing the cube produces
You will notice the members Actual, Budget & Forecast have been generated in the data DataStore.
The next step is to create a DataStore that will be the target for the exported data; I am going to use a SQL server database table as the target as I couldn’t really see the point of running a calc script to export to a text file and then generating another text file from the export.
I just ran a selective reverse on an existing model to retrieve and store the database table as a DataStore.
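The table itself is nothing special; something along these lines would do the job (the column names and data types here are just a sensible guess, one column per standard dimension plus a column for each of the three Scenario members) :-

-- illustrative SQL Server target table, types are assumptions
CREATE TABLE SampleDataExtract
(
[Year] VARCHAR(80),
Measures VARCHAR(80),
Product VARCHAR(80),
Market VARCHAR(80),
Actual NUMERIC(18,4),
Budget NUMERIC(18,4),
Forecast NUMERIC(18,4)
);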
With the source and target DataStores created the next stage is to create the interface.
As usual the interface was named, the context chosen and the staging area set as Sunopsis Memory Engine.
In the diagram tab the Sample_BasicData DataStore was dragged to the source and the SampleDataExtract DataStore to the target; as all the columns have the same names, everything was auto-mapped.
Within the flow area the LKM to use is “LKM Hyperion Essbase DATA to SQL”
The Options are important to get correct.
EXTRACTION_QUERY_TYPE :- There are three possible types: CalcScript, MDXQuery or ReportScript; you will need to manually enter the type if you are not using the default of ReportScript.
EXTRACTION_QUERY_FILE :- This is the location of the file that will be run to extract the data from essbase, so the file will be one of the three script types. For a CalcScript you can enter the fully qualified location of the file, or if it resides in the essbase app directory you can just enter the name without the file type suffix.
For example :- E:\FileStore\extract.csc or extract
EXT_COL_DELIMITER :- This is the column delimiter that is being used; in my example the calc script is creating a comma delimited file so I just enter ,
EXTRACT_DATA_FILE_IN_CALC_SCRIPT – This only applies if you are using the query type of CalcScript; it is the location of the file created by the calc script, and this file must be accessible by ODI.
The other options are self-explanatory and have been covered previously.
The IKM I used was “IKM SQL to SQL Append” with the option of “DELETE_ALL” so the table is cleared out each time the interface is run.
When I ran the interface for the first time it failed with the error:-
Extraction using calculation script is not supported for essbase versions prior to 9.3
I am using version 11.1 and have encountered this error previously when loading metadata into essbase, there are currently no updated versions of the Java files so even though Oracle are pushing to use ODI it is not fully compatible with 11.1.
I wasn’t going to leave it there, so I tracked down the Java class that generates the error, which is ODIEssbaseDataReader.class.
I updated and compiled the class to not use logic for versions greater than 9, after putting the compiled class back into the jar package the interface ran through without a problem.
When the interface is executed, ODI runs the specified calc script, the calc script outputs to a text file and, once the export is complete, ODI then loads the text file into the database table.
Ok, on to the next extraction method, the Report Script; the first step is to create the report script.
Make sure you use the commands {SUPALL}{TABDELIMIT}{ROWREPEAT}{NOINDENTGEN} as these will format the output in a way ODI can interpret.
I am going to use the same interface as used in the Calc Script method so I wrote the report to replicate the same output.
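As a rough sketch of what such a report could look like (the member selections are purely for illustration), something along these lines returns tab delimited output with Scenario across the columns :-

{SUPALL}
{TABDELIMIT}
{ROWREPEAT}
{NOINDENTGEN}
<COLUMN ("Scenario")
"Actual" "Budget" "Forecast"
<ROW ("Year", "Measures", "Product", "Market")
<IDESCENDANTS "Year"
<IDESCENDANTS "Measures"
<IDESCENDANTS "Product"
<IDESCENDANTS "Market"
!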
In the interface I had to set the options
EXTRACTION_QUERY_TYPE – ReportScript
EXTRACTION_QUERY_FILE – extract, or you can put in the fully qualified file path if the script is not stored with the essbase database.
EXT_COL_DELIMITER - The report is tab delimited so \t is used.
The final method that can be used is an MDX query; I am going to use the same interface, so I once again wrote the query to output the data in the same format.
I am sure the MDX could be written in different ways but for this exercise it produces the desired results.
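As a rough sketch (the member selections here are purely for illustration), a query along these lines returns Actual, Budget and Forecast as the data columns with the remaining dimensions on the rows :-

SELECT
{[Actual], [Budget], [Forecast]} ON COLUMNS,
NON EMPTY CROSSJOIN(
CROSSJOIN([Year].Levels(0).Members, [Measures].Levels(0).Members),
CROSSJOIN([Product].Levels(0).Members, [Market].Levels(0).Members)
) ON ROWS
FROM [Sample].[Basic]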
In the interface I updated the options
EXTRACTION_QUERY_TYPE – MDXQuery
EXTRACTION_QUERY_FILE – This has to be the fully qualified name of the MDX script
EXT_COL_DELIMITER – MDX output is tab delimited by default so \t was used.
So there we have it, a brief look at the different ways you can extract data from essbase; each one has its own merits and drawbacks.
That concludes looking at what can be done with the KMs for essbase and planning, next I am going to move onto the other areas of interest in ODI such as packages and variables.
When I create a text file from Essbase using the calc script command DATAEXPORT, it generates a character at the end of each record. ODI complains when reading that character. I have to open the file in WordPad and save it to remove the characters, and it then loads fine from ODI.
Hi John, could you please explain the merits and drawbacks of each extraction method? It would be useful information for us. Thx
An important thing to add if you are going to use extraction with a CalcScript:
For extracting data using a calculation script, the Essbase server and the Oracle Data Integrator Agent must be running on the same computer.
Hi John,
I feel it's hard to edit and recompile ODIEssbaseDataReader.class. Did you decompile ODIEssbaseDataReader.class and all the libraries (including the iterated libraries)? It's hard to do. What is the easiest way to recompile ODIEssbaseDataReader.class?
Well, I'd be thankful if you could give me the compiled ODIEssbaseDataReader.class (or the odihapp_essbase.jar) :)
Thanks in advance. This is my email address: yunianto_k@yahoo.co.id
I thought the post made some good points on extracting data. I use python for simple data extraction; data extraction can be a time consuming process, but for larger projects like files, the web or documents I tried "extracting data from the web", which worked great; they build quick custom screen scrapers, data extraction and data parsing programs.
Hi,
Thanks for the wonderful blog.
I created the interface with a calc script and got an error saying "Error occured while reading the data file produced by calculation script."
I later realised my ODI and essbase server are not installed on the same server.
Also, can I do the dataexport in a maxl script using the same calc script? If yes, please let me know how it is done.
You can either:
Install an agent on the essbase server,
or make sure the agent can access the exported data.
In the past I have run a calc script only interface to run a data export, then copied the export file to a location where the agent can access it.
Or you can create an ODI procedure to call a maxl script.
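As a rough illustration of that last option (the server name and credentials are placeholders), the MaxL called by the procedure could be as simple as :-

login admin password on essbaseserver;
execute calculation 'Sample'.'Basic'.'extract';
logout;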