Monday, 29 August 2011

ODI Series - Summary

I thought I would take this opportunity to put together a full list of links to all the ODI related blogs I have written. As I try to cover all areas of EPM I feel the ODI posts are becoming fragmented, so it is worth putting together a summary page that can be used as a starting point and one that I will update in the future as and when I cover new ODI subject areas.

This is probably one of the few posts where I don’t have to spend hours taking screenshots and uploading them.

Many of the earlier posts were based on ODI 10g but to be honest the concepts still hold true in 11g and they can still be used as a reference point. I know I still use them, because I have not had the chance to use ODI as much as I would like lately, and with my failing memory, when I do use it I need a reminder of how I went about it.

No point in babbling on, but if you feel I have not covered an area or you have some ideas for articles then just get in touch; there have been quite a few posts that I have written as a result of being contacted one way or another.

So here is the table of contents for the ODI series.

ODI Series Part 1
  • Introduction to and installation of ODI; this was based on 10g. If you are looking to install 11g then there are a number of different blogs that have covered off the installation.

ODI Series Part 2 - The Agent
  • Covers the configuration of an agent in 10g. The concept is similar when configuring a standalone agent in 11g; the main difference is that, if running on Windows, the method to run the agent as a service has changed as it can now be controlled using OPMN, which I covered here.

ODI Series Part 3 - The configuration continues
  • A run through of the topology manager and the configuration for using flat files, rdbms, essbase and planning.

ODI Series 4 - Enter the Designer
  • An introduction to the designer and the process of building an interface to load metadata from a flat file to a planning application.

ODI Series Part 5 - SQL to Planning
  • Covers the process for loading planning metadata from an rdbms source.

ODI Series Part 6 - Data load to planning
  • How to load data through the planning layer.

ODI - Getting text data into planning
  • An alternative solution, and not the official method, to load text data into planning. I am sure at the time of writing the blog this was the only method, but I know in subsequent releases of ODI and planning loading text data can easily be achieved by using the adaptor.

ODI - Loading text member data to planning
  • Part of this blog covers the direct method of loading text member data to planning; I believe this is only possible from planning 11.1.1.3+.

ODI & Planning - Brief look into file manipulation
  • As requested an introduction to manipulating metadata from a flat file source and loading into planning.

ODI Series - Planning and multiple UDAs
  • How to load multiple UDAs into a planning application.

ODI Series - Handling errors with Planning/Essbase loads
  • A run through of a few possible methods of proactively taking actions if there are errors in Planning/Essbase loads.

ODI Series - Planning 11.1.1.3 enhancements
  • An overview of the list of enhancements when using Planning 11.1.1.3 or later with ODI.

ODI Series - Planning 11.1.1.3 enhancements continued
  • There wasn’t enough time to go through all the enhancements in the first instalment so in this part I go through the remaining ones.

ODI Series - Renaming planning members
  • There is currently no direct method of renaming planning members using the adaptor so in this post I go through a possible solution to achieve the renaming.

ODI Series - Deleting planning members
  • Step by step process overview for deleting planning members and some of the issues that may be faced.

ODI Series - When is a planning refresh not a refresh
  • Even though an interface may have completed successfully in the Operator there may still be an issue with the planning refresh, which I highlight and provide a possible solution to.

Planning 11.1.2 - additional ODI functionality mixed in with a helping of ODI 11G
  • A look at the additional features available when using Planning 11.1.2+ and ODI.

ODI - Question on loading planning Smart List data directly into essbase
  • Covers an alternative way to load smart list data by loading directly to essbase instead of going through the planning layer.

ODI Series - Loading Smart List data into planning
  • The official way of loading smart list data by loading through the planning layer.

ODI Series - Loading Essbase Metadata
  • Loading metadata from a flat file source into an essbase database.

ODI Series - Loading data into essbase
  • The title pretty much describes the post which is all about loading data into an essbase database.

ODI Series - Extracting data from essbase
  • Covering the three different methods to extract data from essbase databases.

ODI Series - Essbase Outline Extractor
  • Covers the process of extracting metadata from an essbase database.

ODI / Essbase challenge - Extracting alternate alias member information
  • At the time of writing this post there was no method to extract alternate alias member information using the essbase adaptor, so I cover a method of customising the adaptor code to make it achievable. The code that I developed was later used by Oracle and a patch release of ODI made it possible to extract the information; I covered the patch here.

ODI Series - Loading SQL Metadata into essbase and its problems
  • A run through of an issue that was highlighted on the Oracle forums when loading essbase metadata from an rdbms source and changing the IKM rules separator options to values other than the default comma.

ODI Series - Extracting essbase formula issue
  • Highlights and provides a solution to an issue that may be encountered when extracting essbase member formulas that span multiple lines.

ODI Series - Essbase to Planning
  • As there is currently no adaptor which allows the extraction of metadata from planning, I go through one method to get around this and extract the information from an essbase database and then transform and load it into a planning application.

ODI Series - Deleting essbase members part 1
  • First of a three part series on deleting essbase members, this part goes through the standard method of using the remove unspecified option in a load rule.

ODI Series - Deleting essbase members part 2
  • In this second part I go through a method to delete members where the source file only contains the members that need to be removed.

ODI Series - Deleting essbase members part 3
  • In the final part I use a combination of Jython, essbase Java API and a customised KM to delete members based on members supplied from a source file.

ODI Series - Renaming essbase members
  • Based on the final part of the deleting essbase members series, modifications to the Java code allow for renaming members based on a source containing the original and new member names.

ODI Series - Essbase/Planning - Automating changes in data
  • Using Change Data Capture (CDC) and the simple journalizing method to automate the process of loading data into an essbase database.

ODI Series - Essbase/Planning - Automating changes in data - Part 2
  • Using Change Data Capture (CDC) and the consistent set journalizing method to automate the process of loading metadata to a planning application.

ODI Series - Release Update
  • Highlights important patch updates to fix bugs with the essbase adaptors. Please note this was 10.1.3.4.7_01, so no doubt you are already running a newer release than that.

ODI Series - Patch Update - Essbase data load issue resolved
  • A review of a patch released to fix a bug with loading data to essbase.

ODI Series - Problem with latest essbase patch release
  • The patch released to fix a bug with loading data to essbase did fix the original issue but introduced a new bug, which I logged with Oracle.

ODI Series - Patch Update - 10.1.3.5.5
  • A patch is released that finally does fix the issues with loading data to essbase.

ODI Series - Essbase related bug fixes
  • A delve into a couple of essbase related bug fixes; one of them is based around the order of members when extracting essbase metadata and the other allows alias information to be extracted from tables other than the default.

ODI Series - Issues with 11.1.1.5 and the Hyperion knowledge modules
  • A look at a couple of issues relating to 11.1.1.5 and loading metadata to essbase, and loading metadata to planning using the memory engine as the staging area.

ODI Series - Will the real ODI-HFM blog please stand up
  • Covers the configuration required to start using the HFM adaptor, some of the errors that may be encountered and finally successfully reversing a HFM application.

ODI Series - Loading HFM metadata
  • An overview of the knowledge modules available for HFM and then a step by step process to load metadata to a HFM application from a flat file source.

ODI Series - Loading HFM data
  • Step by step process to load data to HFM and a description of each of the IKM options available.

ODI Series - Extracting data from HFM
  • Overview of how to extract data from HFM and a description of the KM options available.

ODI Series - Putting it all together
  • So you have a number of interfaces, procedures and variables and you are looking to automate your processes; this post covers putting it all together with the use of a package.

ODI Series - A few extra tips
  • Covers loading multiple UDAs to an essbase database and how to sum up flat file source data when loading through the planning layer and not direct to essbase.

ODI Series - The final instalment
  • Definitely not the final instalment of the ODI series. In this post I go through generating scenarios and scheduling them in the Operator, and I also go through how to execute a scenario using the Java SDK in 10g.

ODI Series - Excel functionality
  • A look at the process of extracting data from an excel file and loading into a relational table.

ODI Series - Reversing a collection of files from an excel template
  • An overview of using the Knowledge module “RKM File (FROM Excel)”; this RKM provides the ability to reverse files and maintain the structure definition based on the information held in an excel template.

ODI Series - Web Services
  • In the first of a four part series looking at Web Services in 10g, I go through setting up Axis2 and how to upload the ODI public web service to it; this allows the use of the OdiInvoke service and its five operations, one of them being the execution of scenarios using a SOAP request.

ODI Series - Web Services Part 2
  • Using Java to interact directly with the ODI web services.

ODI Series - Web Services Part 3
  • Step by Step configuration of Data Services and an example of how they could be used to load data to essbase.

ODI Series - Web Services Part 4
  • A look at the ODI tool OdiInvokeWebService and putting it into practice by invoking EPMA web services.

ODI Series - Processing all files in a directory
  • Step through of a method to process all files in a directory and load them to a target relational table.

ODI Series - Quick look at user functions
  • Ever wondered what user functions are and how to use them? If so, have a read of this post.

ODI Series - Executing Hyperion Business Rules
  • An in-depth look at using ODI to execute business rules.

ODI Series - Executing Hyperion Business Rules Part 2
  • If part 1 didn’t satisfy you then in this part I go through another method to execute business rules by customising a knowledge module and using Jython.

Using Lifecycle Management (LCM) with ODI
  • A look at the possibilities of using ODI and LCM; in the example I go through loading Cell Text to Planning.

Managing ODI 11g standalone agents using OPMN with EPM 11.1.2
  • Step by step process of configuring a standalone agent to be controlled by OPMN, the agent is configured on an EPM hosted environment.

If you are looking for areas of ODI not covered here then I can recommend these excellent blogs -

If you still can't find what you are looking for or need help, then the ODI forum is your next stop.




Sunday, 14 August 2011

ODI Series - Deleting planning members

I have been asked numerous times lately how to delete planning members using ODI. I did cover loading planning metadata way back in one of my early blogs but have never covered deleting, and even though it is quite similar to loading metadata I thought I may as well go through the steps and highlight some errors that may be encountered. It also stops me having to repeatedly explain the process, and I will be able to be lazy in future and just provide the link to this post.

The process may have been covered elsewhere on the internet but I wanted to make sure the series I have written is complete.

If you have experience using the Hyperion planning adaptor with ODI then this post will probably be of no benefit to you as it is aimed at beginners.

There are going to be some assumptions though before starting.

• The planning application is of the classic type and not EPMA.

• The required Knowledge modules have been imported (the relevant source KMs and the target "IKM SQL to Hyperion Planning")

• The planning application has been reversed into a model and Datastores using the "RKM Hyperion Planning" knowledge module.

• The source Datastore containing the members to be deleted has been created whether it be a file or a relational table.

If you don’t know how to do the above then it is worth reading through here and here as I cover off the basics that you need to know to get you up and running; I didn’t think there was any point in going over it again.

I will be going through the steps using ODI 11g but the principles are exactly the same in 10g.



If you take a look at a reversed planning application’s model and then expand any dimension Datastore you will notice a column called “Operation”. Now, if you have only ever loaded metadata then there is a distinct possibility that you have ignored this column, as when it is left blank during an execution of an interface the default value is used and this is “Update”.

If you want to delete members then this column becomes far more important as there are a number of different values that can be set depending on the type of deletion.

The types of delete operations available are

• Delete Level 0
• Delete Descendants
• Delete Idescendants

It may seem very obvious what these operations achieve, but I will go through each one of them so that it is clear.

Before I go through them it is worth highlighting I am going to use the planning sample application and the members deleted are all going to be from the Segments dimension.



The first of the operations is “Delete Level 0”; using this operation will delete a member if it is a base level member, so basically it means if it has no children then it will be deleted.

To start with I need to have a list of level 0 members I want to delete in the source.



The source technology for the deletions is Oracle; if you are intending to use a flat file as the source then the process is exactly the same.

I have created a model and reversed a table called “DEL_MEMBERS”; the table has one column called “MEMBER” which contains the members to be deleted. As I mentioned earlier I am assuming you have already set up your source Datastore, but if you are confused and don’t know how to then have a read here.

In the table there are two records “IPOD304” and “MP3” and if you look at the hierarchy above you will see both these members are level 0.
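For reference, a minimal sketch of how the source table could be created and populated is shown below; only the table name, column name and the two records come from the example, while the column size is an assumption.

-- Sketch only: the VARCHAR2 size is an assumption, the rest matches the example.
CREATE TABLE DEL_MEMBERS
(
  MEMBER VARCHAR2(80)
);

INSERT INTO DEL_MEMBERS (MEMBER) VALUES ('IPOD304');
INSERT INTO DEL_MEMBERS (MEMBER) VALUES ('MP3');
COMMIT;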

The source and target are ready, so this means we can create an interface which will process the source records and attempt to delete what is contained in those records from the planning application’s target dimension.



When creating an interface you will need to set the staging area to be different from the target as planning has no SQL capabilities and if you don’t set this then the interface will generate warnings.



The warning generated is “Staging Area's technology doesn't have the Where capability which may lead to errors at execution time. You may want to switch the Staging Area to another technology (Memory Engine for instance).”

As well as receiving warnings the planning knowledge module will not be selectable until the staging area has been defined.

If I am using an RDBMS as the source technology then I would set the staging area to be the same as the source. If using a flat file as the source then more than likely I would use the “Memory Engine” as the staging area, as the low number of records to process and the simple transformations are well suited to being handled in memory.

If using a flat file as the source and the file was large or the transformations of the data were complex, then I would consider using an RDBMS as the staging area; this would first load the flat file into a relational table, perform any transformations and then delete the members. It is worth testing with the “Memory Engine” first and if any issues are encountered then look into changing the staging area.

One thing I do notice in 11g is you can’t set the staging area until the target Datastore in the interface has been designated.



Once the interface has been created go to the Mapping tab and drag the source Datastore on to the source area and then drag the target planning dimension Datastore onto the target area. If the staging area has not been set this is the time to go back to the Overview tab and set it.

The mappings on the target are then created. In this example it is a simple one to one mapping, so all that is required is for the target dimension name column (Segments – this will depend on the dimension you are trying to delete members from) to be mapped to the source: target “Segments” maps to source DEL_MEMBERS.MEMBER.

If the source is a file then the mapping will need to be set to execute on the staging area.

In the Operation column the type of operation is defined by entering ‘Delete Level 0’; make sure it is enclosed in single quotes or an error will be generated when executing the interface.



It is also necessary to set the mapping to execute on the “Staging Area” for the operation column or once again an error will be generated due to the target not having any transformation capabilities.

You don’t have to hardcode the operation type in the target column; you could just as easily have an extra column in your source to define the type of deletion, which you would then map to the target Operation column.
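As a rough sketch of that alternative, the source table from earlier could carry the operation alongside each member; the column name and size below are assumptions made up for the example.

-- Sketch only: the OPERATION column name and size are assumptions.
ALTER TABLE DEL_MEMBERS ADD (OPERATION VARCHAR2(30));

UPDATE DEL_MEMBERS SET OPERATION = 'Delete Level 0';
COMMIT;

-- The target Operation column would then be mapped to DEL_MEMBERS.OPERATION
-- rather than the hardcoded literal 'Delete Level 0'.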



If at this point you are unable to select the required Knowledge Modules then it is probably because the KMs have not been imported or the staging area has not been set.

The important options to be set for the IKM are:

LOG_ENABLED – set whether an output log for the interface is generated.
LOG_FILE_NAME – full path and filename for the log.
LOG_ERRORS – set whether any errors during the deletions are logged.
ERROR_LOG_FILENAME – full path and filename for the error log.

The log and error files will be generated on the machine hosting the agent that the interface is executed against.

MAXIMUM_ERRORS_ALLOWED – The default is 0, which means the interface does not stop no matter how many deletion errors are generated; if you want the interface to stop after a set number of errors then use this option.

REFRESH_DATABASE – This executes a planning refresh; the interface only deletes the members from planning, so if you want the deletions to be pushed down to essbase then the option will need to be set to yes.

If running deletions against a number of dimensions then only set the option to yes on the last interface, and remember that if any dense dimension members have been deleted then the time to refresh will depend on the size of the data in the essbase databases.
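To give an idea of how this could look, a possible set of option values for the example in this post is shown below; the log file locations are made up for illustration and will depend on the machine hosting the agent.

LOG_ENABLED – Yes
LOG_FILE_NAME – E:\ODI_LOGS\del_segments.log (made-up path)
LOG_ERRORS – Yes
ERROR_LOG_FILENAME – E:\ODI_LOGS\del_segments.err (made-up path)
MAXIMUM_ERRORS_ALLOWED – 0
REFRESH_DATABASE – Yes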

Once the options have been set the interface can be executed.



Once the interface has completed then check the Operator for success, and if you want to view how many records have been deleted then open the “Report Statistics” step. Do not be alarmed by the warning sign against this step, as it doesn’t mean the interface has failed; it is just the way the adaptor has been designed.

If you want to update the “Report Statistics” step to report in a more meaningful manner and remove the warning sign then have a read here of an article I wrote on how to change it.

If there are any rows rejected or the interface failed then the first place to check is the logs that would have been generated if enabled in the KM options.

If everything has run through without any issues then it is also worth checking the hierarchy in planning to be certain the members have been deleted.

If any of the members deleted also have shared members in the hierarchy then those shared members will be deleted as well when using the above method.



So say you only wanted to delete the shared members “IPOD304” and “MP3” from a hierarchy then you would probably think the following would work.



The adaptor unfortunately does not work in this way, and even though the interface completes successfully there will be errors written to the error log.



The error descriptions do not directly point out the issue though it is clear the method does not work.

If you think about it the hierarchy may have multiple instances of the same shared member so the adaptor doesn’t really know if you want to delete them all or just some of them.



To be able to delete shared members the parent of the member to be deleted is required, so to get around this the Datastore can be updated to include a parent member column.



The original interface is updated to include a mapping from the source to the target “Parent” column, and executing the interface will delete members based on the member and parent combination defined in the source.
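A quick sketch of how the source could be extended for the shared member example is shown below; the PARENT column and the parent member values are placeholders made up for illustration, so substitute the real parents from your own hierarchy.

-- Sketch only: the column name/size and the parent values are assumptions.
ALTER TABLE DEL_MEMBERS ADD (PARENT VARCHAR2(80));

UPDATE DEL_MEMBERS SET PARENT = 'SharedParent1' WHERE MEMBER = 'IPOD304';
UPDATE DEL_MEMBERS SET PARENT = 'SharedParent2' WHERE MEMBER = 'MP3';
COMMIT;

-- In the interface, target "Parent" maps to DEL_MEMBERS.PARENT and target
-- "Segments" maps to DEL_MEMBERS.MEMBER as before.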

So what happens if the interface is set up to delete level 0 members and it tries to delete a member that is not level 0?



In the above interface the source contains a member called “AG” and if you look at the hierarchy this is not a level 0 member.



The interface will not fail but this time you will get a more meaningful message in the error log stating “Attempted to delete a non level 0 member using the delete level 0 operation.”

Another common misunderstanding when creating a deletion interface is the concept that it is possible to use the Alias column instead of using the member column.



In the above interface the alias is mapped instead of the member and once again the interface will run through successfully but there will be errors generated in the log.

If you only have the alias in the source then it is certainly possible to generate the member name, but this requires mapping to the planning application’s relational tables and is not something I will cover today; if you are interested in understanding how to go about it then just get in touch.

Right, on to the next type of operation, “Delete Descendants”, which basically means delete all members that are descendants of the member defined in the source.



In the above interface the source member is “AG” and the operation is defined as “Delete Descendants”.



All the descendants of the member “AG” have been deleted.

Something to be aware of when deleting descendants: if all the children of a member are deleted then the parent member can still contain data. That might be fine, but be sure it is what you want, as you may need to run a calc after the deletion.

Watch out if the parent’s data storage is set to “Dynamic Calc” and all its descendants are deleted, as level 0 members cannot be “Dynamic Calc” unless they have a formula associated with them.

The final operation type is “Delete Idescendants” which will delete the current member defined in the source and all its descendant members.



In the above interface the source member is defined as “AG” and the operation is set to “Delete Idescendants”.



The member “AG” and its descendants have been deleted from the hierarchy.



If, like in the above interface, the operation type is set to “Delete Idescendants” and a level 0 member is defined in the source, the level 0 member will still be deleted as it is still a valid operation.

Another way of defining the type of operation is to use an ODI variable.



The variable can be used in the target mapping, which adds more usability to the interface. If you are going to use a variable you will need to create a package first, drag the variable into the package to either set or declare it, and then add the interface as the next step.
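As a rough example, the expression entered against the target Operation column could reference the variable like this; DEMO is a made-up project code and DELETE_OPERATION a made-up variable name, so use your own names.

-- Sketch only: the project code and variable name are made up for illustration.
-- ODI substitutes the variable's current value at run time, so the same interface
-- can perform 'Delete Level 0', 'Delete Descendants' or 'Delete Idescendants'
-- depending on how the variable is set or refreshed in the package.
'#DEMO.DELETE_OPERATION'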

If you want to understand the query that is being generated on the source/staging area before the deletion is executed on the planning side, then go into the Operator, open the step “Load data into planning” and switch to the Code tab, where the SQL being used can be viewed.
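To give an idea of what to expect, for the level 0 example in this post the query would be along these lines; the exact aliasing and any schema prefixes depend on the source technology and the knowledge modules in use, so treat it as a sketch rather than the exact generated code.

-- Sketch only: the generated statement will include ODI's own aliases and any work tables.
select  DEL_MEMBERS.MEMBER  "Segments",
        'Delete Level 0'    "Operation"
from    DEL_MEMBERS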



I think that covers off the basics on deleting members from planning and should hopefully get you on your way.