In this last instalment on data services I am going to look at the ODI tool ODIInvokeWebService. In the previous blogs I have looked at methods of invoking scenarios through web services, using web services on top of datastores, and creating your own apps; today it is the turn of the provided tool that can invoke web services from within the ODI environment.
There are many different reasons why you would need to invoke a web service to satisfy the requirements of your integration. Keeping with tradition in the Hyperion arena, I am going to use this opportunity to call an EPMA web service. The EPMA web services are not too well documented but they are out there and can be used; if you are using version 9.3.1 there is a document available here.
I am not sure where the documentation is for version 11, but the web services do still exist and are fully functional.
If you are going to consume the EPMA web services with ODI then I recommend making sure you are running ODI release 10.1.3.5.0_02 or later, as it fixes a bug with resolving WSDLs correctly; I hit the bug in the past and couldn't call some web services.
The object of this simple example is to connect to EPMA, retrieve a session id and then create a new folder in EPMA. The first step is to create a package and add the tool ODIInvokeWebService to the diagram.
The tool takes a WSDL URL as one of its parameters. If you are going to use any of the EPMA web services you will need a session id, so the first URL to use is :- http://<epma machine>/hyperion-bpma-server/Sessions.asmx?wsdl
Once you have entered the URL you can click on the advanced button and connect to the WSDL, which will retrieve all the available operations. The advanced editor lets you enter values for the operation variables, then send the SOAP request and review the results.
The CreateSession operation takes a UserName, Password and connectionString (the connectionString is optional).
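To give an idea of what the editor builds, a CreateSession request is along these lines; this is only a sketch, as the real namespace and element names come from the WSDL (the namespace, user name and password below are placeholders):

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <CreateSession xmlns="http://hyperion.com/bpm">
      <UserName>admin</UserName>
      <Password>password</Password>
    </CreateSession>
  </soap:Body>
</soap:Envelope>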
Invoking the SOAP request from the editor returns the required SessionID. Once applied, the parameters “Port Type”, “Operation” and “XML Request” are automatically populated.
There is a parameter “Request File” which allows a file to be used as the request instead of the editor.
Once the session id is retrieved we need a way of storing the value so it can be used with the next call, and there are a number of parameters that come into play.
“Storage Mode for Response File” - takes the values “NO_FILE” (no response is stored), “NEW_FILE” (a new response file is generated; if the file already exists, it is overwritten) and “APPEND” (the response is appended to the file; if the file does not exist, it is created).
“File Encoding for Response File” - character encoding for the response file.
“XML Encoding for Response File” - Character encoding that will be indicated in the XML declaration header of the response file.
“Response File” - File storing the web service soap response.
Executing the package produces the following XML file.
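The response file is simply the SOAP response envelope; it will contain something along these lines (the element names follow the usual .asmx <operation>Response/<operation>Result convention and the session id value is made up, so check your own output rather than relying on this sketch):

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <CreateSessionResponse xmlns="http://hyperion.com/bpm">
      <CreateSessionResult>f8a9c2d1-4b3e-9c8d-1e2f3a4b5c6d</CreateSessionResult>
    </CreateSessionResponse>
  </soap:Body>
</soap:Envelope>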
So now we have the XML file containing the session id, but we need to be able to read the id back in. Like most technologies, ODI has one available for XML documents, and it allows XML files to be reversed into a set of datastores.
As usual the physical details have to be created in the topology manager.
Only a name is required in the definition.
In the JDBC Url the location of the xml file is entered.
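For the ODI XML driver the details are along these lines; the file path is just a placeholder for wherever your package writes the response file (an optional s= parameter can also be added to name the schema):

JDBC Driver :- com.sunopsis.jdbc.driver.xml.SnpsXmlDriver
JDBC Url :- jdbc:snps:xml?f=E:/FileStore/sessionResponse.xml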
In the physical schema definition any values can be entered for the schema, and a logical schema has to be given and applied to a context.
The process for reversing the XML file follows the same process as other technologies in the Designer.
Once reversed a separate dataStore is created in the Model for each element of the XML file.
Now the XML has been converted to datastores you can use standard SQL to query the information; as I am only interested in the session id I can query the value simply.
To store the session id I created a variable that is refreshed by using SQL to query the dataStore.
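The refresh query itself is a one-liner; the datastore and column names below assume the reversal produced them from the CreateSessionResponse element sketched earlier, so substitute whatever names your reversal actually generated:

SELECT CREATESESSIONRESULT FROM CREATESESSIONRESPONSE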
This variable can then be added to the package.
For the final step I want to create a new folder in EPMA; to do this there is a WSDL available for Folder operations.
By using the variable the SessionID value can be populated; without the session id the SOAP request will fail, as the request will not be authenticated.
In the parameters of the operation the only other value that needs to be entered is “Storage Mode for Response File”, and this is set to “NO_FILE”.
Executing the package will first send a SOAP request for a session id and write the response to an XML file; a refreshing variable will pick up the session id from the XML file and store it; the last step uses the create folder operation, passing in the stored session id, and creates a new folder in EPMA.
So there we have a really simple example of how to use the ODIInvokeWebService tool with a bit of XML and EPMA thrown in.
Sunday, 17 May 2009
ODI Series – Web Services Part 3
After a little bit of a detour I am going to continue today with the Web Services functionality and concentrate on configuring Data Services step by step. Data Services enable you to create web services against existing DataStores; a Data Service provides the ability to retrieve, update and delete data on a DataStore by using a Web Service. I am going to assume that you have Axis2 installed and have followed the first part of this blog on Web Services.
The first step of the configuration is to go into the Topology Manager; by default there will be an Axis 2 technology already installed, and a Data Server will need to be inserted to set up the connection information to the axis2 web application.
The name, base URL’s and account details are required, the default account for axis2 is admin/axis2.
The upload URL is the location where ODI will upload an aar (axis2 archive file) with compiled information about the data store.
Once this is applied, set up a logical schema and apply it to a context; that is all that is required in the Topology Manager.
In the designer you will need to import an SKM (Service Knowledge Module); the SKM you choose depends on the underlying technology of the Data Store you are going to generate a Data Service on.
Now select the model that you want to apply a data service to, edit it and select the Services tab.
In the application server dropdown there should be the name of the logical schema you created.
The Namespace and Package Name you enter is your choice.
The Name of the data source is important, as it will be used later on in the configuration of the web server; the data source name is used to map JDBC connection details to your data store. When using data services the JDBC information for the data store is not picked up from ODI and has to be manually entered into the web application's configuration files.
The name has to start with java:/comp/env/<name of data source>
The name of the Data Service is once again entirely your choice; you should end up with something like the following.
If you select the “Deployed datastores” tab you can pick which Data Stores you want to become part of the Data Service; an important prerequisite is that the Data Store has a primary key applied, otherwise the generation of the Data Service will fail.
If you have not set up the ODI_JAVA_HOME environment variable to point to a JDK instead of a JRE you will get the following error.
All being well the Data Service will be generated and uploaded to the axis2 web application.
You can check it has been deployed correctly by listing the available services in the axis2 web application.
With each deployment of a Data Service there are a number of available operations, such as adding to, deleting from and viewing the Data Store through a Web Service.
Now this is where many experience difficulties with Data Services; most, quite rightly, assume that the configuration is complete, but as I mentioned the Data Services do not pick up the data source connection details to the repository from ODI, and these details need to be added to the configuration files on the web application server.
If you try to use a Web Service without configuring the data source then you will experience an error like the following.
The location of the files will depend on the application server you are using; in this example I am using Tomcat, and the first file to update is context.xml, which is located in the conf directory.
In the file, additional information needs to be added between the <Context> tags.
The format of the additional information is :-
<Resource name=" " type="javax.sql.DataSource" driverClassName=" " url=" " username=" " password="" maxIdle="2" maxWait="-1" maxActive="4"/>
The resource name relates to the Data Source name that was defined in the Services tab of the model.
The driverClassName is the JDBC driver class name; this will depend on what technology you are using.
The url holds the connection details to the database.
If you are unsure of the details you just need to go back into the Topology Manager and check the Data Server.
An example of a completed context.xml is :-
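As a sketch, a completed entry for a SQL Server staging database might look like this; the resource name, server, database and account details are all placeholders for my example. Note that the java:/comp/env/ prefix from the data source name is dropped here, as Tomcat automatically binds resources under java:comp/env.

<Context>
  <Resource name="sampleDS" type="javax.sql.DataSource"
    driverClassName="com.microsoft.sqlserver.jdbc.SQLServerDriver"
    url="jdbc:sqlserver://dbserver:1433;databaseName=ODISTAGE"
    username="odi_user" password="password"
    maxIdle="2" maxWait="-1" maxActive="4"/>
</Context>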
The next file that requires updating is web.xml that is located in the axis2 web application directory.
The information that requires adding can be placed anywhere between the <web-app> tags.
It is basically information that tells the axis2 web application the name of the datasource to map to the context.xml entry, and is in the format of :-
<resource-ref>
<description> </description>
<res-ref-name> </res-ref-name>
<res-type> </res-type>
<res-auth> </res-auth>
</resource-ref>
So in my example when it was completed it looked like this.
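Again as a sketch, matching the placeholder context.xml entry above:

<resource-ref>
  <description>ODI Data Service datasource</description>
  <res-ref-name>sampleDS</res-ref-name>
  <res-type>javax.sql.DataSource</res-type>
  <res-auth>Container</res-auth>
</resource-ref>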
Once the configuration files have been completed then the web application server will require a restart.
To test the Data Services I am going to use soapUI again; if you have not downloaded it I recommend doing so, as it is perfect for testing any sort of web service, full of functionality and free.
To test the service you will need to point soapUI to the WSDL; you can get the URL from the axis2 service list.
Selecting the service will point to the WSDL address.
Entering the WSDL URL into soapUI will retrieve the available operations.
So let's take one of the operations, getSampleData, as an example.
The SOAP request takes any of the columns from the DataStore; the primary key column is mandatory even though it says optional.
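Indicatively, a request querying on the key column is along these lines; soapUI builds the real skeleton from the WSDL, and the namespace, element and column names here are placeholders from a hypothetical SAMPLEDATA datastore:

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:ws="http://placeholder.namespace.from.wsdl">
  <soap:Body>
    <ws:getSampleData>
      <ws:SampleData>
        <ws:ID>1</ws:ID>
      </ws:SampleData>
    </ws:getSampleData>
  </soap:Body>
</soap:Envelope>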
Running the SOAP request will retrieve data from the Data Store against the defined ID number.
So with the different operations you can have all the functionality of deleting, adding and querying records.
A simple example of how this could be used: say you have users that enter manual adjustments into an essbase cube, and the current process is that the user sends in the data, which has to be loaded into the cube and then rolled up; this process needs to be automated and controlled.
Well, as long as the users have http access, a web page can be created.
Once the user has entered the information and submitted it, the data could be sent directly into a database table using the data service I have just gone through (using the addSampleData operation); to get an idea of how to write Java code to do this have a look at this blog.
So how about automating the process of loading the data into essbase? Well, if you read back to a previous blog I go through exactly how to do it by using CDC on a Datastore; with CDC any changes to a Datastore can be monitored, and when a threshold is met an interface could be run that loads data from the table into essbase; finally you could use the KM option to run a calc script after the data load to roll up part of the cube.
Now you will have an extremely simple automated adjustment load process.
Monday, 4 May 2009
ODI Series - Loading Smart List data into planning
Last time I went through how to load planning smart list data straight into essbase; it is my preferred method due to its flexibility, but it is not the official way of doing it. Just to make sure there is an even balance I will go through exactly the same example but use the planning layer and KM to load this information.
So we have exactly the same set of data, but the first thing we need to decide is how we are going to load it; when loading data through the planning layer you have to decide on a “Data Load Dimension” and a “Driver Dimension” with a list of members.
If you are going to be loading smart list data then the dimension that holds the smart list information has to be the driver dimension and the smart list members have to be the selected members.
This means the “Driver Dimension” is going to be the Account dimension, as it contains the members with the smart lists.
The Data Load Dimension is limited, as it cannot be the Year, Period, Scenario, Version or Currency dimension; as I am using the Sample application this leaves the Account, Entity and Segments dimensions, and since the Account dimension is going to be the Driver Dimension, the Data Load Dimension can only be Entity or Segments. In my example either will do, so I have chosen the Segments dimension.
With the decisions made, the next step is to set up the data load information in planning. Unfortunately, at the moment this is a manual process, which is one of the reasons this is not my preferred choice: when it comes to automation the last thing you want to do is start having to manually select information.
I know that with the outline loader that is part of the planning utilities in version 11 you can set up the data load information from a flat file; hopefully in the future the ODI KM API will be updated so it can be set through ODI.
This is accessed from Planning > Administration > Data Load Administration.
Select the Data Load Dimension and click Go.
This will give a list of available Driver dimensions from the drop down.
The list of members you are going to load data for with the driver dimension has to be manually selected; luckily in my example I am just going to be loading against two account members, but it would be lots of fun if you were loading data for a large number of members.
Once you click “Save” it will update the planning database with the information; if you are interested, it populates the table “HSP_DRIVER_MEMBER”, but this table is cached by planning so you can't just update it directly to try and automate the process.
The planning application will have to be reversed again in ODI; this is because there are currently no columns available to load data and driver information.
The reversing will create extra columns in the “Data Load Dimension”: “Data Load Cube Name”, which relates to the essbase database you are going to load the data into; “Point-of-View”, which is used to hold a comma separated list of the remaining dimensions; and finally a column for each driver member, so in my example the members “MakeDriver” and “Unit Sales”.
Remember if you add any extra driver members and want to load data against them you will have to repeat this process.
The interface can now be created as we have the source and target information ready.
The Segments dimension is directly mapped, as this is the “Data Load Dimension”. The “Data Load Cube Name” is mapped to ‘Consol’ as this is the name of the plan type (essbase database).
The remaining dimensions are put against the “Point-of-View” column and have to be comma separated; in this example I am using SQL Server as the staging area so I am using the + operator to concatenate, whereas if you are using Oracle you can use || to concatenate. It is also possible to use the Concat function in the expression editor.
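For example, on a SQL Server staging area the Point-of-View mapping might look like the following; the source column names are from my hypothetical source table and the members must cover the remaining dimensions of the application:

SRC.Period + ',' + SRC.Year + ',' + SRC.Scenario + ',' + SRC.Version + ',' + SRC.Currency + ',' + SRC.Entity

or the Oracle equivalent:

SRC.Period || ',' || SRC.Year || ',' || SRC.Scenario || ',' || SRC.Version || ',' || SRC.Currency || ',' || SRC.Entity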
As there is only one column for the data in the source, and the target requires it to be split into a column for each driver member, an expression can be used on each of the driver member columns.
The expression is basically saying: when the account member being loaded equals “MakeDriver”, use the value in the Data field.
The same expression was used in the mapping of “Unit Sales”, with “MakeDriver” updated to “Unit Sales”.
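Assuming SQL Server as the staging area and my hypothetical source column names, the two driver member expressions are along these lines:

CASE WHEN SRC.Account = 'MakeDriver' THEN SRC.Data END
CASE WHEN SRC.Account = 'Unit Sales' THEN SRC.Data END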
Running the interface will populate the essbase database with the correct smart list id value and data value. It is your choice which method you use: if your data is not going to change much and you are happy with the configuration then this method is perfectly acceptable, but if your metadata changes frequently and there are additional smart list members then it will require extra manual changes.