Monday 5 October 2015

Planning REST API part 2

In the last post I went through the core functionality available using the REST API in Planning; the post mainly focused on what is available in both on-premise Planning and PBCS.

I thought I would go through some of the other REST resources which are currently only available in PBCS, these being around Lifecycle Management (LCM).

If you have not read the last post then I recommend doing so before reading this one, as it may not make much sense otherwise.

It is fair to say that most of the REST API functionality is also contained within the EPM Automate utility, but there may be situations where you don't want to install the utility, you want to include access to the REST resources in your current business processes, or you want to script a process yourself.

As explained in the last post the planning REST resources are all accessed through the following URL structure:

http(s)://server:port/HyperionPlanning/rest/{api_version}/{path}

To access the Lifecycle Management resources the URL structure is:

http(s)://server:port/interop/rest/{api_version}/{path}

To find out the current API version, use a GET method against /interop/rest


I am once again using a REST client browser add-in for demo purposes.

As the REST APIs require basic authentication I added my PBCS username and password to the header; the add-in automatically encodes the username and password using base64.


The response was not what I was expecting as I definitely supplied the correct credentials.

The current REST API documentation and examples suggest that the username and password should be sent through the authorization header using basic authentication.

This threw me for a while until I watched the http traffic from the EPM automate utility and noticed that the format for the username should be:

identitydomain.username


Please note that is not my email address before you attempt to send mail to it :)

Since my post on the REST API, Oracle have been in touch with me and advised that the documentation will be updated in the future, so you will not suffer the same pain :)

After updating the header to the correct format the results were much more pleasing.
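As a minimal PowerShell sketch of the version check (the instance URL, identity domain and credentials below are hypothetical placeholders):

# Encode identitydomain.username and password for the basic authentication header
# (instance URL, identity domain and credentials are hypothetical placeholders)
$user = "mydomain.john.goodwin@test.com"
$password = "password"
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($user + ":" + $password))
$headers = @{"Authorization" = "Basic " + $encoded}

# A GET against /interop/rest returns the available REST API versions
$url = "https://planning-test.pbcs.us2.oraclecloud.com/interop/rest"
$response = Invoke-RestMethod -Uri $url -Method Get -Headers $headers
$response | ConvertTo-Json -Depth 4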


From the JSON response you can see that the current API version is 11.1.2.3.600

To return information on the services that are available, the following resource can be accessed:

/interop/rest/{api_version}



I am going to concentrate on the application snapshot resources, as this is where the main LCM functionality resides; this includes:
  • Get information about all application snapshots
  • Get information about a specific application snapshot
  • Upload/download files or LCM snapshots to the PBCS file repository
  • List/delete files in the PBCS file repository
  • Execute an LCM import/export
I am going to start off with uploading a metadata file to the PBCS repository so that a job can then be run to import the metadata.

To upload a file there is a POST resource available using the URL format:

/interop/rest/{api_version}/applicationsnapshots/{applicationSnapshotName}/contents?q={"isLast":"boolean","chunkSize":"sizeinbytes","isFirst":"boolean"}

{applicationSnapshotName} is the name of the file to be uploaded

The query parameters are:
chunkSize = size of the chunk being sent, in bytes
isFirst = set to true if this is the first chunk being sent
isLast = set to true if this is the last chunk being sent

The documentation does state the following:

The client calls the Upload REST API multiple times based on the size of file to be uploaded.
The client breaks the existing stream into a number of chunks, depending on the logic, so that each chunk size is not greater than 50 * 1024 * 1024 bytes.

I believe the size is based on the example code in the documentation, which breaks the upload into 52 MB (50 * 1024 * 1024 byte) chunks, and is not the maximum size that can be sent in a chunk.

It all sounds more complex than it actually is, so let me take an example using PowerShell to upload a CSV file. I am using PowerShell, like in my previous post, because it does not require much code, which is good for demo purposes, and most should have easy access to it for testing.


Using the Invoke-RestMethod cmdlet makes life simple: pass in the URL, the encoded basic authentication header (using domain.username for the username), and the file to be uploaded.

The content type needs to be set to “application/octet-stream” otherwise it will not work.

The response LCM status codes are:
0 = success
-1 = in progress
Any other code is a failure.
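Along the same lines as the script above, a minimal sketch of the upload call, assuming the file fits into a single chunk so isFirst and isLast are both true (file and instance names are hypothetical):

# Reusing the $headers built earlier; the base URL includes the API version
$restURL = "https://planning-test.pbcs.us2.oraclecloud.com/interop/rest/11.1.2.3.600"
$file = "C:\Temp\product.csv"
$chunkSize = (Get-Item $file).Length    # whole file sent in one chunk

$uploadURL = $restURL + "/applicationsnapshots/product.csv" +
    '/contents?q={"isLast":"true","chunkSize":"' + $chunkSize + '","isFirst":"true"}'

# The content type must be application/octet-stream or the upload will fail
$response = Invoke-RestMethod -Uri $uploadURL -Method Post -Headers $headers `
    -ContentType "application/octet-stream" -InFile $file
$response.status    # 0 = success, -1 = in progress, anything else = failure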

Checking the inbox/outbox explorer confirms the file has been successfully uploaded.


The only difference to download a file is that it is a GET method and no query parameters are required, unlike an upload. The format is:

/interop/rest/{api_version}/applicationsnapshots/{applicationSnapshotName}/contents
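
A minimal sketch of the download, reusing $restURL and $headers from the upload example and writing the response to a local file:

# GET the file contents and save them locally
$downloadURL = $restURL + "/applicationsnapshots/product.csv/contents"
Invoke-RestMethod -Uri $downloadURL -Method Get -Headers $headers -OutFile "C:\Temp\product_download.csv"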

You can also use the list files resource to return information about the files/folders in the PBCS repository.

This requires a GET method and the following resource URL:

/interop/rest/{api_version}/applicationsnapshots


There are two types: LCM, which is usually an LCM snapshot (inbox/outbox/data are FDMEE related), and EXTERNAL, which indicates the file is not LCM related.

The last modified time (which will need converting) and size are only relevant for EXTERNAL file types.
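
A minimal sketch of listing the repository contents (the property names passed to Select-Object are my assumption of the fields in the JSON response):

# List all files/folders in the PBCS repository
$listURL = $restURL + "/applicationsnapshots"
$response = Invoke-RestMethod -Uri $listURL -Method Get -Headers $headers
# assumed response fields: name, type (LCM or EXTERNAL), size and last modified time
$response.items | Select-Object name, type, size, lastModifiedTime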

Deleting files is simple as well, as there is a resource available using the DELETE method and the following format:

/interop/rest/{api_version}/applicationsnapshots/{applicationSnapshotName}
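
For example, to delete the file uploaded earlier:

# DELETE the file from the PBCS repository
$deleteURL = $restURL + "/applicationsnapshots/product.csv"
$response = Invoke-RestMethod -Uri $deleteURL -Method Delete -Headers $headers
$response.status    # 0 = success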


Right, back to my original task: the first step was to upload a metadata file, which is done, and the next is to execute a job to import the metadata.


I have a job already defined in the console which will import the CSV file from the inbox into the Product dimension.

I covered running jobs in the last post and it is exactly the same concept to run an import metadata job.
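
As a minimal sketch, assuming the jobs resource from the last post with v3 as the api_version (the application name PLANAPP and job name Load Product are hypothetical):

# Execute the import metadata job through the Planning jobs resource
$planURL = "https://planning-test.pbcs.us2.oraclecloud.com/HyperionPlanning/rest/v3/applications/PLANAPP/jobs"
$body = '{"jobType":"IMPORT_METADATA","jobName":"Load Product"}'
$response = Invoke-RestMethod -Uri $planURL -Method Post -Headers $headers -ContentType "application/json" -Body $body
$jobId = $response.jobId    # store the job ID to check the status later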


Once the job ID has been stored you can check the status and also additional details about the job.
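A minimal sketch of the status check, reusing $planURL and $jobId from above (the descriptiveStatus field is my assumption of the response):

# GET the job details using the stored job ID
$response = Invoke-RestMethod -Uri ($planURL + "/" + $jobId) -Method Get -Headers $headers
$response.descriptiveStatus    # e.g. Processing or Completed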


To clear any doubts, the job console can be checked within the Planning application.



Once the metadata has been loaded successfully you could execute another job to refresh planning by just changing the job type and job name.

Right, let us move on to the LCM export/import functionality, which will repeat an export/import of an already defined snapshot.

There are two POST method resources available which use the following format:

LCM Export
/interop/rest/{api_version}/applicationsnapshots/{applicationSnapshotName}/migration?q={type:"export"}

LCM Import
/interop/rest/{api_version}/applicationsnapshots/{applicationSnapshotName}/migration?q={type:"import"}

So basically all you need to do is change the query parameter type depending on whether you are performing an export or an import.

To show how to use this functionality I am going to perform a repeat export of an LCM snapshot called LCM_PLANAPP.


In all of my examples up to now I have put the full URL together depending on what resource I am executing, which is fine, but there is another way to generate this by returning information about the snapshot.

I did use the resource earlier but did not show everything that can be returned. Just to recap, the resource to use is:

GET /interop/rest/{api_version}/applicationsnapshots/{applicationSnapshotName}


Depending on the process you want to carry out on the snapshot, you can return the full URL to use and the method.

I can put this into practice when running the LCM export.


The first section of the script returns all the information about the application snapshot.

The URL to run an export is then stored and used in the next REST call.

The LCM export status is displayed, which is -1, so it is still processing.

The script waits for 10 seconds and then checks the status again, which this time returns 0, meaning it has completed.

To run an import, all that would be required would be to change “export” to “import” on line 18.
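
For reference, a minimal sketch of the same flow; the rel values used to pick the export and status URLs out of the returned links are assumptions about the response structure:

# Return all the information about the application snapshot, including operation URLs
$snapURL = $restURL + "/applicationsnapshots/LCM_PLANAPP"
$info = Invoke-RestMethod -Uri $snapURL -Method Get -Headers $headers

# Store the POST URL for the export from the returned links (rel value is an assumption)
$exportURL = ($info.items.links | Where-Object { $_.rel -eq "export" }).href

# Execute the LCM export
$response = Invoke-RestMethod -Uri $exportURL -Method Post -Headers $headers

# -1 means still processing, so wait 10 seconds and check again using the
# status URL returned in the export response (rel value again an assumption)
while ($response.status -eq -1) {
    Start-Sleep -Seconds 10
    $statusURL = ($response.links | Where-Object { $_.rel -eq "Job Status" }).href
    $response = Invoke-RestMethod -Uri $statusURL -Method Get -Headers $headers
}
"LCM export completed with status " + $response.status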

That pretty much covers most of the LCM REST functionality, so you should now be able to upload/download/list/delete files, run LCM exports/imports, and combine them with Planning jobs.

There are some REST resources available for FDMEE but I am going to cover them in a future post where I will look at possible options when using web services and FDMEE.

12 comments:

  1. John, did you try the import_data job? I'm getting the following error:
    Unable to load the following resource: An error occured attempting to load Essbase with EssUpdateFileEx: "null", for application "appname", cube "Plan1", data file "/u02/user_projects/instance0/tmp/input.txt".
    The same job works fine if I trigger that from PBCS.

    Another annoying thing is that right after the execution, the console shows an error which says "Unable to location session", and I've to close the session and open it again.

    I'm going to raise an SR for all these issues :(

  2. Celvin, I tried with a planning source type data load and it worked but have not tried with an essbase one, will give it a go when I get the chance.

  3. John
    I tried again today and it seems like it is working now!!!!

  4. Hi John, did you test the planning unit API? Can you show some cases? Thanks

  5. Hi John,
    You mentioned REST resources for FDMEE. Do you know where I can find information about the REST resources for FDMEE?

    Thanks

  6. Hi David, it is only the REST API for PBCS that contains resources for FDMEE; these are contained in the PBCS documentation.

  7. Hi John,

    Is it possible to edit a particular cell value by writing an MDX query and posting it as data for a combination of row, column and POV?

  8. Hi John,

    Did you find the resources to list Substitution Variables and to change them (POST)?
    This functionality was added in one of the latest releases of EPM Automate, so it is logical to expect a resource exists.

    Thanks
    Yury

  9. Hi Yury,

    Yes it is possible to list substitution variables and update them using REST resources.

  10. Can we fetch a user variable value using the REST API?

