Sunday, 7 October 2018

EPM Cloud - Recent additions to EPM Automate and REST API

In the EPM Cloud 18.10 release a few additional commands were added to the EPM Automate utility; these are also available through the REST API, as the utility is built on top of the API.

An annoyance for me with EPM Automate and the REST API has been not being able to rename a snapshot, even though it has always been possible through the web UI.


Not being able to rename outside of the UI made it difficult to automate archiving the daily snapshot in the cloud instance before the next snapshot overwrote the previous one. You could download, rename and upload, but this overcomplicates what should have been a simple rename.

With the 18.10 release it is now possible to rename a snapshot with a new EPM Automate command.

To rename a snapshot, the syntax for the utility is:

epmautomate renamesnapshot <existing snapshot name> <new snapshot name>

Using EPM Automate and a script, it is simple to rename the snapshot; in the following example the daily snapshot is renamed to include the current date.
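Something like the following Python sketch would do the trick; it assumes EPM Automate is installed and already logged in, and that the daily maintenance snapshot is named "Artifact Snapshot":

import subprocess
from datetime import datetime

# Assumptions: epmautomate is on the path and a login has already been performed,
# and the daily maintenance snapshot is named "Artifact Snapshot".
snapshot = "Artifact Snapshot"
archive_name = f"{snapshot} {datetime.now():%d-%m-%Y}"

# Rename the existing daily snapshot so the next maintenance run does not overwrite it.
result = subprocess.run(["epmautomate", "renamesnapshot", snapshot, archive_name])
if result.returncode != 0:
    raise SystemExit("Snapshot rename failed")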


This means the snapshot is now archived and the next daily maintenance will not overwrite it.


Please note, though, that there is a retention period for snapshots, which currently stands at 60 days, and a default maximum storage size of 150GB. If this is exceeded, snapshots are removed, oldest first, to bring the size back under 150GB.

The documentation does not yet provide details on how to rename a snapshot using the REST API, but I am sure it will be updated in the near future.

Not to worry, I have worked it out and the format to rename a snapshot using the REST API is:


If the rename is successful, a status of 0 will be returned.


In the UI you will see the snapshot has been renamed.


If the rename was not successful, a status that is not equal to 0 will be returned and an error message will be available in the details parameter.
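As a rough Python illustration of handling that response, the status and details parameters can be checked as below; the rename resource URL is just a placeholder for the format shown above, and I am assuming a POST:

import requests

# Placeholder values - substitute the rename resource URL shown above.
url = "https://<instance>/interop/rest/<api_version>/<rename resource>"
auth = ("<identitydomain>.<username>", "<password>")

response = requests.post(url, auth=auth)  # assuming a POST for the rename
json_out = response.json()

if json_out["status"] == 0:
    print("Snapshot renamed successfully")
else:
    # A non-zero status indicates failure; the error message is in the details parameter.
    print("Rename failed:", json_out["details"])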


The functionality will only rename snapshots and does not work on other file types.

It is an easy task to script the renaming of a snapshot using the REST API. In the following example I am going to log into a test instance and rename the daily snapshot, then copy the daily snapshot from the production instance to the test instance. This means the production application is ready to be restored to the test environment if needed, and the test daily snapshot has been archived.


The above section of the script renames the test snapshot; the next section copies the production snapshot to the test instance.

When calling the REST API to copy a snapshot, a URL is returned which allows you to keep checking the status of the copy until it completes.
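A sketch of that polling pattern in Python, with the URL and credentials as placeholders and assuming a status of -1 means the copy is still in progress, would be:

import time
import requests

auth = ("<identitydomain>.<username>", "<password>")
status_url = "<URL returned by the copy snapshot request>"  # placeholder

# Keep checking the returned URL until the copy is no longer in progress
# (a status of -1 is assumed to mean the job is still processing).
while True:
    json_out = requests.get(status_url, auth=auth).json()
    if json_out["status"] != -1:
        break
    time.sleep(15)

print("Copy finished with status", json_out["status"])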


Now in the test instance, the daily snapshot has been archived and contains a copy of the production snapshot.

 

It is also possible to copy files between EPM Cloud instances using the EPM Automate command “copyfilefrominstance”. This command was introduced in the 18.07 release and the format for the command is:

epmautomate copyfilefrominstance <source_filename> <username> <password_file> <source_url> <source_domain> <target_filename>
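For example, to pull a file called data.csv from a source instance into the instance you are logged into and rename it on the way, the call would be something like this (all of the names below are made up):

epmautomate copyfilefrominstance data.csv epm_user password.epw https://source-instance.oraclecloud.com source_domain archived_data.csv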

Achieving this using the REST API is very similar to my previous copy snapshot example.

Say I wanted to copy a file from the test instance to the production one and rename the file.


An example script to do this:


The file has been copied to the production instance and renamed.


When the 18.10 monthly readiness document was first published it included details about another EPM Automate command called “executejob”:

“executejob, which enables you to run any job type defined in planning, consolidation and close, or tax reporting applications”

This was subsequently removed from the document, but the command does exist in the utility.


The command simply looks to bypass having to use different commands to run jobs, so instead of having to use commands such as “refreshcube”, “runbusinessrule” or “runplantypemap” you can just run “executejob” with the correct job type and name.

For example, if I create a new refresh database job and name it “Refresh”:


The job type name for database refresh is “CUBE_REFRESH”, so to run the refresh job with EPM Automate you could use the following:
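Presumably something along these lines, using the job type and the job name defined above:

epmautomate executejob CUBE_REFRESH "Refresh"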


The command is really replicating what has already been available in the REST API for running jobs.

The current list of job types is:

RULES
RULESET
PLAN_TYPE_MAP
IMPORT_DATA
EXPORT_DATA
EXPORT_METADATA
IMPORT_METADATA
CUBE_REFRESH
CLEAR_CUBE


I am not going to go into detail about the REST API as I have already covered it previously.

The format for the REST API is as follows:


The response will include details of the job and a URL that can be used to keep checking the status.
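As a rough Python sketch of that flow (instance, application and credentials are placeholders, and I am assuming the standard Planning jobs resource and that a status of -1 means the job is still processing):

import time
import requests

base = "https://<instance>/HyperionPlanning/rest/v3/applications/<application>"
auth = ("<identitydomain>.<username>", "<password>")

# Submit the job - jobType and jobName match the job defined in the UI.
payload = {"jobType": "CUBE_REFRESH", "jobName": "Refresh"}
job = requests.post(base + "/jobs", auth=auth, json=payload).json()

# Poll the job status until it is no longer running (-1 is assumed to mean processing).
job_url = f"{base}/jobs/{job['jobId']}"
while job["status"] == -1:
    time.sleep(15)
    job = requests.get(job_url, auth=auth).json()

print("Job finished with status", job["status"], "-", job.get("details"))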


I was really hoping that the functionality was going to allow any job that is available through the scheduler to be run, for instance “Restructure Cube” or “Administration Mode”, but it looks like it is only for jobs that can be created. Hopefully that is one for the future.

In the 18.05 release a new EPM Automate command appeared called “runDailyMaintenance”, which allows you to run the daily maintenance process without having to wait for the maintenance window. This is useful if new patches are available and you don’t want to wait to apply them. In the 18.10 release the command includes a new parameter which provides the functionality to skip the next daily maintenance process.

The format for the command is:

epmautomate rundailymaintenance skipNext=true|false

The following example will run the maintenance process and skip the next scheduled one:
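Assuming the command follows the format above, the call would be something like:

epmautomate rundailymaintenance -f skipNext=true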


I included the -f to bypass the prompted message:

“Are you sure you want to run daily maintenance (yes/no): no?[Press Enter]”


The REST API documentation does not currently have information on the command but as the EPM Automate utility is built on top of the API, the functionality is available.

The format requires a POST method, and the body of the POST should include the skipNext parameter.


The response will include a URL to check the status of the maintenance process.


When the process has completed, a status of 0 will be returned.


It is worth pointing out that as part of the maintenance steps, the web application service is restarted so you will not be able to connect to the REST API to check the status while this is happening.
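Pulling that together, a rough Python sketch could look like the following; the maintenance resource URL is a placeholder, the status URL is assumed to be returned in the response links, and the retry handling allows for the period when the web application service is down:

import time
import requests

auth = ("<identitydomain>.<username>", "<password>")
maintenance_url = "https://<instance>/<daily maintenance REST resource>"  # placeholder

# POST with the skipNext parameter in the body.
job = requests.post(maintenance_url, auth=auth, json={"skipNext": "true"}).json()
status_url = job["links"][0]["href"]  # assumption: the status URL is returned in the links

# Keep polling until the process completes; connection errors are expected
# while the web application service is restarted mid-maintenance.
status = -1
while status == -1:
    time.sleep(60)
    try:
        status = requests.get(status_url, auth=auth).json()["status"]
    except requests.exceptions.RequestException:
        pass

print("Daily maintenance finished with status", status)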

Another piece of functionality which has been available through the REST API for a long time, but not EPM Automate, is the ability to return or set the maintenance window time.

To return the maintenance time, a GET method is required with the following URL format:


The “amwTime” (Automated Maintenance Window Time) is the scheduled hour for the maintenance process, so it will be between 0 and 23.

To update the scheduled time, a PUT method is required and the URL requires a parameter called “StartTime”.


If the update was successful a status of 0 will be returned.

You can then check the maintenance time has been updated.


The following script checks the current maintenance time and updates it to 03:00am.
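A rough Python equivalent, with the maintenance window resource URL as a placeholder and the amwTime and StartTime parameters as described above, could be:

import requests

auth = ("<identitydomain>.<username>", "<password>")
url = "https://<instance>/<maintenance window REST resource>"  # placeholder

# A GET returns the current maintenance hour in the amwTime parameter (0-23).
current = requests.get(url, auth=auth).json()
print("Current maintenance hour:", current["amwTime"])

# A PUT with the StartTime URL parameter updates the scheduled hour, here to 03:00am.
response = requests.put(url, auth=auth, params={"StartTime": 3})
if response.json()["status"] == 0:
    print("Maintenance window updated")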


I did notice a problem: even though the REST API is setting the time, it is not being reflected in the UI.


It looks like a bug to me. Anyway, until next time…

Sunday, 30 September 2018

Automating data flows between EPM Cloud and OAC – Part 1

In past blogs I have covered in detail automation in EPM Cloud using the REST API. Recently I have blogged comprehensively on the Essbase REST API in OAC, so I thought I would combine these and go through an example of automating the process of moving data between EPM Cloud and OAC Essbase.

The example will be based on extracting forecast data from PBCS using Data Management, downloading the data and then loading it to an OAC Essbase database. I will provide an option of downloading data directly to OAC from PBCS for those who have a customer-managed OAC instance; alternatively, for autonomous OAC the data can be downloaded from PBCS to a client/server before being loaded to Essbase.

I am going to break this into two parts: the first part covers the setup and the manual steps of the process, then the second part gets into the detail of automating the full process with the REST API and scripting.

Before I start I would like to point out that this is not the only way to achieve the objective and I am not stating that this is the way it should be done; it is just an example to provide an idea of what is possible.

To start out with, I am going to want to extract forecast data from PBCS, and here is a sample of the data that will be extracted:


To extract the data I am going to use Data Management; once the integration has been defined I can add automation to extract the data using the REST API.

As it is EPM Cloud, I will need to extract the data to a file and this can be achieved by creating a custom target application in Data Management.


The dimensions have been created to match those of the OAC Essbase database. I could have included scenario, but that is always going to be static so it can be handled on the Essbase side.


There are slight differences between the format of the Year in PBCS


and that in the Essbase database.


Aliases could be used but I want to provide an example of how the difference can be handled with period mappings in Data Management.


This means that any data against, say, FY19 in PBCS will be mapped to 2019 in the target output file.

If there are any differences between other members, these can be handled in the data load mappings in DM.

In the DM data load rule, source filters are created to define the data that will be extracted.


In the target options of the file, a fixed filename has been added; this is just to make the process of downloading the file easier. If this is not done, you would need to either capture the process ID from the REST response to generate the filename or read the filename from the jobs REST response. Both methods produce the same outcome but, in this example, I am going for the simpler option.


Before running the integration, I will need to know which start and end period to select.

For the automated process I am going to pick this up from a substitution variable in Essbase; it would be the same concept if the variable were held in PBCS, as both have a REST resource available to extract the information.


The data will be extracted for a full year, so based on the above sub var, the start period would be Jan-19 and the end period Dec-19.
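As a small illustration of that logic, deriving the start and end periods from the sub var could look like the following Python snippet; the Essbase variables resource path is my assumption and all the names are placeholders:

import requests

auth = ("<user>", "<password>")
essbase_rest = "https://<oac-instance>/essbase/rest/v1"  # base REST path is an assumption
app, db, subvar = "<application>", "<database>", "<sub var name>"

# Read the substitution variable value, e.g. "FY19" (resource path is an assumption).
url = f"{essbase_rest}/applications/{app}/databases/{db}/variables/{subvar}"
value = requests.get(url, auth=auth).json()["value"]

# FY19 -> start period Jan-19, end period Dec-19
year = value.replace("FY", "")
start_period, end_period = f"Jan-{year}", f"Dec-{year}"
print(start_period, end_period)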


Running the rule will extract the data from PBCS, map it and then produce an output file.


The rule ran successfully, so the exported file will be available in the inbox/outbox explorer.


Downloading the file shows the format of the exported data.


When I cover the automation in the next part I will provide two options, the first one will download the data file directly to OAC from PBCS and then load the data, the second will download the file from PBCS to a machine running the automation script and then stream load it to Essbase.

As this post is about manually going through the process, I have downloaded the file from PBCS and uploaded it to OAC Essbase.


The file has been uploaded to the Essbase database directory.


Now an Essbase data load rule is required to load the above file.

A new rule was created, and the uploaded data file selected.


The columns in the data file were mapped to the corresponding dimensions.


The data is always loaded to the forecast scenario member and is not contained in the file, so this was added to the data source information.


As I mentioned earlier, I could have easily included scenario in the data export file by adding the dimension to the target application in Data Management; it is up to you to decide which method you prefer.

Once created, the rule will be available from the scripts tab, under rules.


To run the rule, head over to jobs in the user interface and select “Load Data”.


The application, database, rule and data file can then be selected.


The status of the data load can then be checked.


As this is a hybrid database, there is no need to run a calculation script to aggregate the data; if aggregations or calcs were required, you could simply add this step into the process.


A retrieve on the data confirms the process from extracting data from PBCS to OAC Essbase has been successful.


You could apply this process in reverse, extracting data from OAC and loading to EPM Cloud; one way to do this could be to run an Essbase data export script, upload the export file to EPM Cloud, and run a Data Management rule to map and load the data to the target application.

We have a process in place, but nobody wants to live in a manual world, so it is time to streamline with automation, which I will cover in detail in the next part. Stay tuned!