Friday, 5 July 2019

EPM Cloud – Recent Web service and Groovy additions

There have been a few updates to EPM Cloud recently that are definitely worth highlighting. New functionality has been added, allowing connections to Web services to be created. These on their own are not much use, but with new classes added to the Groovy API, it is now possible to interact with these Web Services. This opens a whole new array of possibilities.

In the 19.06 release the following was contained in the ‘what’s new’ document:

“Service Administrators can now create connections to external Web services and specify optional query or header parameters while defining external connections. These connections can be referenced or used in a Groovy script to create a communications link between the Groovy script and the external Web resource.”

It was certainly possible to add new connections, but there was not much information on the Groovy side, and the API documentation was lacking in detail.

Moving on to the 19.07 release, the ‘what’s new’ document had the following snippet:

“Planning Groovy object model APIs have been enhanced with the ability to call external REST APIs.”

This time the documentation contained details of new Groovy classes to work with REST APIs.

As you have probably noticed, I have been blogging about REST APIs for a number of years now, so this new functionality has really taken my interest.

If you think about it just from an EPM Cloud perspective, there has been a large investment in developing and enhancing the REST API capabilities; EPM Automate is built on top of the REST API. The new Groovy functionality allows you to call any of these REST resources. If you have multiple EPM Cloud instances, these resources can now be called from planning business rules, which removes some of the dependency on running them externally through scripting or EPM Automate. It doesn’t even have to be a separate cloud instance; the REST resources can be called against the same instance, which I will get on to later.

It doesn’t stop there: with the majority of cloud providers having a REST API of some sort, this opens up lots of new areas to explore. This could be loading or exporting data or metadata to a cloud instance, or running integration or administrative tasks. You could have a cloud-based ERP system that you want to interact with, or maybe OAC, which is heavily built on REST and which I have covered in previous posts.

There is a downside though: this is based on Groovy, which is only available in Enterprise Planning. It is frustrating that Groovy is restricted to the Enterprise Cloud Service; I don’t understand why it can’t be included with Standard, and I know that I am not alone in feeling this way. Hopefully Oracle will one day see sense and allow it in both Cloud Services.

Going through a couple of examples should provide a greater insight into these new features and make it clearer how they operate.

Let us start out by calling a REST API resource against the same planning instance. To keep it simple I will call the resource that returns basic information about the application.

Using a REST client, I can make the following GET request.
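The request is a simple GET against the applications resource; it takes something like the following form (the instance name is a placeholder, and the response is abridged to the parts used later — the application name lives under the items array):

```
GET https://<cloud_instance>/HyperionPlanning/rest/v3/applications

{"items":[{"name":"Demo", ... }]}
```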


This same REST call can be achieved using a Groovy script. There are two ways to make an HTTP/HTTPS connection: an on-demand connection, which requires the full URL and username/password to be included in the script; or a named connection, which can be set up from Connections in the planning application. I am going to concentrate on the named connection, as it does not expose the credentials in the script and is better for maintenance because it can be reused across multiple scripts.

If you navigate to Connections and create a new connection, you will see an option for “Other Web Service Provider”.


The connection name will be the one that is referenced from the Groovy script.

The URL can be the fully qualified external name for the cloud instance. As I am going to run the REST calls against the same instance and know the port the web server is running on, I have defined the URL with localhost. It works well and doesn’t require the URL to be updated if the connection is migrated across instances, so I am sticking with it until Oracle say otherwise :)

Under advanced options, parameters and headers can be defined. These can also be set in the Groovy script, but if it is something that is going to be used in all calls then it probably makes more sense to define it in the advanced options.


I could have put the full URL to the REST resource in and had different connections for different types of calls, but as each connection requires credentials, it would become a maintenance headache to keep updating them when the password changes. It is one of the downsides of connections that once the password has changed it has to be manually updated.

Take note that to use all the REST resources, it should be a service administrator account that is defined in the connection. Also think about who has access to the Groovy rules that will be calling the REST resources, as behind the scenes it will be the administrator account being used.

Before I get on with the Groovy side of things, I want to point out that I am not a Groovy programmer and there will definitely be different ways to write the code. I just want to try and keep things simple, and I will let you get carried away with the coding. I will say that Calculation Manager has definitely not been developed as an IDE for writing Groovy, and patience can be required. It can be challenging when you don’t know all the classes that are available, and with static type checking enabled it can push you to the limit :)

Replicating the earlier REST request, which returns application information, is extremely simple in Groovy.
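A minimal sketch of the script, assuming the named connection created earlier is called "EPM Cloud" (the connection name is the only piece you should need to change):

```groovy
// retrieve the named connection defined in the planning application and
// make a GET request to the REST resource that returns application information
HttpResponse<String> response = operation.application.getConnection("EPM Cloud")
    .get("/HyperionPlanning/rest/v3/applications")
    .asString()

// print the body of the response so it can be viewed in the job console
println response.body
```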


In the above example, the connection details that were just configured are retrieved, and the URL is then concatenated with the path contained in the get method. A GET request is made and the response is returned as a string. The body of the response is then output and can be viewed in the job console.


You can see that the JSON response is the same as the one shown earlier using the REST client. The JSON response is not the easiest to read and process, so there are various classes available to manage JSON. One of these is "JsonSlurper", which I will use in the following example:
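Something along these lines (again assuming a named connection called "EPM Cloud"):

```groovy
import groovy.json.JsonSlurper

// make the same GET request as in the previous example
HttpResponse<String> response = operation.application.getConnection("EPM Cloud")
    .get("/HyperionPlanning/rest/v3/applications")
    .asString()

// parse the JSON body of the response into a map of key/value pairs
Map jsonMap = (Map) new JsonSlurper().parseText(response.body)
// extract the items key as a list; the first entry holds the application details
List items = (List) jsonMap["items"]
Map application = (Map) items[0]
// extract, store and print the application name
String appName = application["name"]
println appName
```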


It looks a bit messy because of having to define the types like Map and List, since static type checking is enabled. Basically, the JSON body of the response is converted into a Map, which means it will contain key/value pairs. The items key is extracted as a list, the first index entry is converted back to a Map, and the value from the name key is then extracted, stored and printed. I am sure it can be simplified, though with this type of JSON response I prefer to use different classes, which I will include in later examples.


The HTTP response also has methods to extract the headers, status code and text.
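For example, continuing with the response object from the previous scripts:

```groovy
// methods available on the HTTP response
println response.headers     // map of the response headers
println response.status      // HTTP status code, e.g. 200
println response.statusText  // HTTP status text, e.g. OK
```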


This produces the following output.


The status code is useful for checking whether the HTTP request returned successfully.

As mentioned earlier, there are two ways to make a connection; the named connection is what all the previous examples have been using. It is also possible to use an on-demand connection. Based on the same REST resource call, this can be achieved with something similar to:

// call REST resource to return application information
HttpResponse<String> response = connection("https://<cloud_instance>/HyperionPlanning/rest/v3/applications","username","password").get().asString()

My choice when using REST calls with authentication would be to use the named connection so the credentials are not displayed in the rule. It also makes more sense to manage the connection details centrally instead of having them in multiple rules.

The previous examples have provided the basics of making a REST request, but just returning the application information is not really that exciting or useful. For example, what if I wanted to run a Data Management rule from a form, one which loads data from Fusion Cloud, a file, another cube in the application, or even a different EPM Cloud instance? Soon I will be able to add on-premise sources to that list with the forthcoming EPM Agent. Anyway, Data Management has a REST API, which I have covered in previous blogs, so with a bit of Groovy scripting it is now possible to execute a data load rule from a business rule.

In the next example I am going to run a Data Management load rule from a form; the start period of the rule will be based on substitution variables. This can all be achieved with REST calls, and the concept is similar to something I covered in a previous blog, though this time there is no need for external scripting.

Once again to keep it simple, I have a basic form which has data populated up to May.


The idea is to run the rule to load data for June, with the period based on two substitution variables for current month and year.


The current month sub var value can be returned using the following REST request.


To return the current year sub var value just requires a change to the variable name in the URL.


The load rule can be executed with a POST request to the Data Management REST API URL; the body of the request contains the rule parameters in JSON.


Now we just need to replicate these REST calls in a Groovy script.

Let us start with the current month variable.
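A sketch of the script (the application name, sub var name and the "value" key in the response are assumptions for illustration; adjust them to match your application and variables):

```groovy
import org.json.JSONObject

// GET request to return the current month sub var information
HttpResponse<String> response = operation.application.getConnection("EPM Cloud")
    .get("/HyperionPlanning/rest/v3/applications/Demo/substitutionvariables/curMonth")
    .asString()

// extract the sub var value from the returned JSON and store it
JSONObject json = new JSONObject(response.body)
String subVarMonth = json.getString("value")
println subVarMonth
```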



It is pretty much the same as the earlier examples: a GET request is made to return the sub var information. This time I am using the "JSONObject" class to extract the sub var value from the returned JSON.

If the script is run, the sub var value will be written to the job console.


On to the current year variable.


This will produce:


The start period for the Data Management rule can now be generated from the stored variable values.
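As a sketch, assuming the sub var values were stored in variables named subVarMonth and subVarYear, and that the rule expects a Mon-YY style period (names and format are illustrative):

```groovy
// values previously returned from the sub var REST calls (illustrative)
String subVarMonth = "Jun"
String subVarYear = "19"

// build the Data Management start period, e.g. Jun-19
String startPeriod = "${subVarMonth}-${subVarYear}"
println startPeriod
```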


The start period is now in the correct format.


Next the body of the REST request can be created in JSON. Once again, I am using the “JSONObject” class to handle this.
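A sketch of building the body; the parameter names follow the Data Management run-rule REST API, while the rule name and parameter values are illustrative:

```groovy
import org.json.JSONObject

// start period generated earlier from the sub vars (illustrative value)
String startPeriod = "Jun-19"

// build the JSON body containing the rule parameters
JSONObject body = new JSONObject()
body.put("jobType", "DATARULE")
body.put("jobName", "DemoLoadRule")
body.put("startPeriod", startPeriod)
body.put("endPeriod", startPeriod)
body.put("importMode", "REPLACE")
body.put("exportMode", "STORE_DATA")
println body.toString()
```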


The JSON created matches the example shown earlier in the REST client.


A REST request using the post method which includes the JSON in the body can now be made.
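Something similar to the following, where body is the JSONObject holding the rule parameters built in the previous step:

```groovy
// POST to the Data Management jobs REST resource with the JSON body
HttpResponse<String> response = operation.application.getConnection("EPM Cloud")
    .post("/aif/rest/V1/jobs")
    .header("Content-Type", "application/json")
    .body(body.toString())
    .asString()
```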


The remaining part of the script checks whether the returned status code equals 200; if it doesn’t, an exception is thrown. This will return an error message to the user with the status code and text.

If the request was successful, the returned JSON information in the response is processed. The rule status code, rule status text and process ID are stored.

If the rule status code is not -1 (Running) or 0 (Completed), an exception is raised and an error message will be displayed to the user with the process ID and the rule status.
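A sketch of those checks; the status, jobStatus and jobId keys follow the Data Management jobs REST response as I understand it, and response is the object returned by the POST:

```groovy
import org.json.JSONObject

// fail with a message to the user if the HTTP request itself was not successful
if (response.status != 200)
    throwVetoException("Error executing Data Management rule - ${response.status} ${response.statusText}")

// process the returned JSON and store the rule status and process ID
JSONObject jobJson = new JSONObject(response.body)
int ruleStatus = jobJson.getInt("status")
String ruleStatusText = jobJson.getString("jobStatus")
long processId = jobJson.getLong("jobId")

// -1 = Running, 0 = Completed; anything else means the rule failed
if (!(ruleStatus in [-1, 0]))
    throwVetoException("Data Management rule failed - Process ID: ${processId} - Status: ${ruleStatusText}")
```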


If the script is run and the Data Management load rule is executed successfully, the job console will contain the following.


If it is not a long-running rule, you could enhance the script so that it keeps checking until the rule completes.
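One way of sketching that polling, assuming ruleStatus and processId were stored after the POST and that job status can be retrieved with a GET to the same jobs resource (the 5 second interval and maximum number of checks are arbitrary — a business rule should not be left waiting on a genuinely long-running job):

```groovy
import org.json.JSONObject

// poll the Data Management job until it is no longer running
int checks = 0
while (ruleStatus == -1 && checks < 60) {
    sleep(5000)  // wait 5 seconds between checks
    HttpResponse<String> statusResponse = operation.application.getConnection("EPM Cloud")
        .get("/aif/rest/V1/jobs/${processId}")
        .asString()
    ruleStatus = new JSONObject(statusResponse.body).getInt("status")
    checks++
}
```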

In Data Management or Data Integration process details you will see the rule has been run.


Now all that is left to do is add the rule to the form. I added a menu option to the form which runs the rule.


Selecting the menu option then runs the Groovy script business rule.


Once the data load rule completes, the data will have been loaded and will be viewable in the form.


To provide an example of the data load rule failing, I updated the script to include an invalid URL.



I am going to leave it there for today. In the next post I will go through connecting to an external Web service which returns exchange rates; these will then be loaded directly into planning.

3 comments:

Rob Maller said...

Hi John, nice post. Being able to use REST from calc scripts opens up some nice options for me.

I have got rules and data management jobs working nicely - but have not been able to update Sub Vars, the code updates them but throws an error (I assume because there isn't an appropriate response). Did you manage to get this working?

Thanks

John Goodwin said...

An exception is raised because there is no JSON content returned. What you can do is catch the exception. The exception will contain "returned a response status of 204 No Content", you can check the exception message contains 204 and then you know it has worked.
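In script form that suggestion would look something like this; the endpoint and verb shown are placeholders, so substitute whatever update call you are already making:

```groovy
try {
    // make the sub var update REST call here; a 204 No Content
    // response raises an exception even though the update worked
    HttpResponse<String> response = operation.application.getConnection("EPM Cloud")
        .put("/HyperionPlanning/rest/v3/applications/Demo/substitutionvariables/curMonth")
        .asString()
} catch (Exception e) {
    // treat 204 as success, rethrow anything else
    if (!e.message.contains("204"))
        throw e
}
```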

Biswarup Bhattacharyya said...

Hi John,

That's been quite a writing. Thanks for that. I have this set up ready and everything is running as expected. It would be great to add some dependencies and wait till completion of a previous job. Trying to build something around that. In case you already thought of something it would be great to know.

Biswarup