Friday, 27 September 2019

Data Management/FDMEE – Extracting data and member aliases

I was recently asked whether Data Management could display the alias for account members where the source is Oracle Financials. The reason behind the request was that users found it easier to understand and map data based on the source Financials account alias.

I am going to go through a possible solution and lead on to extracting data from a source application and producing an export file with alias members.

This post is going to be based on EPM Cloud but most of it can be replicated with on-premise FDMEE.

At first, I thought maybe the alias information was being pulled through from the source. If all columns are displayed in the workbench there is quite a bit of additional information available which is held in attribute columns.


This does not include any member aliases though.

In the source filter of the load rule there is an option to define the members to extract data for.


If the filter condition is selected this will open the member selector. Selecting List View will display the member and description.


So the type of member information I am after is being stored. Populating the member information is controlled by selecting “Refresh Members” from the target application.


It is a shame that the source values cannot be selected in the Data Load Mappings even though they are being stored; it is only possible to select target values.


Anyway, the member information is being stored in a database table named “AIF_TARGET_APPL_MEMBERS”. There is no access to this table in EPM Cloud, but it is possible to track down the details in on-premise.


There is currently no scripting available in Data Management, but there is an option to use SQL mappings, which means it should be possible to query the member name and return the alias.

To achieve this, you could add a lookup dimension to the target application. This dimension will hold the account alias members. As it is a lookup dimension it will not be loaded to the target.


The lookup dimension does not need to be mapped in the import format.


A SQL mapping can be added to the lookup dimension. The SQL will return the alias member based on the source account member.
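As a sketch only, the mapping expression might look something like the following. The `ALIAS` and `MEMBER_NAME` column names, and the placeholder IDs, are assumptions based on poking around the on-premise schema, since the table is not accessible in EPM Cloud:

```sql
-- #SQL mapping expression for the lookup dimension (sketch only).
-- Column names are assumptions taken from the on-premise FDMEE schema.
(SELECT M.ALIAS
   FROM AIF_TARGET_APPL_MEMBERS M
  WHERE M.MEMBER_NAME = ACCOUNT
    AND M.APPLICATION_ID = <application_id>
    AND M.DIMENSION_ID = <dimension_id>)
```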


I am sure the SQL could be improved; this is a simple example of how to go about it, and I am not saying it is the only or right way.

Once the mapping has been created the validation can be run again.


Now the data in the workbench contains the Financials account and alias. To keep the source information up to date, it is important to run refresh members.

This leads me on to a question that has been raised on numerous occasions: is it possible to extract data to a file, but populate it with the member alias instead of the member name?

Based on the logic that I have just gone through, it should be possible. Currently, neither the cloud nor on-premise offers an option to extract data by member alias. You could argue that mappings could be added to map the members in the extracted data to aliases; that would require a process to keep them in sync, and there is nothing wrong with going down that route. I just want to provide an alternative.

So, let us go through an example of extracting data from a source EPM Cloud Planning application and generate an output file including member alias.

I am going to base it on the “Data Export to File” functionality which I went through in a previous blog post; if you are not aware of everything it offers, have a read of that post. It should also be possible to use a custom target application.

I have created my target application which includes a dimension for account alias.


As with my previous example, I have added a SQL mapping to map the member name to alias.


In the data load rule, I have defined a download file name. You don’t have to do this if you are happy with downloading the .dat file version.


Once the rule has been run, the data in the workbench includes the account alias.


The output file can be downloaded from Process Details, the filename will be in the format of
<TARGET_APPLICATION>_<PROCESS_ID>.dat


As a filename was specified in the data load rule options, it can also be downloaded from the inbox/output explorer.


Alternatively, it can be downloaded using EPM Automate or REST.

The target file includes a column for account member name and alias.


If I only wanted to include the member alias the target application would only need to contain the required dimensions.


A similar SQL mapping can be added for each dimension.


The workbench includes both the source member and the alias as the target.

 

The output file will only contain the member aliases. I also defined pivot by period in the data load rule options.


Nice. How about automating the member refresh so the information stays in sync with the source application?

This is possible using REST. A post request can be made containing the application name in the body.


Unfortunately, this doesn’t work for Oracle Financials as ‘refresh members’ is processed in a different way, but it should work for most EPM-type applications.

This can be scripted and automated using a language of your choice.
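For example, a Python sketch using only the standard library could look like the following. The jobs endpoint path and the `REFRESH_MEMBERS` job type string are assumptions for illustration, so check the Data Management REST documentation for your release; the helper only builds the request, and actually sending it requires a live instance and valid credentials.

```python
# Sketch of automating the refresh members REST call (names are illustrative).
import base64
import json
import urllib.request

def build_refresh_request(base_url, user, password, app_name):
    """Build the POST request for the Data Management jobs resource."""
    url = f"{base_url}/aif/rest/V1/jobs"
    # The jobType value below is an assumption, not a documented constant.
    body = json.dumps({"jobType": "REFRESH_MEMBERS",
                       "applicationName": app_name}).encode()
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url, data=body, method="POST")
    req.add_header("Authorization", f"Basic {token}")
    req.add_header("Content-Type", "application/json")
    return req

if __name__ == "__main__":
    req = build_refresh_request("https://cloud_instance", "user", "pass", "Vision")
    with urllib.request.urlopen(req) as resp:  # requires a live instance
        print(json.load(resp))
```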


After executing the script, you will be able to validate the refresh members process has been run in Process Details.


If you have Enterprise Planning, you can use Groovy and the REST API to run the refresh members process. I wrote a post detailing how to use the REST API with Groovy which you can read all about here.

In summary a Groovy business rule can be created which makes a REST call.


Once deployed the rule will be available in planning.


After running, Job Details contains some of the response from the REST call.



The process ID can be matched to confirm the refresh members process has been run.


To keep the member information in sync, the rule can be scheduled to run on a regular basis.


This means that when the data extract is run it will be populated with up to date member aliases.

I am going to leave it there, hopefully you have found this useful. Until next time…

Monday, 9 September 2019

EPM Cloud – Managing journals with REST

I previously put together a series of posts covering journals in EPM Cloud Financial Consolidation and Close. The posts provided an overview of journals and detailed the workflow process before moving on to look at the options for loading journals through the simplified UI and Data Management.

If you have not read the posts and are interested they are available at:

EPM Cloud - Journals and Data Management - Part 1
EPM Cloud - Journals and Data Management - Part 2
EPM Cloud - Journals and Data Management - Part 3
EPM Cloud - Journals and Data Management - Part 4

At the time of writing them there were no options to manage journals through the REST API or EPM Automate. In the ever-evolving world of EPM Cloud, recent functionality has been added which allows you to interact with Journals through the REST API. I am sure related commands will be introduced to EPM Automate in a future release.

It is now possible to carry out the following actions through the REST API.
  • Retrieve Journals
  • Retrieve Journal line item details
  • Submit, Approve, Post, Unpost and Reject Journals
  • Open or close Journal periods.
As I have a keen interest in REST and have been covering the rise of the APIs for the last four years, I thought it would be worth having a look at what is currently possible with journals.

I am not going to go through the basics again, so there will be an assumption you understand what REST is all about. For demo purposes I will be using a free browser-based REST client, and for scripting it will be PowerShell. Please feel free to select your preferred option as I am not suggesting those are the ones to use; you can work with REST APIs in most scripting languages, so pick the one that you feel most comfortable with.

To start off, I have created three journals which have the status of “Working”. Workflow is enabled so the journals will need to go through the different states before being posted.


To retrieve the list of journals there is a REST resource available. A GET request can be made against a URL with the format:

https://<cloud_instance>/HyperionPlanning/rest/v3/applications/<app_name>/journals?q=<filter>

The filter, which is in JSON format, defines which journals to return; it must include scenario, year, period and journal status. The valid statuses depend on whether workflow is enabled, but the possible values are Working, Submitted, Approved and Posted.

The query filter can also include entity, label, description and group.
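If you prefer Python over PowerShell, the filter and URL can be built with the standard library like this; the function name is illustrative and the instance and application names are placeholders:

```python
# Build the journals query URL from the mandatory and optional filter fields.
import json
import urllib.parse

def journals_url(base_url, app_name, scenario, year, period, status, **extra):
    """Build the GET URL for the journals resource.

    scenario, year, period and status are mandatory in the filter;
    entity, label, description and group can be passed as keyword
    arguments if required.
    """
    q = {"scenario": scenario, "year": year, "period": period,
         "status": status, **extra}
    query = urllib.parse.urlencode({"q": json.dumps(q)})
    return (f"{base_url}/HyperionPlanning/rest/v3/applications/"
            f"{app_name}/journals?{query}")

# Example:
# journals_url("https://<cloud_instance>", "Vision",
#              "Actual", "FY19", "Sep", "WORKING")
```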

To retrieve the journals I have created, I can make a request to the following URL:

https://<cloud_instance>/HyperionPlanning/rest/v3/applications/Vision/journals?q={"scenario":"Actual","year":"FY19","period":"Sep","status": "WORKING"}

Using a REST client this provides a response of


As you can see, there is a lot of useful information returned for each journal which can then be processed; for example, you could generate a list of journals by label and description which have been posted for the current period.

By default, 5 journals will be returned. This can be controlled by including a limit parameter in the URL. If 10 journals are to be returned the URL could be:

https://<cloud_instance>/HyperionPlanning/rest/v3/applications/Vision/journals?q=<filter>&limit=10

The response will include a maximum of 10 journals.


If there are more journals than the limit, the “hasMore” parameter value will be “true” and you can use “offset” or increase the limit to return the remaining journals.
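A paging loop following that pattern could look like this in Python. Here `fetch_page` stands in for whatever performs the authenticated GET and returns the parsed JSON, and the `items` key is an assumption based on how other EPM REST resources wrap their results:

```python
# Page through journals using the "limit", "offset" and "hasMore" pattern.
def all_journals(fetch_page, limit=10):
    """Collect every journal by requesting pages until hasMore is false."""
    journals, offset = [], 0
    while True:
        page = fetch_page(limit=limit, offset=offset)
        journals.extend(page.get("items", []))
        if not page.get("hasMore"):
            return journals
        offset += limit
```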

The journal URL that is returned is a direct link to the journal. If you enter the URL into an authenticated browser it will open the journal and display the line items.

For example:


To convert this into scripting does not require much effort. The following script returns the journals for a month that have the “Submitted” status; only the journal label and description are output.


The script can be easily updated to output any of the details returned in the response. This information could be processed further or possibly sent to a list of people.

Just to highlight, if you don’t include the status in the filter section of the URL then the response returned is:


If one of the required dimensions is not included in the filter:


This REST resource only provides summary journal information; if you require the line item details, there is another resource available.

A GET request can be made to a similar URL which includes the journal label; the format for this is:

https://<cloud_instance>/HyperionPlanning/rest/v3/applications/<app_name>/journals/<journal_label>?q=<filter>&lineItems=true

If you don’t include the “lineItems” parameter then the default is false, which means only summary information is returned, like the previous resource.

The filter must include scenario, year and period.
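In the same style as before, a Python helper could build the line item URL; the function name is illustrative, and the journal label is percent-encoded, which browsers and REST clients will usually do for you:

```python
# Build the URL for a single journal, optionally including line items.
import json
import urllib.parse

def journal_detail_url(base_url, app_name, label, scenario, year, period,
                       line_items=True):
    """Build the GET URL for one journal; lineItems defaults to true here."""
    q = json.dumps({"scenario": scenario, "year": year, "period": period})
    query = urllib.parse.urlencode({"q": q,
                                    "lineItems": str(line_items).lower()})
    return (f"{base_url}/HyperionPlanning/rest/v3/applications/{app_name}"
            f"/journals/{urllib.parse.quote(label)}?{query}")
```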

As an example, let us take the journal “Rent Expense Adjust” which has the following line items.


The URL to retrieve this journal with line item details:

https://<cloud_instance>/HyperionPlanning/rest/v3/applications/Vision/journals/Rent Expense Adjust?q={"scenario":"Actual","year":"FY19","period":"Sep"}&lineItems=true

Using a REST client this returns the following information


Compared to the previous REST resource, the response not only includes the total debit and credit for the journal but also a breakdown of the line items.

I did notice an issue with intercompany being returned as null instead of the member name and a slight spelling mistake with “description”.

In terms of scripting, I have modified my previous example to return journal information including line item details.


Now that I have covered retrieving journal information, let us move on to updating the journal status.

This time the REST resource requires the POST method to a URL with the format:

https://<cloud_instance>/HyperionPlanning/rest/v3/applications/<app_name>/journals/<journal_label>/actions

It also requires parameters in JSON in the body of the request which must include scenario, year, period and action.

The action can be “Submit”, “Reject”, “Approve”, “Post” or “Unpost”.

Let us submit the “Rent Expense Adjust” journal which currently has a status of “Working”.


The response will provide details of the new status.


In the UI, the journal now has a status of “Submitted”.


This can easily be replicated by modifying my existing script.
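My script is in PowerShell, but the same request can be sketched in Python with the standard library. The helper below only builds the request; sending it needs a live instance and valid credentials, and the function name is illustrative:

```python
# Build the POST request for the journal actions resource (sketch only).
import base64
import json
import urllib.parse
import urllib.request

ACTIONS = ("Submit", "Reject", "Approve", "Post", "Unpost")

def journal_action_request(base_url, app_name, label, scenario, year,
                           period, action, user, password):
    """Build the POST request to change a journal's workflow status."""
    if action not in ACTIONS:
        raise ValueError(f"action must be one of {ACTIONS}")
    url = (f"{base_url}/HyperionPlanning/rest/v3/applications/{app_name}"
           f"/journals/{urllib.parse.quote(label)}/actions")
    body = json.dumps({"scenario": scenario, "year": year,
                       "period": period, "action": action}).encode()
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url, data=body, method="POST")
    req.add_header("Authorization", f"Basic {token}")
    req.add_header("Content-Type", "application/json")
    return req
```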


The next step in the workflow would be to approve the journal, but what if I skip a step and try to post the journal?


An error message is returned to highlight that a journal cannot be posted when it is in the state of “Submitted” as the workflow process must be followed.


If I try to approve the journal and the period is not open, the following error will be returned in the response.


In the UI you can see that period has not been opened.


This leads me nicely on to the final REST resource which provides the ability to open and close periods.

The URL format is:

https://<cloud_instance>/HyperionPlanning/rest/v3/applications/<app_name>/journalPeriods/<period>/actions

Once again this requires a POST method and parameters included in the body of the request. The parameters must include scenario, year, period and action.

The action can be either “Open” or “Close”.
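Sketched in Python, the URL and body for this resource can be built as follows; the function name is illustrative and the values are placeholders:

```python
# Build the URL and JSON body for the journalPeriods actions resource.
import json

def period_action(base_url, app_name, period, scenario, year, action):
    """Return the URL and request body to open or close a journal period."""
    if action not in ("Open", "Close"):
        raise ValueError("action must be 'Open' or 'Close'")
    url = (f"{base_url}/HyperionPlanning/rest/v3/applications/{app_name}"
           f"/journalPeriods/{period}/actions")
    body = json.dumps({"scenario": scenario, "year": year,
                       "period": period, "action": action})
    return url, body
```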

An example to open a period would be


This provides a response indicating the period is now open.


This can be confirmed in the UI.


An example to close a period using a modified version of the previous script would be


The period will now be closed.


Please note that if you try to close a period which has journals with a status of “Submitted” or “Approved”, the same response will be returned, so it will look like the period has been closed, but it will not have been.

If you try this in the UI you will be presented with a confirmation message.


I think that about covers the current journal REST API functionality. Until next time…