Sunday 14 August 2016

Load testing with the EPM Automate replay command – Part 2

In the last part I went through setting up Fiddler and the process to create an HTTP archive (har) file after recording Smart View actions.

Once the “har” file(s) have been created the next part of the process is to create a replay file.
The documentation provides the perfect insight into what a replay file is.

“A replay file is a CSV file that lists the credentials (user name and password) and the name of the HAR files that are to be run to load the system using the replay EPM Automate Utility command. Ensure that the user name and password that you specify has the rights to run the activities included in the HAR file.

On executing the replay command, the EPM Automate Utility runs each row in the replay file in parallel to exert load on the service. For example, if your replay file contains 10 rows, the utility replays 10 sessions so that you can perform tests to verify that user experience is acceptable when the service is under specified load. Each activity included in the HAR file is run serially.”


Basically, every row in the replay file will be executed in parallel, so the more rows, the more load will be generated.

The format for the replay file is:

username,password,harfile
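For example, a single-row replay file could look like this (the credentials and file name are just placeholders):

jdoe@example.com,Password1,form_entry.har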


So in the above example I am going to run one session, and the “har” file contains a recording of opening a form, entering data, and saving the form (which also fires off a business rule), before logging out of the session.

The next step is to run the replay command using the EPM Automate utility.

The format for the command is:

epmautomate replay replay_file_name.csv duration=N [trace=true] 

replay_file_name.csv contains the “har” files and user credentials

duration is the number of minutes to run the load test for; if all the actions in the “har” file complete before the end of the duration, they are run again until the duration is up.

trace is optional and creates XML files containing each post and response carried out as part of the replay; you would only really use this to diagnose issues.


In my example the replay is going to run for a duration of 3 minutes and the output written to a csv file.

I am writing the output to a csv file as it opens up well formatted in Excel for further analysis.
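So, assuming a replay file named replay.csv, the command looks something like this, with a simple shell redirect to capture the output:

epmautomate replay replay.csv duration=3 > replay_output.csv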


The output is pretty detailed and breaks down each Smart View action with a description and duration.

The replay command does not take into account the time a user would take to carry out an action, so as soon as one action has completed it moves on to the next. There are no options to add delays or think time, but I suppose this is not a problem if you are just trying to produce load on the system.

You can see at the bottom of the output log that once the actions completed they were repeated, as the duration had not yet been reached.

The longest duration in this test is saving the form, as this also executes a business rule.

The total sequence of actions was run six times over the three minutes; I averaged out the total time and the time to save the form and run the rule.


Next I recorded a similar set of actions against a different form, produced the “har” file and added it to the replay file.


The replay command was run for the same duration and results collated.


I now have the average times for two sets of actions that have been run independently, so how about running them in parallel?

All that is required is to include both “har” files in the replay file.


The replay command was run again for three minutes and the average times calculated.


The average times are higher with just two sets of actions being run.

I decided to record running a resource-heavy rule from Smart View and add that to the replay file.


The replay command was run using the same parameters.


Now you can see that the average times are starting to take a bigger hit as more load is being put on the system.

I could keep adding more entries to the replay file to create more load, but hopefully you get the picture of how the replay command can be used.

Anything that can be done in Smart View can be recorded and added to the testing. As I said earlier, you could have users actively using the application while the replay is being run to get a feel for how the system performs under load, and at the same time you will be getting stats back from the replay.

Once the replay has been set up it can easily be run from different client locations to validate whether the timings are similar or not.

I must stress this was just an example of using the replay command and is in no way a statement on how to go about load testing your application.

I was going to end the post here, but I thought I might as well include a bit of technical detail on how the replay command works. This might not be for everybody, so if you are not interested, until next time… :)

I mentioned earlier that the “har” file contains all the actions carried out in Smart View in JSON format, and in simplistic terms the replay command runs through the file, extracts the relevant information and then posts this to the Smart View URL.

The main obstacle the command has to deal with is that Smart View posts require two unique identifiers for authentication purposes: the single sign-on (SSO) token, which is generated first, and a security identifier (SID), which is used in further communication. Both are generated at session time, which means the recorded set of Smart View actions cannot just be played back from the “har” file, as authentication would fail.

First of all, a valid SSO token has to be generated, and this is done through a REST resource.

The username and password are read from the replay file.


The username and password are base64 encoded and added as a Basic authorisation header and the following REST resource is requested.


If authentication was successful, the response header will contain the SSO token.
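As a rough sketch in Python (using the requests library), the first step would look something like this; the exact REST path and the name of the header carrying the token are my assumptions, as they are not documented:

import base64
import requests

EPM_URL = "https://<epm_cloud_instance>"  # placeholder for the service URL

# credentials as read from the replay file (placeholders)
username, password = "jdoe@example.com", "Password1"

# base64 encode the credentials for the Basic authorisation header
credentials = base64.b64encode((username + ":" + password).encode()).decode()

# request the REST resource that generates the SSO token (path is an assumption)
response = requests.get(EPM_URL + "/interop/rest/security/token",
                        headers={"Authorization": "Basic " + credentials})

# on success the SSO token comes back in a response header (name is an assumption)
sso_token = response.headers.get("SSO_TOKEN")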


The SSO token is then stored and used in the XML body of a POST request to the Smart View URL.


What is interesting is that the replay command always uses the URL /interop/rest/smartview instead of the one used with Planning and Smart View, which is /HyperionPlanning/SmartView.

So a REST resource is being used to accept Smart View XML, which I presume is then posted on to the Planning Smart View servlet. I am guessing this is to get around having to mess around with Oracle Access Manager tokens in a cookie, which is the way it is done in Smart View.

The response from the above request contains the SID.
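Carrying the sketch on, the SID request would look something along these lines; the XML element names and the parsing pattern are purely illustrative, as the real payload is not shown here:

import re

# hypothetical XML body carrying the SSO token (element names are illustrative)
xml_body = "<req_ConnectToProvider><sso>" + sso_token + "</sso></req_ConnectToProvider>"

response = requests.post(EPM_URL + "/interop/rest/smartview",
                         data=xml_body,
                         headers={"Content-Type": "application/xml"})

# pull the SID out of the response body (illustrative pattern)
match = re.search(r"<sID>(.*?)</sID>", response.text)
sid = match.group(1) if match else None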


With the SSO token and SID stored, the replay command can now run through the “har” file and process the JSON.

Let us take as an example one of the Smart View actions, which opens an application; the output from the replay command has the following entry.


In the JSON there is a matching entry, which I have converted to object form so it is easier on the eye.
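The HAR format itself is standard, so each recorded action appears as an entry something like the following, trimmed down and with the Smart View XML payload reduced to a placeholder:

{
  "request": {
    "method": "POST",
    "url": "https://<epm_cloud_instance>/HyperionPlanning/SmartView",
    "postData": {
      "mimeType": "application/xml",
      "text": "<req_...>...</req_...>"
    }
  },
  "response": { ... }
}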


The replay command cycles through the JSON entries and first checks whether “url” contains "*/HyperionPlanning/SmartView".

If this is true, it then checks that “method” equals “POST”.

The “postData” “text” string is stored, and anything between the SID tags is replaced with the stored SID, while anything between the SSO token tags is replaced with the SSO token.

Now the text string is posted as XML to the REST resource https://<epm_cloud_instance>/interop/rest/smartview to replicate the Smart View action.

The headers in the response contain the information that is written to the output log.


X_EPM_FUNCTION = “screen” column in the output log.
X_EPM_ACTION = “action” column
X_EPM_OBJECT = “object” column

The next entry is read in and processed and so on until the end of the “har” file.

If the duration has not been exceeded the process is repeated until it has.
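Putting all of that together, and carrying on from the earlier sketches, the processing loop could be summarised like this in Python; replace_tokens stands in for the utility's real substitution logic, and the patterns it matches on are my assumptions:

import json
import re
import time

def replace_tokens(text, sid, sso_token):
    # swap the recorded identifiers for the ones generated at session time
    # (the patterns are assumptions - the real ones are not documented)
    text = re.sub(r"<sID>.*?</sID>", "<sID>" + sid + "</sID>", text)
    text = re.sub(r"<sso>.*?</sso>", "<sso>" + sso_token + "</sso>", text)
    return text

def replay_har(har_path, duration_minutes, sid, sso_token):
    with open(har_path) as f:
        entries = json.load(f)["log"]["entries"]

    finish = time.time() + duration_minutes * 60
    while time.time() < finish:  # repeat the sequence until the duration is up
        for entry in entries:
            request = entry["request"]
            # only Smart View POST requests are replayed
            if "/HyperionPlanning/SmartView" not in request["url"]:
                continue
            if request["method"] != "POST":
                continue
            body = replace_tokens(request["postData"]["text"], sid, sso_token)
            response = requests.post(EPM_URL + "/interop/rest/smartview",
                                     data=body,
                                     headers={"Content-Type": "application/xml"})
            # the response headers hold the values written to the output log
            print(response.headers.get("X_EPM_FUNCTION"),  # "screen" column
                  response.headers.get("X_EPM_ACTION"),    # "action" column
                  response.headers.get("X_EPM_OBJECT"))    # "object" column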

So that pretty much covers how the replay command operates.

If you are interested in load testing your EPM cloud environment, feel free to get in touch.
