Smart View and multiple pods, how can I simplify connections?

With the shift to the cloud, one of the major differences with Smart View is the shared connection URL. On-prem, it pointed to my production Workspace servers, and all of the environment connections were available to me as shared connections. I really only needed to change it if I wanted to work in development or test.

Now that most of my clients are moving to the cloud, they are struggling with connecting to environments that have multiple pods. Oracle has given us the ability to combine the web front ends via Connections, but that does not yet work for Smart View. Yes, clients can define multiple private connections within Smart View, but naming those connections consistently can be challenging for end users. The approach below solves that: it gives end users an intuitive way of identifying the connections from a single shared connection list. Here is a sample of it in action.

I know this video was very exciting, and I hope you enjoyed it. But yes, I connected to three different pods all defined in my shared connections; yes, there were nice friendly names for those connections; and because they were all in the same identity domain, I did not have to log in three times!!

How did I do this? I went back to an on-prem solution for Smart View.

 

Accessing Shared Connections from an XML File

 

I used this solution and applied it to the cloud to create a friendly shared connection experience. Here is an example of the syntax for what you just saw, with the domain names edited.

XML File
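The layout follows the shared-connections XML format from the Smart View documentation; since I can't show real client pods, the friendly names and URLs below are made up, so substitute your own:

<?xml version="1.0" encoding="UTF-8"?>
<res_GetProvisionedDataSources>
  <Product id="HP" name="ClientA-PROD" displayVersion="11.1.2.5">
    <Server name="Client A - Production" context="https://planning-clienta.pbcs.us2.oraclecloud.com/HyperionPlanning/SmartView"></Server>
  </Product>
  <Product id="HP" name="ClientA-TEST" displayVersion="11.1.2.5">
    <Server name="Client A - Test" context="https://planning-test-clienta.pbcs.us2.oraclecloud.com/HyperionPlanning/SmartView"></Server>
  </Product>
  <Product id="HP" name="ClientB-PROD" displayVersion="11.1.2.5">
    <Server name="Client B - Production" context="https://planning-clientb.pbcs.us6.oraclecloud.com/HyperionPlanning/SmartView"></Server>
  </Product>
</res_GetProvisionedDataSources>

The name attribute on each Server element is the friendly name users will see in the shared connections list, and context is the pod's Smart View URL.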

I then saved it as an XML file and pointed Smart View at it (Smart View ribbon, then Options, then Advanced, then Shared Connections URL).


 

In the above example I am accessing it via a local file; however, clients should probably host this file in a central location on an internal web server and reference it via URL so that everyone gets updates. I have tried hosting it in Workspace, but you need to log in to access the file, which defeats the purpose of the connection, so I would go with URL hosting or make the file part of a deployment package.

I hope this helps clients struggling to maintain multiple pods and Smart View connections. It has saved me so much time as I have a single file for all of my clients within one shared connection profile!

EPBCS Data Maps – How to improve the Headcount Transfer to Reporting

Data Maps is a great feature in PBCS that allows you to seamlessly move data between plan types and to your reporting databases. You can map Smart Lists to dimensions in ASO reporting cubes, converting those accounts to dimensions so that you can easily report in Smart View and Reports. It works great when you have similar dimensionality, but if you have extensive mapping requirements you should use Data Management to move the data between cubes. As we use EPBCS and take advantage of the out-of-the-box content, we have hit a few performance issues with data maps. Here is a recent solution we devised to resolve slow transfers of data between the Workforce cube and the ASO Workforce reporting cube.

OVERALL PERFORMANCE

Essbase is not great at extracting members that are tagged as dynamically calculated. You will see this when you execute a data map and all of a sudden a quick map is taking a considerable time to run. Most likely, dynamically calculated members were recently added to your selection. Luckily, you can exclude dynamically calculated members in the map. To enable this option, edit the data map and click the Options button. In the window that displays, check the option Exclude Dynamic Calc Members.

Then run the map again and it should be back to what you are used to. However, you may now be missing some data that you expect to be there. If you cannot work out which members are dynamically calculated, create a data export calc script with the same source point of view and the option DataExportDynamicCalc OFF. Run the script in Calculation Manager and the log tab will tell you which members were excluded from your export. You can then investigate how you can include them in your data map as stored members.
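If you have never built one, here is a minimal sketch of such a script; the POV members and file name are placeholders, so substitute the point of view from your own data map:

SET DATAEXPORTOPTIONS
{
DataExportLevel "ALL";
DataExportDynamicCalc OFF; /* skip dynamic calc members; the log lists what was skipped */
};

FIX ("FY18", "Plan", "Working", "OWP_Total Headcount") /* placeholder POV - match your data map's source */
DATAEXPORT "File" "," "hc_check.txt" "#Mi";
ENDFIX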

OWP_Headcount Data for Reporting

As your plan and forecast grow, you will see this map get slower and slower. It is created out of the box for the Workforce Planning module, and its purpose is to move the headcount data from the WFP cube to the ASO reporting cube supplied with the solution. The main reason it is slow is that the source of the headcount data is a dynamically calculated account.

 

The reporting cube and the WFP cube share the parent OWP_Total Headcount. In the WFP cube it is a dynamically calculated parent account, and in the ASO cube it is a level 0 account. The data map is not optimally designed, and you cannot simply select the Exclude Dynamic Calc option, because no data would move over.

Here is the solution I discovered.

Step one: edit the data map, then click the red X to the right of the Account dimension.

This will move the Account dimension to the Unmapped Dimensions section. Click on the hyperlink OWP_Total Headcount to open the member selector.

Clear the current selection and then select every level 0 member under OWP_Total Headcount except OWP_Departed Headcount.

Then, on the target side, select the account OWP_Total Headcount.

Then click Save and Close. You will get an error message saying you cannot save the map because OWP_Total Headcount is an invalid member.

This error is a defect, because in the ASO cube the member is level 0 and has no formula.

Since that member was created by WFP, you cannot edit any part of it. To work around the error, temporarily point the map to the account OWP_Compensation Expenses. This should let you save the map.

Now that it is saved, navigate to Variables. Create a substitution variable at the All Cubes level called TARGET_HC_ACCOUNT and set the value to OWP_Total Headcount.

Click OK, then edit the headcount map again. Edit the Account dimension on the target side of the map, enable the Substitution Variables selector, and select the substitution variable TARGET_HC_ACCOUNT.


Now when you click Save and Close the Data Map will save!!!

 

SHAZAM!!!!

Take a look at some results from a client app below:

The out-of-the-box method took almost 12 minutes, most of it spent extracting that dynamically calculated data. With the modifications it produced the same results in 9 seconds!

CONCLUSION

Dynamically calculated members are great for reporting, but they have no place in data exports, and they obviously do not belong in data maps! I hope this solution helps you with your implementation of EPBCS, or even in your own custom apps.

Thanks for reading. Have a great Memorial Day Weekend and remember:
"A hero is someone who has given his or her life to something bigger than oneself." – Joseph Campbell

Part 2 – Where is my import/export security utility? How to Import Security

UPDATE

After I posted this, I realized that I was not checking for XML special characters, and if you had one in your metadata the import into PBCS would error. I have corrected this and updated the downloadable script. Sorry if this caused any frustration.

This is part two of the import/export utility for PBCS. Extracting security is fine, but now we want to import the security back into our system. As you may remember, we extracted all of the group security into one comma-separated file using the PowerShell utility in the last post.

We are going to utilize LCM to import the security back, but we have to get the files and the archive into the correct format. Some key points to remember with the import:

  • LCM will import all of the user security from the file. So even though the export creates a file for each group, you can import it all in one file.
  • There does not seem to be a way to clear the security before you load it. This was possible in the on-prem utility. (I will keep playing with this.)
  • Based on the last point, the load is a merge with the security already in the system.

 

CONVERT CSV TO XML

PBCS requires a very basic but specific XML format. I have written a PowerShell script to convert the CSV into the correct format for Planning to accept. To utilize the solution you will need the two files located here: SEC2XML.

The download has two files. Please place these files in the same directory.

  • listing.xml – a file that will be used to build an archive that can be loaded to your PBCS environment.
  • sec2xml.ps1 – the PowerShell script that will convert your CSV file into the correct XML format and build an archive, based on your original download, that can be loaded back into PBCS.

The script accepts two parameters. The first is the path and name of the CSV file. The second is the path to the application folder inside the archive you originally extracted. A bit more detail on that parameter below:

This is an example of an unzipped LCM extract of just my security groups. The second parameter is looking for the path of the application folder in this structure; in the screenshot below it is HP-TOYS, where TOYS is the name of my Planning application.

Below is an example of the syntax to run the script:

./sec2xml.ps1 /Users/anthonymanfredi/Dropbox/POWERSHELL/SECURITY/securitymac.txt /Users/anthonymanfredi/Dropbox/POWERSHELL/BLOG/ZIP/HP-TOYS

  • ./sec2xml.ps1 = the name of the script.
  • /Users/anthonymanfredi/Dropbox/POWERSHELL/SECURITY/securitymac.txt = the path and name of my security CSV file.
  • /Users/anthonymanfredi/Dropbox/POWERSHELL/BLOG/ZIP/HP-TOYS = the unzipped application folder.

 

The script does quite a bit of magic, but here is a flow chart highlighting the steps:

The script converts the CSV file back into the proper XML format. Next, it uses your original zip extract to build a new archive for your security upload. I would recommend using the groups-only export to simplify the process. The script deletes any existing XML security files in the Groups folder and then places the newly created XML file in the directory. I delete the other files so that there is no contention on load and the latest updates come only from the new CSV file.

The next step copies listing.xml into the info folder. This file lets Planning know about the new export that you want to load.

Once that is complete, it zips up the structure and puts the file security_update.zip in the same directory you executed the PowerShell script from.
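For the curious, here is a heavily stripped-down sketch of that flow (not the published script); the XML element names and folder paths are illustrative guesses based on a groups-only extract, so check them against your own archive:

# Sketch of sec2xml.ps1's flow -- illustrative only, not the published script
param(
    [Parameter(Mandatory=$true)][string]$CsvFile,   # security csv to convert
    [Parameter(Mandatory=$true)][string]$AppFolder  # unzipped HP-<app> folder
)

$groupsDir = Join-Path $AppFolder 'resource\Security\Access Permissions\Groups'

# 1. Remove the old per-group xml files so the csv is the single source of truth
Remove-Item (Join-Path $groupsDir '*.xml')

# 2. Rebuild one combined xml file from the csv. Building nodes through
#    XmlDocument escapes special characters (&, <, >) -- see the UPDATE above.
#    The element names here are placeholders; copy the real ones from your extract.
$xml  = New-Object System.Xml.XmlDocument
$root = $xml.AppendChild($xml.CreateElement('accessControls'))
foreach ($row in Import-Csv $CsvFile) {
    $node = $root.AppendChild($xml.CreateElement('accessControl'))
    foreach ($col in $row.PSObject.Properties) {
        $child = $node.AppendChild($xml.CreateElement($col.Name))
        $child.InnerText = $col.Value
    }
}
$xml.Save((Join-Path (Resolve-Path $groupsDir) 'security.xml'))

# 3. Copy listing.xml into the info folder so LCM recognizes the archive
Copy-Item '.\listing.xml' (Join-Path $AppFolder '..\info') -Force

# 4. Zip the whole extract back up for upload
$extractRoot = Split-Path (Resolve-Path $AppFolder) -Parent
Compress-Archive -Path "$extractRoot\*" -DestinationPath '.\security_update.zip' -Force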

 

IMPORT THE SECURITY

Now that the archive is built, you can log in to the PBCS application and navigate to the Migration section.
Once you are there, select the Snapshots tab and upload your archive.

Now that it's uploaded, you should be able to navigate the archive and see the security like below:

Import the snapshot when you're ready, and the application security will be updated.
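As a side note, if you would rather script the upload and import than click through Migration, EPM Automate should be able to handle it. A rough sketch; substitute your own user, password, pod URL, identity domain, and snapshot name:

epmautomate login serviceadmin MyPassword https://planning-clienta.pbcs.us2.oraclecloud.com clientadomain
epmautomate uploadfile security_update.zip
epmautomate importsnapshot security_update
epmautomate logout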

Couple of notes:

  • You will need to delete the snapshot in PBCS, or rename it, when you run the process again. You can't have the same archive twice.
  • When you extract the original archive, place it in its own folder so the PowerShell script will not accidentally zip up extra files.

CONCLUSION

I hope this helps everyone with their security in PBCS. Please don’t try this for the first time in PROD!! Run it in test first. If you have any questions or problems let me know!!

Where is my import/export security utility???

When I first started working on Hyperion Planning, clients and consultants would call the hotline and ask how they could see their existing security, because back in the 1.0 days that was not available in the product. So I dove into the database and wrote a Microsoft Access database that gave consultants and clients the ability to see their security. We distributed the database and instructions to whoever asked nicely (or loudly) until the feature was written into the tool.

The Oracle EPBCS/FCCS platform does not come with a nice way of exporting and importing object security the way the on-prem version of Hyperion Planning did with its command-line utilities. I decided to take a stab at making it easier.

EXPORTING SECURITY

This entry will focus on exporting the current security of an application and formatting it into a CSV file so you can easily view or edit its contents.

I am going to focus on group security, since it is a best practice to apply security at the group level. You can follow this process for users as well.

The first step is to extract the current application security using the Migration component of EPBCS. You can use a complete application extract or just extract the security. Below, I am extracting just the group security.

Once the migration is executed, download it to your machine. Unzip the file and you will find your currently assigned security in super fun XML files, one for each group.

Here is what it looks like in XML.

It's pretty simple XML, but not great to edit. Now that you have the files, I have written a PowerShell script that will convert all of the files in a directory into one combined CSV file. By no means am I a PowerShell master, but I like it because it's extremely quick and handles XML, JSON, and CSV formats extremely well. Generally you will find it installed on most new servers and clients. Here is some documentation on enabling it on Windows.

It can also run on your Mac or Linux environment.

You can download the script here. Once you download it place it in a directory where you plan to execute it from.

The script accepts two parameters. The first is the directory where the XML files are stored, and the second is the name of the file you want to create. If you don't provide a path, the file will be saved in the same directory you execute the script from. The parameters should be separated with a space.

Once you launch the shell navigate to the directory you have saved the script in.

Here is the command:

./sec2csv.ps1 "V:\POWERSHELL\BLOG\HP-TOYS\resource\Security\Access Permissions\Groups\" security.csv

Place quotes around the directory if you have spaces in it.

It will look like this in the shell.

Once the script runs it will generate your export.

It has now combined all of your group files into one csv file that will look like this.

The information in this file will be very similar to what was required by the import and export utility in on-prem Planning. The definition of these fields can be found here. The only one that is really different is the IsUser field; for group security this will always be N.
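If you are curious what the conversion boils down to, here is a minimal sketch of the idea (not the full script); the node names I read are placeholders, since the real tag names come from whatever is in your LCM extract:

# Sketch of the xml-to-csv idea -- node names below are placeholders
param(
    [Parameter(Mandatory=$true)][string]$XmlDir,   # folder holding the per-group xml files
    [Parameter(Mandatory=$true)][string]$OutFile   # csv file to create
)

Get-ChildItem -Path $XmlDir -Filter '*.xml' | ForEach-Object {
    [xml]$doc = Get-Content $_.FullName
    # flatten each access record in the file into one csv row
    foreach ($perm in $doc.SelectNodes('//accessControl')) {
        [pscustomobject]@{
            GroupName  = $perm.groupName    # placeholder tag names --
            ObjectName = $perm.objectName   # copy the real ones from your extract
            Access     = $perm.access
            IsUser     = 'N'                # always N for group security
        }
    }
} | Export-Csv -Path $OutFile -NoTypeInformation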

CONCLUSION

I hope you find this first part of the conversion helpful. I am testing the CSV-to-XML script now; it has been working great and I should have it posted soon. I hope this helps you manage security in the cloud!!

EPBCS – Improved Smart List Lookup in Calculations

Over the last month I had to customize almost every feature in the EPBCS Workforce Planning application for a client. I have much to write about that experience, but this one is my favorite. Do you want your Smart Lists to be dynamic, but don't want to write long if/then statements for every possible value in the lists? Well, read on…..

When I first saw the feature to build a Smart List from metadata, I started to get excited about the potential to use it in a calculation. After going through the Workforce app in detail, I have seen the light. If you are not familiar with that module yet, it basically has three plan types: the first two BSO cubes are for the plan, and another one is for the rates. When you create a TBH (to-be-hired employee), it performs a lookup in the rate cube to get default values. Here is a simplified example of how the functionality works so you can use it in your applications as well.

I built a two-database application: one called FINPLAN and the other called RATES.

Here is the dimensionality for the two plan types.

FINPLAN

RATES


The idea is that a user can enter information in the FINPLAN cube and the system will do a lookup in the rate cube based on the region they have selected in the Smart List. Here is a quick preview of the form.

 

Creating the Smart List

Once the dimensionality is set up correctly, you need to create a Smart List that is based on your application metadata. I created a Smart List named REGIONS and configured it to point to the level 0 members of the Regions dimension.

This creates a dynamic list based on the application metadata. Notice the ID for each member of the list on the Entries tab. If you know the back end of Planning, this is the OBJECT_ID from the HSP_OBJECT table; Planning is going to use this identifier in your calc scripts to identify the member in Essbase.

After the list is created, I assign it to the account Opportunity_Region in the FINPLAN cube.

RATES

I then built a web form against the RATES database to store the rates at the No Entity member and at the Regions.

I am just storing a monthly rate by region in this cube.

IN ACTION

Here is a quick view of it working:

CALCULATION

This is where the magic happens. Basically we want to pull a value from the rate cube using the value assigned to the Smart List in the planning cube.

Here is the calculation:

Here is the not-as-pretty text version:

FIX (FY18, Plan, Working, USD, "No Entity", @RELATIVE("OPP", 0))

/* Define a variable to hold the Smart List value */
VAR SLVal;

/* Opportunity overhead cost, looked up in the rate cube */
"OVERHEAD_COST" (
    /* Assign the Smart List value */
    SLVal = "OPP_REGION"->"BegBalance";

    IF (NOT (@ISMBR("BegBalance"))) /* Only run on months */
        IF ((SLVal <> #MISSING) AND ("VOLUME"->"BegBalance" <> #MISSING)) /* Check for a Smart List value and a volume */
            @XREF(_RATES_, "OVERHEAD_RATE", "No Entity", @CONCATENATE("HSP_ID_", @NAME(@HspNumToString(SLVal)))); /* Apply the rate from the rate cube */
        ENDIF
    ENDIF
)

ENDFIX

THE MAGIC

If you look closer at the @XREF formula, you will notice @CONCATENATE("HSP_ID_", @Name(@hspnumtostring(SLVal)))

Let's break this down.

  • SLVal was assigned from the Smart List value, so it holds the OBJECT_ID of the member in PBCS.
  • The CDF @hspnumtostring converts the numeric ID to a string.
  • The @Name function makes sure it really is a string (probably redundant).
  • Then it is concatenated with "HSP_ID_".

Using the NY member as a value, it would create a member name that looks like this:

HSP_ID_50097

Behind the scenes, Planning can then use that definition to translate it into a member that Essbase can read, presumably via an alias table in Essbase.

Pretty cool!!

NOTES

The @XREF formula takes text values, but if you need to convert the result into an actual Essbase member, wrap the formula in @MEMBER like this:

@MEMBER(@CONCATENATE("HSP_ID_", @Name(@hspnumtostring(SLVal))))

 

You will only be able to use this in calculations and not in FIX statements, since you need the data value of the Smart List to drive the formula. I tried to get it to work in a member formula and was not successful, but I only spent a few minutes on it.

CLOSING

This is a pretty nice enhancement, and a key one to understand if you need to troubleshoot or enhance Workforce in EPBCS. Plus, you can utilize it in your own applications to make your scripts and applications more dynamic and focused.

Here is the LCM if you need it.