
02 February 2016

The Compleat Idiot's Guide to PBCS, No. 4 - Philip Hulsebosch and Smart Push

A note

I have again managed to convince Philip to write a guest post.  Actually, he suggested it but I like to think that I am Svengali or at least a salesman.  Nope, this was all his idea.  It’s really great that our EPM community can come together and share in this way.

And with that, and a wee conclusion at the end, take it away Philip!

Smart Push, this blog, and you

Cameron, some other consultants, and I were given the opportunity to work with a Planning and Budgeting Cloud Service (PBCS) pod (Version 15.11.65). We could see, touch, and feel the latest version of Planning, including all the new functionality already included there. We were able to explore and experiment with the specifics of the Cloud Service and found that there are some major differences between working with the on-premises and the cloud versions. As an example, we (sort of, or have at least resigned ourselves to it) love EAS, because it gives us a feeling of control. During our exploratory work, I sometimes missed it, but seriously, I did not need it. Now I can imagine a life without EAS.  PBCS has much to offer and a different, and often better, way of handling common Planning requirements.

As Cameron noted, this series of PBCS blog posts is all about sharing our knowledge with the wider community as it marks a new era in the category of planning tools.

Smart Push Functionality

Smart Push is a method to push data into different Plan Types with a process triggered from a data form.  You can push data including comments, attachments, and supporting detail from Block Storage Option (BSO) Plan Types into other BSO or Aggregate Storage Option (ASO) Plan Types. It is different from EPMA Data Transfer, partitioning, or @XWRITE/@XREF. It comes close to the existing on-premises data copy into the reporting Plan Type through scheduled transfers, but I think it is easier to set up and more integrated into Planning.

Why is this so exciting?

Some users want to see aggregated results of the data they saved just a few seconds ago. Yes! They also want all their KPIs and variances calculated as well. Yes! And you want to deliver, because you are so nice to them!

First, you tried aggregations. Then you needed to optimize these aggregations to the very last. Then you added a reporting Plan Type to your Planning application and built scheduled transfers running every couple of minutes and eating much of your server resources. Ugh, ugh, and ugh.  No, do not give up. Smart Push comes to the rescue.

This post will cover how you can implement this functionality and show you how it really works using the PBCS version of the sample Vision application which has a reporting Plan Type.

How to do this?

First, you need at least two Plan Types in your Planning application to connect. A reporting Plan Type will do very well as a target, although it could be another kind of Plan Type if required.

The next step will be to map the source to the target. Here we need to take care of the dimensions, the point of view, and the data options. Conceptually, this is familiar ground.

We will then define this connection in a data form and configure that form to run the data transfer on the save data process.

Lastly, I will show an example where Smart Push is tested with data.

Whoops, really lastly, I will show in the Simplified Interface where to check for errors.

Map Reporting Application

The first step is to map the reporting application. This target can be any BSO or ASO Plan Type in the Planning Application. Select the menu “Administration” and then the option “Map Reporting Application”.  

Figure 1: Administration menu to select the option “Map Reporting Application”.

The window “Map Reporting Application” opens. Here you can see existing mappings, and with the green plus sign new mappings can be created.

Figure 2: Overview of the mapped applications.

It is best practice to use a name which describes the mapping of the source and target Plan Types. Here I have used the name Plan1-VisASO. Accordingly, I take “Plan1” as the source and “VisASO” as the target Plan Type. Please note that multiple connections can be made between Plan Types.

Figure 3: Selection of source and target Plan Type.

The second tab is called “Map Dimensions”. Here, the dimensions of the source application are mapped to those of the target application. There are three options for this:
  • Not Linked
  • Dimension to Dimension
  • Smart List to Dimension

Figure 4: Mapping of the dimensions.

In the member selection column, a selection can be made of the members of the individual dimensions. It is therefore important to know which members exist in both Plan Types; otherwise the data cannot be captured on the receiving side.

There is no dimension “HSP_View” in the reporting application “AVision”, and there is no dimension “AltYear” in the Plan1 cube of the Planning application. Therefore, no member has been selected in figure 4 for the dimension “AltYear”.

Note: At least one dense dimension and the “Account” or “Period” dimension must be mapped. This is also visible at the upper section of figure 4.

When opening the Dimensions section in the Administration menu, one can select the receiving Plan Type, which in our example is “VisASO”, and see the reduced member set of the Account dimension as shown in figure 5.

Figure 5: Account dimension of Plan Type “VisASO”.

In the member properties, the Source Plan shows in which Plan Type the data is stored. The checkmarks further down in the Plan Type section show in which Plan Types this member exists. In this example it is in “Plan1” and “VisASO”. Unchecking a Plan Type will remove this member and all its descendants from that Plan Type, so take caution here!

Figure 6: Member Properties and Plan Type selection.

Because we have a mismatch in dimensions between the Planning application and the reporting application, we need to address this in the “Point of View” tab. Here we choose the member which holds the data and which will serve as a placeholder for the missing dimension. In figure 7 you see the member “Base Data” selected for “HSP_View” and the member “FY09” for the dimension “AltYear”.

Figure 7: Mapping of the POV.
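The POV mapping above can be sketched conceptually. This is only an illustration of the idea, not the PBCS engine; the dimension and member names are taken from the figures, and the helper function is hypothetical:

```python
# Conceptual sketch (not the PBCS API): how a mapping resolves a target cell
# when the source and target cubes have mismatched dimensions.

# Dimension-to-dimension mappings between Plan1 (source) and VisASO (target).
DIM_MAP = {"Account": "Account", "Period": "Period", "Year": "Year"}

# Fixed POV members for dimensions that exist on only one side, as in figure 7:
# "HSP_View" exists only in the source, "AltYear" only in the target.
SOURCE_ONLY_POV = {"HSP_View": "Base Data"}   # data is read from this member
TARGET_ONLY_POV = {"AltYear": "FY09"}         # data lands on this member

def target_cell(source_cell: dict) -> dict:
    """Map a source cell's members to the target cube's dimensionality."""
    # Only cells sitting on the source POV member are picked up at all.
    for dim, member in SOURCE_ONLY_POV.items():
        if source_cell.get(dim) != member:
            return None
    # Mapped dimensions carry their member across unchanged.
    cell = {tgt: source_cell[src] for src, tgt in DIM_MAP.items()}
    # Dimensions missing from the source get their fixed placeholder member.
    cell.update(TARGET_ONLY_POV)
    return cell

src = {"Account": "Sales", "Period": "Jan", "Year": "FY16", "HSP_View": "Base Data"}
print(target_cell(src))
# {'Account': 'Sales', 'Period': 'Jan', 'Year': 'FY16', 'AltYear': 'FY09'}
```

The point of the sketch: every target dimension must get a member from somewhere, either by dimension mapping or by a fixed POV selection, which is exactly why the unmapped “AltYear” dimension needs “FY09” pinned in the POV tab.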

At the last tab of the Map Reporting Application you can see the “Data Options”.

Figure 8: Options about pushing comments, attachments and supporting detail or not.

Note the interesting option “Append multiple cells into one cell”.  This is new functionality that will merge the comments and attachments in the relational data source, something that is otherwise impossible to do out of the box.

Now all settings are done and we can save the mapping. As you can see, multiple mappings are possible in this window (figure 9). In the next section I will continue with adding the mapping to a data form and enabling Smart Push. This is the really new part.

Figure 9: Saved Application Mapping.

Activate Smart Push in Form Management

The next step is to enable Smart Push in Form Management. As you can see in figure 10, there is a new tab called “Smart Push”. Here you can add, change and remove Smart Push definitions of this data form.

Figure 10: Enable Smart Push at the data form.

After clicking the add connection button, a window with all mappings will be shown for selection. Here, I select the mapping created in the previous steps.

Figure 11: Selection of the Application Mapping.

It is very important to select the option “Run on Save”. This rather tiny option is on the top right side of the window.

Usually, you will select the option “Use Form Context” for all dimensions, but there is an option to override the selection of the data form. An example is shown in figure 12. In other words, technically, you can copy data from Scenario A on the data form into Scenario B in the reporting application. It is also possible to add a few members to the selection. Mmm, this will be interesting…
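The context-versus-overwrite behavior can be sketched in a few lines. The member names here are hypothetical, and this is only an illustration of the idea, not product code:

```python
# A minimal sketch, with made-up member names: "Use Form Context" takes each
# dimension's member from the form's POV, while an overwrite selection (as in
# figure 12) pins the target member regardless of what the form shows.

def push_pov(context: dict, overwrites: dict) -> dict:
    """Resolve the POV a push targets: form context plus any overwrites."""
    return {**context, **overwrites}

form_context = {"Scenario": "Plan", "Version": "Working", "Entity": "410"}

# No overwrites: the push targets exactly what the form shows.
print(push_pov(form_context, {}))
# With an overwrite: data entered against "Plan" lands on "Forecast" instead.
print(push_pov(form_context, {"Scenario": "Forecast"}))
```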

It is also possible to add more than one mapping to one data form. Pushing the same data into different locations….

Figure 12: Setting about Form Content and example of an Overwrite Selection.

This is really smart, isn’t it?

I saved the form and now move over to test it.

Testing of Smart Push

I do some data entry on the data form, then push the Save button.

Figure 13: Data on the form before save and kicking off Smart Push.

The Smart Push actions always take place after the save and calculate process.

Figure 14: Information box after save data. Smart Push was successful.

Figure 15 shows the situation in the reporting application before pressing the Save button on the data form. We see there is no data in the reporting application on the same POV as on the Plan1 data form. We have selected the member “FY09” in the dimension “AltYear”. This is where we defined the data target to be in the mapping.

Figure 15: Reporting Application (VisASO) before saving data in the “Plan1” application.

Figure 16 shows the situation after pushing the data into the reporting application. At level 0 the data look fine. We see summarized data in the member “Sales”, but the amount on “Net Income” -> “Sales” is incorrect!

Figure 16: Reporting Application (VisASO) – the data were pushed over and aggregated into the nodes.

What causes this delta? In the reporting application, the data for “Gross Profit” is too low, and there is no data at all for “Total Cost of Sales and Service”. It is missing and did not get transferred. When we have a look at the data form used for the Smart Push, we see these members are not included in the selection and not present on the data form.

To correct the situation, I added the “Cost of Sales” member to the data form and repeated the data entry and save. This ran the Smart Push again.

Figure 17: Data at the modified data form, now including the Member “Cost of Sales”, save and Smart Push.

When reviewing the data in figure 18, I see the correct values for “Net Income” -> “Sales”. Smart Push looks at the data form to determine which data to transfer.

Figure 18: Reporting Application (VisASO) – now the data of “Cost of Sales” were pushed as well and aggregated.

Smart Push always clears the target data area before loading the data. In other words, #missing values are transferred as well.
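The clear-before-load behavior explains the “Cost of Sales” problem above. A minimal sketch, assuming invented member names and a dictionary standing in for the cube (this is the concept, not the Essbase engine):

```python
# Conceptual sketch: Smart Push clears the mapped region in the target and
# then loads only the cells present on the form, so a member missing from
# the form ends up #missing in the target.

MISSING = None  # stands in for Essbase #missing

def smart_push(target: dict, region: list, form_cells: dict) -> dict:
    """Clear every cell in the mapped region, then load the form's cells."""
    for member in region:
        target[member] = MISSING          # clear-before-load
    target.update(form_cells)             # only form data comes across
    return target

# The target already holds data for both members.
reporting = {"Sales": 100.0, "Cost of Sales": 40.0}

# The mapped region covers both members, but only Sales is on the form.
pushed = smart_push(reporting, region=["Sales", "Cost of Sales"],
                    form_cells={"Sales": 120.0})

print(pushed)  # {'Sales': 120.0, 'Cost of Sales': None}
# "Cost of Sales" is now #missing, so anything aggregating it comes out
# wrong: the symptom seen in figure 16 until the member is added to the form.
```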

There is a log of Smart Push processes. The application manager should review this for problem areas. The interesting part is that you can see this log only in the Simplified Interface. To get there, select “Console” > “Jobs” and set the filter.

Figure 19: Adjust the jobs filter for Smart Push.

Figure 20: Then select the status.

Figure 21: Example of Smart Push Job in the overview.

Philip’s final comments

Smart Push is already widely adopted within PBCS Planning applications and will be in on-premises Planning whenever that comes out, because it is so powerful and can be applied with a lot of creativity. However, it is very important that all data is copied. Nothing kills the reputation of an application faster than wrong data.

A data transfer process which synchronizes all data will likely still need to be established alongside Smart Push. Smart Push is for the end users who need to see their changes without waiting for some admin process to finish.

Compared to partitioning, @XWRITE and other options of data transfer, Smart Push is rather simple to configure and maintain. I am sure you will love it too!

Regards, Philip Hulsebosch.

Cameron’s final comments

Philip, again thank you so much for putting all of this hard work into this post.  We all owe you a debt of gratitude for showing us on-premises luddites how Smart Push works.  I (we) can hardly wait till it and the many other cool features of PBCS make it to on-premises.

Be seeing you.

28 January 2016

Working in EPM? Live in South Florida? Not going to the South Florida EPM Meetup? Why?

Why indeed

Just what is a meetup?  Given EPMers’ technological bent, it is surprising to me that we don’t readily cotton on to the concept of a meetup.  That’s a pity because they are a great way to (danger ahead:  a geek who thinks he’s witty) meet up with like-minded individuals.  Think of them as a social media tool involving living, breathing meatware, using their collective wetware, all occurring in real life.  Isn’t slang wonderful?

What does that all mean in Plain English?  

Have you been to an ODTUG-nurtured EPM meetup?  Hardworking (given her ODTUG volunteer workload, it’s more like insanely hardworking) Janice D'Aloia heads that initiative within the ODTUG EPM community.  A note about these meetups:  ODTUG encourages them through funding and support, but at the end of the day a meetup is owned by its members, not ODTUG, so please don’t think attendees are taking any orders from what-is-likely-the-best-Oracle-user-group-ever.  It’s all part of ODTUG’s service to its community members and yes, it is pretty noble sounding and it just plain is.

If you are interested in getting help starting up a meetup in your area, please contact ODTUG at erin@odtug.com to start the meetup ball rolling.  ← That’s an idiom, not slang, but aren’t idioms just as wonderful as slang?  Discuss.

South Florida EPM Meetup

And that brings us to a specific meetup, namely the upcoming South Florida EPM meetup.  It’s occurring on Thursday, 18 February 2016, at Dave & Buster’s Hollywood, Florida location from 3:30 to 6:30 pm (or later if you’re having that much fun).

What’s on offer?  The very things that make meetups so much fun:  education in the form of a Special ODTUG Surprise (and no, I do not qualify as a surprise, or at least not a pleasant one; there will be a projector, and a laptop, and a demo), a super geeky-cool game, and the opportunity to meet your fellow Floridian EPM practitioners.  What’s not to like?

It’s easy-peasy to register – you can do this on ODTUG’s EPM meetup page right here.  Meetup.com provides a lovely confirmation screen once you’ve registered.

And Bob’s your uncle, you’re set to attend.  You are going to, right?  You should if you’re not.

And to whom do we owe the pleasure?

As I noted, ODTUG is an enabler but meetups are intrinsically grassroots.  They are founded and led and staffed and attended by you, the EPM geek.  The organizers are just like us – people who live, breathe, and eat EPM.

In the case of the South Florida meetup it’s Jessica Cordova of ARC EPM and Kris Calabro of Tyco International.  Jessica and Kris make meetings like this possible and we’re all in their debt.  Meetup organizers, whether they be hiking enthusiasts, British sports car owners (make mine a Sunbeam Tiger with Minilites), or yes, even EPM practitioners do it because they love whatever the passion is.  Benefit from their enthusiasm if you’re in the South Florida area on 18 February, 2016.  Join us, won’t you?

Be seeing you.

21 January 2016

Stupid Programming Tricks No. 28 -- LCM, 7z, and Planning migrations

How long can Yr. Obt. Svt. be wrong?

The answer to that question is apparently indefinitely.  And the task so trivial.  Sigh.

The problem

This post was supposed to be one in my Compleat Idiot’s Guide to PBCS series, and I will use some screenshots from a future post on on-premises-to-PBCS-and-back migration, but I got hung up on making this work.

And then I realized I made the same mistake at a client.  Remember what consultants are supposed to do:  help customers.  I did, sort of, but I made my task much harder.  Sorry.

Let’s walk through this using PBCS although as you will see the issue is exactly the same in on-premises.  Sigh.

The Brotherhood of Man

I have to give credit to my younger, taller, smarter brother from a completely different set of parents, Celvin Kattookaran.  In my hour of need (I have many, too many) he came through for me and didn’t even make all that much fun of me when he explained the answer.

Breaking LCM

I want to migrate an on-premises application to  PBCS.  I have the old Planning sample application from (I think)  I have it working in both as well as  It’s a simple application, really simple, and I find simple hard enough as will be shown.

Please leave the premises

Here I am on my on-premises install of  I click on Application Management and…

Here I am in good old Shared Services:

I then export the application file objects:

I’ve moved it to another VM, my Windows 7 one (I try to keep my VMs local only):

Not a cloud in the sky

Here I am in PBCS’ Application Management.  It’s just about the same as Shared Services:

Upload the LCM zip file:

And, as expected, here are the artifacts in PBCS.  As Danny Kaye says, everything is tickety-boo.

So what’s the problem?

If I unzip the application download and then make a change, any change, or even no change at all, and then rezip the file I get this on reimport.

Here I am in 7-Zip, a really awesome open source WinZip clone.  FWIW, I actually have WinZip (I even paid for it and don’t like violating licenses by installing it on more than one machine) on my bare-metal laptop, but 7-Zip is free and has almost all of the functionality; in some ways it’s quite a bit more advanced.

Here’s the folder unzipped.  If I were migrating this to an on-premises install I could copy this entire folder without compression to the import_export Shared Services folder.

Now I’m going to zip this to SampApp1a.zip.  Note that not a blessed thing has changed in the contents.  Also note that the compression engine, be it 7-Zip or Windows’ own compression functionality, makes no difference.
The settings below are the defaults:

Zip-a-Dee-Doo-Dah indeed

Uh-oh.  There’s a difference in size of the compressed archive.  How can that be?  Nothing and I mean nothing has been changed.  Oh well, there couldn’t possibly be anything to worry about yr. obt. svt. blithely thinks.
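The size difference by itself is harmless, and a short sketch shows why: zip entries carry metadata such as timestamps, so two archives of byte-identical content need not be byte-identical themselves. The file name inside the archive is illustrative, not the real LCM listing:

```python
# Why re-zipping identical content can yield a different archive: entry
# metadata (here, the timestamp) is stored in the zip headers, so the raw
# archive bytes differ even though the archived contents are the same.
import io
import zipfile

def make_zip(date_time):
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        info = zipfile.ZipInfo("SampApp1/info/listing.xml", date_time=date_time)
        zf.writestr(info, "<listing/>")  # identical content in both archives
    return buf.getvalue()

a = make_zip((2016, 1, 1, 0, 0, 0))
b = make_zip((2016, 1, 21, 12, 0, 0))   # only the timestamp differs

print(a == b)            # False: the raw archives differ...
with zipfile.ZipFile(io.BytesIO(a)) as za, zipfile.ZipFile(io.BytesIO(b)) as zb:
    print(za.read("SampApp1/info/listing.xml") ==
          zb.read("SampApp1/info/listing.xml"))  # ...True: contents match
```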

The upload goes swimmingly, I think.

Oh how wrong I am despite this lovely message:

And what happens when I try to open the file?

¡Ay, caramba!

Is the service really not available?  That’s silly as it’s clearly there and thus we see Yet Another Confusing Error Message.  Perhaps Oracle never anticipated someone doing something as boneheaded as is described below?  Probably.

Here’s the problem.  I zipped an unzipped folder and that was the issue.  Wot?  It’s the same, right?  Nope.

Here’s the zip file from Planning.

And here’s the unzipped-to-zipped archive.  Do you see what I did?  I zipped the SampApp1 folder within the archive.  What?

It actually makes sense – I zipped c:\users\cameronl\downloads\SampleApp1 to c:\users\cameronl\downloads.  The SampApp1 folder is part of the overall path and thus it gets included in the zip file.  Ultimate Fail.  Although to be fair, that isn’t actually an intuitive result.  Regardless, I should have looked into the zip archive itself, but alas did not.

Nothing’s Impossible

The solution is to go into the SampApp1 folder and zip from there.

And here we go, just as LCM defines the file structure.
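The two archive layouts can be reproduced with a short script. This is a sketch using a tiny stand-in folder rather than a real LCM download; the entry paths show exactly what goes wrong when the parent folder is included:

```python
# Zipping the downloaded folder from its parent directory puts "SampApp1/"
# in front of every entry path, which is not the structure LCM wrote (and
# expects back). Zipping from inside the folder keeps the paths LCM wants.
import os
import tempfile
import zipfile

def zip_tree(zip_path, root, include_root):
    """Zip everything under root; optionally keep root's name in the paths."""
    base = os.path.dirname(os.path.abspath(root)) if include_root else root
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                full = os.path.join(dirpath, name)
                zf.write(full, os.path.relpath(full, base))

# Build a tiny stand-in for the unzipped LCM download.
tmp = tempfile.mkdtemp()
root = os.path.join(tmp, "SampApp1")
os.makedirs(os.path.join(root, "info"))
with open(os.path.join(root, "info", "listing.xml"), "w") as f:
    f.write("<listing/>")

wrong = os.path.join(tmp, "wrong.zip")   # zipped from the parent folder
right = os.path.join(tmp, "right.zip")   # zipped from inside SampApp1
zip_tree(wrong, root, include_root=True)
zip_tree(right, root, include_root=False)

print(zipfile.ZipFile(wrong).namelist())  # ['SampApp1/info/listing.xml']
print(zipfile.ZipFile(right).namelist())  # ['info/listing.xml']
```

The second archive matches what LCM produced in the first place, which is why that is the one the import accepts.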

Success! Boil in bag!

Upload it and all is well.  Despite the changed (or in this case not changed) zip file.

Ballin’ the Jack

And that’s it.  So trivial and so painful.

Oh yes, my poor unfortunate client.  Ugh.  I was tasked with splitting up a Planning application.  I downloaded the LCM xml files, did my modifications, zipped them back up and…failure.  Bugger.

As this was on-premises Planning, after a moderate amount of pain I was able to get access to the import_export folders and move the modified LCM files over.  I understate the case:  getting that access was really painful.  If only I had begged on my knees, er, asked for Celvin’s help back then.

May my errors not be yours.  

Great American Songbook

It’s difficult to tell if any of you ever click through on the hyperlinks I sprinkle throughout these posts.  Assuming that you do (or maybe assuming that you don’t), I thought I would give you a listing of the music (and one TV show and a few movies) so you can have some idea of my strange cultural tastes.  As I like to remind Natalie Delemar –  @EssbaseLady – popular culture pretty much doesn’t interest me past 1965.  It shows.
In order:
  1. How to Succeed in Business Without Really Trying, The Brotherhood of Man, Robert Morse (The 1967 version is the OBC definitive version or as close as we can get save a time machine to get back to 1961.)
  2. The Honeymooners, Please Leave the Premises, Jackie Gleason aka The Great One, et al.
  3. Merry Andrew, Everything is Tickety-Boo, Danny Kaye
  4. Song of the South, Zip-a-Dee-Doo-Dah, James Baskett
  5. Swing Time, Pick Yourself Up (medley), Andy Williams and Jack Jones
  6. That’s My Boy, Ballin’ The Jack, Dean Martin, Polly Bergen, and a guy who reminds me of me only with a lot more talent

I like to think that a hundred years from today The Great American Songbook will be what our descendants will view as the musical acme of the 20th century.  Jazz is America’s Classical Music, or at least it is on this blog.

Be seeing you.