26 May 2016

Kscope16 Apex sessions I'm interested in

Apex?  Whhaaaaaat?

Although it may be somewhat hard to believe, in fact yr. obt. svt. has technology interests outside of EPM, many of which – like what would it take to really build that fusion reactor in my back shed – are likely best left unexplored.

What is best explored is what else Kscope16 has on offer.  As most everyone knows, my quite-a-bit-smarter-not-at-all-genetically-or-familially-related-but-somehow-confused-for-me brother, Martin D’Souza, is Mr. Apex.  Although explaining Apex to me required Martin to exercise patience and lots of really small words, I now have an appreciation for Application Express aka Apex.  For those of you as clueless as I (so, exactly no one, but bear with me regardless), it took a while for the penny to drop:  Apex’s concept of a relational back end and programmatically generated web code is what builds practically all of the web sites you visit except maybe ones like this.

Good grief I – and I’ll bet you – spend a lot of time on the web:  message boards, technical documentation, user group portals, really cool upcoming conferences, this idiotic web site.  Given that just about all of the EPM products are web based, and are becoming even more so over time, and are in fact conceptually Apex-like, it occurred to me that I ought to actually try to understand what’s going on under the hood.  This is a blog for hackers, right?  And what better conference than Kscope16 for understanding that?

And with that…

Paranoia

Data breaches are everywhere on the web.  Everywhere.  It’s nice to know that Kscope’s Apex presenters get it and are covering defensive measures.  I suppose they’re exercising their inner paranoia, which nicely matches with mine.

Security from Tim Austin

APEX Security: Discussing Real-World Security Risks, Jun 28, 2016, Session 10, 2:00 pm - 3:00 pm
APEX Security: Anatomy of a Cross-Site Scripting Attack, Jun 29, 2016, Session 18, 4:30 pm - 5:30 pm

I think, I hope, I pray that attending Tim’s sessions on real world security makes the paranoia go away.  I hope.

Internet of Things

Will EPM ever be part of IoT?  I am not holding my breath.  For those of us who are not terminally dull, Anton Nielsen is running a lab on just that (IoT, not being terminally boring as I am the world domain lead for that) via his Hands-On Training: Build Something! IOT = Internet + Oracle + Things lab on Jun 27, 2016, Traditional HOT 3, 2:00 pm - 4:15 pm.

My code never sucks.  Never.  Never. Ever.  

How does that go?  Admit nothing, deny everything, make counter-accusations, never change your story.  When things go tits up, obviously someone else buggered up the code.  And Peter Raganitsch agrees.  Ah, I just read his summary more closely so that isn’t strictly true:  he’ll be talking about how to find the bugs, not shift the blame.  Where’s the fun in that, as blame shifting is my modus operandi?  Given that I am kidding, it behooves me to attend his It Wasn't Me: Finding Bugs in APEX Applications session on Jun 28, 2016, Session 12, 4:45 pm - 5:45 pm because sooner or later someone else is going to figure out my code stinks.

Why is Apex doing so well?  

Apex is cool.  EPM?  You decide.  Perhaps we can crib a page from Juergen Schuster’s Why? session Jun 27, 2016, Session 6, 4:30 pm - 5:30 pm.  I do think in the end that it boils down to Apex geeks being far cooler than we EPMers.  Seriously, we need to figure out why they have meetups, why their exclusive conferences are so awesome, and why they are kicking EPM’s conference butt.  

Is that enough?

It ought to be.

There’s some really good content in the Apex track even if we never ever ever plan on writing a line of code in that framework.  Security, IoT, code quality – these apply to us in equal measure.

I hope to be at each and every one of these sessions and I hope you will be too.

Postscript

Blog Hop

Thanks for attending this ODTUG blog hop! Looking for some other juicy cross-track sessions to make your Kscope16 experience more educational? Check out the following session recommendations from fellow experts!


Things that go FOOM!

As a child of the Cold War, anything that involves acronyms like AEC or NRC or IAEA makes me positively squirm in my seat no matter how intriguing the thought may be.  Even solar radiation is something I avoid.  Playing with that stuff without fully understanding it (or wearing a dosimeter)?  Madness.  Apex?  Coolness.

Be seeing you.

18 May 2016

The Compleat Idiot's Guide to PBCS, No. 15 -- batching it up in PBCS and on-premises

Finally

Yes, finally.  You, oh Gentle Reader, and I, yr. obt. svt., are at that stage of finally tying together the different threads needed to perform the ostensibly simple monthly load of actuals into a Planning application.  As I’ve covered functionality I’ve made an effort to illustrate how to perform a given task in both PBCS and on-premises.  Arguably, that approach made the posts longer than perhaps needed, but I, and I suspect you if you’re following along in this series, come to PBCS from an on-premises perspective and I wanted to highlight the differences.  No matter, it has been quite a bit of fun as well as hopefully educational.

And there is quite a bit that is different:  Jobs, Inbox/Outbox, Application Management, and most especially the epmautomate utility as will be covered in this post.  If you need a refresher or simply have far too much free time on your hands, I encourage you to peruse the below posts:

There’s quite a bit of content there, or at least an awful lot of screenshots.  There are 200+ screenshots in just my contributions to this particular Compleat Idiot theme of monthly actuals.  That’s not counting my other posts as well as skipping over the contributions of Philip Hulsebosch and Chris Rothermel.  Good grief that’s a lot of PBCS content.  Some of it may even be correct, most likely the stuff Philip and Chris wrote.

It seems so simple

It is.  Let’s recap what needs to happen:
  1. Current application state is backed up to the local drive
  2. New metadata is loaded and the Planning application is refreshed
  3. The current month’s actual data is cleared
  4. Current month data is loaded
  5. The current month’s data is aggregated

And yet it is so different

The actions and concepts are exactly the same in on-premises and PBCS.  It’s those damn details that make it interesting.

When it comes to automation, the real difference is the number and type of utilities that are needed to perform identical actions in both platforms.  

Step the first – backup current application state

I showed in Compleat Idiot No. 13 how to use on-premises LCM and PBCS’ Application Management to manually back up applications, and while the UI approach is largely the same, the automation tasks are not.

On-premises

As with PBCS, one must first create an on-premises LCM export definition via the Shared Services console.  It’s a fair number of clicks but in the end it all makes sense:

LCM will write out the contents of the exported artefacts in one shot.  

If you have access to the Shared Services’ import_export folder, the exported objects are there for the taking:

Failing that, download the File System object from Shared Services and then look inside the zip file:

Whatever the approach, a file called Export.xml is created – this is the file definition of the export process.

Export.xml describes tasks which in turn define what is to be exported when the LCM utility is used:

The quintessence of eponymous

It isn’t often that EPM makes me laugh but this tickles my sense of humor:  what’s the logical name for LCM’s command line utility?  Why, “utility” of course.  Thus utility.bat exists and you can read all about it in the Lifecycle Management Guide.  

There are two things you must keep in mind, both of them fairly maddening.
Utility.bat (I am actually falling out of love with that name:  if everything else I use at the command line in EPM-land is a utility, and those have names like outlineload.cmd and refreshcube.cmd that at least hint at what they do, then a utility named “utility” doesn’t have a lot of utility.  Groan.  Really, the name is confusing.) must be run on the Shared Services server.  Actually, strike that – it has to be installed in some gawdawful way that is far beyond Infrastructure-Allergic Cameron’s abilities.  On my all-in-one VM it’s not a big deal but it might be in the real world.  Check with someone who has a clue, IOW not me.

On initial run, utility.bat will ask for a correctly provisioned Shared Services username and password, which it will use to encrypt those fields in Export.xml.  And then it…deletes the file with the just-input and just-encrypted password on execution.  Double, no triple, no quadruple bugger.  You can catch it, just, in an editor like Notepad++ or TextPad if the xml file is already open, by declining to reload the file when the editor notices the change on disk.  And then you make a copy of that file and thereafter copy that copy on top of the copy actually used.  OMG, it’s too awful to even type – just read Peter Nitschke aka @EssbaseDownUnder’s excellent post on it.  I thought I was losing what little mind I had left when this happened to me.  I think we can all agree that ain’t no feature.

Initial state:

Here’s the file somehow encrypted:

And here’s what you end up with right back at the beginning:

Maddening I tell you.  Maddening.

The answer is, as Peter noted, simple:  create a copy of the encrypted username/password file (btw, you can’t pass the username and password to utility.bat as parameters – do it the utility’s way or do it all by hand) and copy that copy on top before each run.  It doesn’t have to be sophisticated:
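A sketch of the idea – the file names and the C:\Automation folder are placeholders, not necessarily what you’ll use:

copy /Y C:\Automation\Export2.xml C:\Automation\Export.xml
REM Yes, really, copy the copy over the copy of the copy that the utility eats.  Thanks ever so, LCM.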

Did you enjoy the additional mini rant in the comments?  And how annoyed I was when I wrote it, i.e. the somewhat tortured grammar?  See, it is worth writing those things out, if only for the amusement factor.

Calling the code itself is then easy peasy no big deasy:
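Something along these lines, a sketch only, with the migration definition file path being whatever you used:

call Utility.bat C:\Automation\Export.xml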
This will write the export content to the Export.xml (remember, it’s really the just-copied-over Export2.xml but hey, who’s counting?) location:

PBCS

I am happy to report that PBCS’ approach is much, much, much faster both to code and to describe.  

Remember that, just like the on-premises backup, a manual backup must first be created to act as a template for LCMing (or App Managing) the application objects.

Step 1 – cause a complete backup of the Vision application:


Step 2 – download that backup to the local disk
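In epmautomate terms, both steps look something like the below.  Treat this as a sketch:  the URL, identity domain, credentials, and the MonthlyBackup snapshot name are all placeholders, and the snapshot must already exist as the template just mentioned.

REM Log in to the PBCS instance – every value here is a placeholder
set PBCSUSER=a.admin@example.com
set PBCSPASS=NotMyRealPassword
set PBCSURL=https://planning-example.pbcs.us2.oraclecloud.com
set PBCSDOMAIN=exampledomain
call epmautomate login %PBCSUSER% %PBCSPASS% %PBCSURL% %PBCSDOMAIN%
REM Step 1 – repeat the manually defined export to create a fresh snapshot
call epmautomate exportsnapshot MonthlyBackup
REM Step 2 – pull the snapshot down to the local drive as a zip
call epmautomate downloadfile MonthlyBackup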


That’s it.

And here you go in zipped format (on-premises creates uncompressed folders by tool although the ultimate content is the same):

Step the second – load metadata

On-premises

On-premises is easier this time, thankfully, as it couldn’t get a whole lot harder.  There are eleventy billion possible switches to outlineload.cmd, most of which I have thankfully never used.

Here’s the same file I used back in Compleat Idiot’s Guide No. 10:

Outlineload.cmd must be run from the Planning bin folder.  Good luck in getting IT to go along with that when it comes to automation.  It has always been one of my more painful conversations during implementations.

If you can get past that (and really, what other choice is there), running it is a doddle.  I’m using Windows environment variables to make my code a wee bit easier to read but otherwise it’s 100% stock:
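Here’s roughly what that looks like.  A sketch only:  the application name, import file, and dimension are placeholders (pretend we’re loading the Product dimension from NewProduct.csv), and password.txt is the encrypted password file.

set PLANAPP=MyPlanApp
set PLANUSER=admin
set AUTODIR=C:\Automation
REM Load the Product dimension from the flat file; the log and exception files land in %AUTODIR%
call outlineload.cmd -f:%AUTODIR%\password.txt /A:%PLANAPP% /U:%PLANUSER% /I:%AUTODIR%\NewProduct.csv /D:Product /L:%AUTODIR%\outlineload.log /X:%AUTODIR%\outlineload.exc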

You’ll note that there’s an encrypted password file which, unlike LCM, does not get deleted on run.  Huzzah!

There’s output, lots of output from this command.  Chatty is one way of describing it, although I suppose the proper Comp Sci term is “verbose”.

Here’s the output piped to a log file:

There’s also an error file which, whether it has any readable content or not, always exists and always has at least three bytes of content, thus making it imperative that any automation process not only look for that file but look inside it to see if something went badly or not.  At least it keeps us employed…

I could have used the /C switch in outlineload.cmd to force a refresh but I like to spell everything out, so I’ve used refreshcube.cmd.  It, like outlineload.cmd, is easy to invoke:
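Something along these lines does it – again a sketch, reusing the environment variables from above, and note that the utility actually ships in the Planning bin folder as CubeRefresh.cmd even though I keep calling it refreshcube.cmd:

REM Refresh (not create) the Essbase databases behind the Planning app
call CubeRefresh.cmd -f:%AUTODIR%\password.txt /A:%PLANAPP% /U:%PLANUSER% /R /D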

PBCS

Would you believe it isn’t quite as easy?  There are certainly more steps but in fact they’re quite a bit simpler:

It’s just four steps, sketched in epmautomate terms after the list:
  1. Make sure the file NewProduct.zip isn’t in the InBox
  2. Upload the file NewProduct.zip to the InBox
  3. Use the job ImportNewProduct to load NewProduct.zip.  Remember that NewProduct.zip is the same file that the outlineload.cmd utility uses.
  4. Refresh the database.
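The sketch – the ImportNewProduct job name comes from above, the file paths are placeholders, and I’m assuming the epmautomate login from the backup step is still active:

REM Step 1 – make sure NewProduct.zip isn’t already sitting in the InBox
call epmautomate deletefile NewProduct.zip
REM Step 2 – upload the file from the local drive
call epmautomate uploadfile C:\Automation\NewProduct.zip
REM Step 3 – run the ImportNewProduct Job against the just-uploaded file
call epmautomate importmetadata ImportNewProduct NewProduct.zip
REM Step 4 – refresh the database
call epmautomate refreshcube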

I’ve gone into the overloading of zip files in Jobs for metadata (and a bit later data) before.  I encourage you to read Compleat Idiot No. 10 for all of the gen on how to do that.

Once the file has been uploaded, the Job called “ImportNewProduct” takes care of loading it into the application.

Just a note about epmautomate.cmd’s logging – it’s errorlevel return codes or nothing:
Status Code   Description
0             Operation completed without errors.
1             Operation failed to execute because of an error.
2             Cancel pending.
3             Operation is terminated by the user.
4             Incorrect parameters.
5             Insufficient privileges.
6             Service is not available.
7             Invalid command.
8             Invalid parameter.
9             Invalid user name, password or identity domain.
10            Expired password.
11            Service is not available.

NB – The above is cribbed directly from the docs – I really do encourage you to read (and hey, steal, er, give full attribution to sources) and learn from them – as they have improved quite a bit from the on-premises version.

Step the third – clear out the current month Actual data

On-premises

We have yet another utility on offer:  calcmgrcmdlinelauncher.cmd.  Isn’t this fun?  It really is an example of how on-premises has evolved.  Remember that before 11.1.2.2 (11.1.2.1?) Hyperion Business Rules were written in EAS.  If you’ve really been around, Business Rules had their own horrific desktop tool.  Ah, the bad old days that are best forgotten.

In any case, there is a command line tool to launch business rules.  I’m not using rtp files or any of the other functionality.  See the docs here for more information.

NB – If you’ve ever wondered why I put in so many references to the documentation it is both because I am lazy (yes, I have been called that and haven’t quite decided if I’m flattered or insulted; I’m inclined towards the former) and because Oracle do a better job with the details than I ever could.

In any case, here are the parameters:  password, application, username, plan type, rule name.  

Unfortunately, there is no echoing of status.  Within the context of an extraordinarily simple batch script sans error checking I threw in an echo statement to at least tell me that it’s running.  Before you get excited about this lack of rigor, remember that this is a blog post, not an actual implementation.
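A sketch of that call, reusing the same environment variables as before; the ClearCurrentMonth rule name and the Plan1 plan type are placeholders:

echo Launching ClearCurrentMonth against %PLANAPP% – no news is good news
call calcmgrcmdlinelauncher.cmd -f:%AUTODIR%\password.txt /A:%PLANAPP% /U:%PLANUSER% /D:Plan1 /R:ClearCurrentMonth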

PBCS

It’s good old epmautomate.cmd to the fore:
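Assuming a rule with the same hypothetical name exists in PBCS, it’s a one-liner:

call epmautomate runbusinessrule ClearCurrentMonth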

Again, additional parameters are possible.  See the PBCS documentation for details on how to parameterize the command.

Step the fourth – load data

On premises

As I wrote in Compleat Idiot Nos. 9 and 11, Planning’s native file format is brain dead.  Yes, I can sort of see the point when loading text or even Smart Lists (see, Peter, I do sometimes listen), but for the purposes of this series and most data load use cases, it’s beyond useless.

And, if PBCS supports the Essbase data file format, and on-premises’ outlineload.cmd doesn’t, what’s a geek to do?  The answer is spelt M-a-x-L.

The code to call MaxL is simple:
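Something like this, assuming the MaxL shell (essmsh) is on the path:

REM Hand the data load off to MaxL
call essmsh C:\Automation\LoadForecastData.msh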

Here’s the MaxL code as called by the above batch script.  On the real VM the username and password sit right there in plain text for all the world to see.  Don’t do this at home.
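A sketch of LoadForecastData.msh – the credentials, server, application, plan type, and file names below are all placeholders, not the real ones:

/* LoadForecastData.msh – sketch only; every name below is a placeholder */
login admin identified by NotMyRealPassword on localhost;

/* No rules file – the data file is in Essbase's native format */
import database 'MyPlanApp'.'Plan1' data
    from data_file 'C:\\Automation\\ForecastData.txt'
    on error write to 'C:\\Automation\\LoadForecastData.err';

logout;
exit;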

Did you spot what’s missing?  There’s no load rule.  None.  In that way it’s the same as PBCS loading Essbase’s data format.

The data file fully describes the outline layout.
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_000"    "4110"    "Jul"    2001
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_100"    "4110"    "Jul"    6184
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_110"    "4110"    "Jul"    6807
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_120"    "4110"    "Jul"    6425
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_130"    "4110"    "Jul"    6778
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_140"    "4110"    "Jul"    5198
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_150"    "4110"    "Jul"    3129
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_160"    "4110"    "Jul"    3750
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_170"    "4110"    "Jul"    1500

If there is an error on load, a file is created.

PBCS

Are you getting tired of references to epmautomate.cmd or does it fill your heart with joy?  Hopefully the latter is true as it makes things so simple.

There’s no separate language such as MaxL.  There is the requirement to load the data file to the InBox after first making sure that the file isn’t already there.  If it is there, you’ll get a lovely error as there is no overwrite option.

Once that is done, as before it’s simple:
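As a sketch – the data file name and the LoadForecastData Job name are placeholders, with the Job assumed to already exist just as ImportNewProduct did for metadata:

REM Make sure the file isn’t already in the InBox – there is no overwrite option
call epmautomate deletefile ForecastData.zip
call epmautomate uploadfile C:\Automation\ForecastData.zip
REM Run the existing data import Job against the just-uploaded file
call epmautomate importdata LoadForecastData ForecastData.zip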

Step the fifth – aggregate the latest month

On-premises
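There’s nothing new here – it’s calcmgrcmdlinelauncher.cmd again, this time with a hypothetical aggregation rule and the same placeholder variables as before:

echo Launching AggCurrentMonth against %PLANAPP%
call calcmgrcmdlinelauncher.cmd -f:%AUTODIR%\password.txt /A:%PLANAPP% /U:%PLANUSER% /D:Plan1 /R:AggCurrentMonth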


PBCS
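And the epmautomate equivalent, same hypothetical rule name:

call epmautomate runbusinessrule AggCurrentMonth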

Step the sixth – exit

On-premises

There really isn’t anything to do with the Planning command line utilities as each one is self-contained.

I’m not going to repeat a screenshot but the MaxL script has a logout command.  There, that was fair, wasn’t it?

PBCS
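It’s exactly what you’d expect – one line of epmautomate, no parameters required:

call epmautomate logout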


Strictly speaking this isn’t really necessary but the OCD/conscientious programmer in me insists on this.

Finally

We have reached the point in the show where all of the code is brought together in one delicious goulash.

The pictures tell a thousand words and, as I’ve typed over two thousand thus far, let’s leave it at that.

Results

On-premises

In the dimension editor

Metadata and data in Smart View

Aggregated

PBCS

In the dimension editor

Metadata and data in Smart View

Aggregated

Overall code

On-premises

I’ve enjoyed (?) counting the steps and using that as a way to measure complexity.  

In the case of on-premises I say complexity is a function of:
  1. Steps
  2. Tools
  3. Number of scripts

If you’re counting that’s:  12 steps, five tools (LCM’s utility.bat, outlineload.cmd, refreshcube.cmd, MaxL, and calcmgrcmdlinelauncher.cmd), and two scripts.  Add ‘em up and we get 19 discrete objects so long as your definition of object is sufficiently elastic.  Work with me on this – there are worse metrics out there.

The main DemoBatch.cmd code:
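What follows is a sketch of DemoBatch.cmd with the snippets above strung together.  Every name, path, and rule is a placeholder (the Planning bin path shown is the usual 11.1.2.4 default), and utility.bat assumes the all-in-one VM where Shared Services lives on the same box:

@echo off
REM DemoBatch.cmd – monthly actuals load, on-premises edition (sketch, not production code)
set PLANAPP=MyPlanApp
set PLANUSER=admin
set AUTODIR=C:\Automation

REM 1. Back up current application state via LCM
copy /Y %AUTODIR%\Export2.xml %AUTODIR%\Export.xml
call Utility.bat %AUTODIR%\Export.xml

REM 2. Load new metadata and refresh the cube – the Planning utilities run from the Planning bin folder
cd /d D:\Oracle\Middleware\user_projects\epmsystem1\Planning\planning1
call outlineload.cmd -f:%AUTODIR%\password.txt /A:%PLANAPP% /U:%PLANUSER% /I:%AUTODIR%\NewProduct.csv /D:Product /L:%AUTODIR%\outlineload.log /X:%AUTODIR%\outlineload.exc
call CubeRefresh.cmd -f:%AUTODIR%\password.txt /A:%PLANAPP% /U:%PLANUSER% /R /D

REM 3. Clear the current month of Actual
call calcmgrcmdlinelauncher.cmd -f:%AUTODIR%\password.txt /A:%PLANAPP% /U:%PLANUSER% /D:Plan1 /R:ClearCurrentMonth

REM 4. Load the current month's data via MaxL
call essmsh %AUTODIR%\LoadForecastData.msh

REM 5. Aggregate the current month
call calcmgrcmdlinelauncher.cmd -f:%AUTODIR%\password.txt /A:%PLANAPP% /U:%PLANUSER% /D:Plan1 /R:AggCurrentMonth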

The auxiliary Essbase MaxL LoadForecastData.msh code is the same script shown back in step the fourth.

PBCS

There are more steps with PBCS’ epmautomate.cmd primarily because of file management.
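For comparison’s sake, here is the same monthly process sketched as one epmautomate script – again, every name other than the commands themselves is a placeholder, and the login variables are as set earlier:

@echo off
REM Monthly actuals load, PBCS edition (sketch, not production code)
call epmautomate login %PBCSUSER% %PBCSPASS% %PBCSURL% %PBCSDOMAIN%

REM 1. Back up and download current application state
call epmautomate exportsnapshot MonthlyBackup
call epmautomate downloadfile MonthlyBackup

REM 2. Load new metadata and refresh
call epmautomate deletefile NewProduct.zip
call epmautomate uploadfile C:\Automation\NewProduct.zip
call epmautomate importmetadata ImportNewProduct NewProduct.zip
call epmautomate refreshcube

REM 3. Clear the current month of Actual
call epmautomate runbusinessrule ClearCurrentMonth

REM 4. Load the current month's data
call epmautomate deletefile ForecastData.zip
call epmautomate uploadfile C:\Automation\ForecastData.zip
call epmautomate importdata LoadForecastData ForecastData.zip

REM 5. Aggregate the current month
call epmautomate runbusinessrule AggCurrentMonth

call epmautomate logout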

Counting the downloads, deletes, and uploads along with all of the other steps there are:  15 steps, one tool, and one overall script for a value of 17.  

Do we have a winner?

Even though Cameron thinks Math Class is Tough, especially for him, 17 is only two less than 19.  Is that a fair measure of PBCS being just a wee bit simpler than on-premises, or quite a bit simpler, or something else?  The metric I used in the other posts in this series focused on the number of steps in the user interface to complete a task.  Writing an automation script isn’t focused on clicks but instead on commands and parameters.

Perhaps a better way to measure simplicity is to score each platform on the number of utilities involved and the number of possible commands or switches those utilities expose.  Assuming that more is less, there’s a fairly obvious winner and loser.

On-premises

Command                   Parameters
LCM utility               1
outlineload.cmd           57
refreshcube               8
calcmgrcmdlinelauncher    7
MaxL import               12
Total                     85

PBCS

Command             Parameters
login               5
exportsnapshot      1
download            2
deletefile          1
uploadfile          2
importmetadata      2
refreshcube         0
importdata          2
runbusinessrule     2
logout              0
Total               17

On-premises uses five different utilities with a total of 85 possible parameters and switches while PBCS uses one utility with 17.

From an ease-of-understanding, ease-of-writing, and ease-of-managing perspective, PBCS is the clear winner unless you have a strange love for arcane and little-used parameters.

When will we have epmautomate for on-premises Planning?

What’s next?

This is part six of a thankfully six-part series on comparing on-premises versus PBCS administrative task processing in both interactive and now batch form.  To do this I had to document things at what seemed an almost maddening level of detail, but if I want to understand, really understand, something, I simply have to do it.  I now have that basic level of knowledge and I hope that you’ll be able to use these posts as your initial guide to PBCS when (I think we will all be PBCS customers sooner or later) you make the switch.

I haven’t exhausted the subject of PBCS and given the development cadence of the tool I have to wonder if I ever will.  Yes, the title of this blog contains the word Essbase and I hope to get back to that most excellent of databases but it might not be till the fall.  Really.

As for this administration use case I’m not done with this subject.  Look for an expansion (yeah, I know, how could that possibly be, but it will be, I promise) of this at the soon-to-be-here Kscope16 where I’ll present On-Premises Planning vs. PBCS: Common Administrative Tasks Compared, Contrasted, and Recommended with Jason Jones on Monday, 27 June 2016 at 12:45 pm.

For those of you who don’t know Jason, here’s a recent photo of the two of us.  I’m on the right.
http://vignette2.wikia.nocookie.net/uncyclopedia/images/6/6f/Dean_martin_and_jerry_lewis.jpg/revision/latest?cb=20110601232043

Be seeing you.