Want to know The Truth About CPM?

01 May 2016

The Compleat Idiot's Guide to PBCS, No. 11, PBCS and data

Short(ish) and sweet(ish)

The previous post on this was long, maybe too long (Maybe?  Definitely.), and certainly the longest post I’ve ever written if the page count in MS Word is any guide – 56 pages – and that is too much of a good thing. In my defense, the length came from trying to show you all of the different ways PBCS supports loading metadata.  I’m going to try to keep this one a bit shorter, although the same approach of graphical how-tos in both Workspace and the Simplified Interface will continue.

This post is going to show how to follow the next steps in loading Actuals into a forecast:
  1. Clear out the current Actual month just in case there’s something there or if a reload is required.  ←No, I’m not going to do that this time round as this post will again be too long.  You’ll have to wait.
  2. Load Actuals data to the current month.
  3. Aggregate the database ← Ibid.  Sorry, but I can’t do another 50+ page post.

The post after this one (edit:  nope, the one after the one after this one, see above) will show (finally) how to do this in an automated way in both on-premises and PBCS.  If you thought there were differences between the two products thus far, you ain’t seen nothing yet.  

To reiterate, based on the length of this post, it’s data, data, data and no calculations as yet.  Sooner or later…

Sort of a mea culpa about data

Back in part 9 of this series, I wrote:

The on-premises (and PBCS) native data load functionality is so brain-dead it beggars belief.  Really, it’s just awful.  At least the two are at parity.  :)

Well, it is brain-dead.  And it is awful.  But, it does work.  This isn’t quite like my EPMA rant (see here and here and here and I’m sure there are other places) but let’s take a look at the data file formats for PBCS and then you can decide for yourself.  Btw, if you disagree with me, I’d really like to hear why.

An important bit to remember

Usually (always, actually) when I load data into a Planning application, I use an Essbase load rule to do so.  In fact, I’ve never, ever, ever used Planning to load data.  There is no direct access to Essbase in PBCS and hence no load rule.  You must provide a preformatted and pretransformed data file, and it must be a text file unless you load through the FDMEE-lite that comes with PBCS.  I’ll cover that in a future post; as it’s not required, and maybe not even desired, I’m going to stick with loading text files.

I’ll also note that the data files can be in a zip format, but again I’ll cover that in a later post to help somewhat with brevity.

First the sweet, then the bitter

This is what Essbase (PBCS) data ought to look like.

The PBCS sample application Vision’s Plan1 Plan Type has the following dimensions:
  • HSP_View
  • Year
  • Scenario
  • Version
  • Entity
  • Product
  • Account
  • Period

It looks ugly when Yr. Obt. Svt. marks it up, but it’s a very simple dimensional mapping of data, with each column representing a dimension and the last column holding the data value.

The tab-delimited record:
"BaseData"    "FY15"    "Forecast"    "Working"    "410"    "P_000"    "4110"    "Jul" 2000

Also note that all field values except the data value must be enclosed in double quotes.  This is not necessary when loading through an Essbase load rule in on-premises Planning.  Why both tab delimiters and double quotes?  Damfino.
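
If you ever need a script to spit this format out rather than typing it by hand, a minimal sketch along these lines will do it.  To be clear, this is purely an illustration and not anything PBCS ships: Python, the output file name, and the hard-coded Vision members are all just for show.

# A sketch only: write one record in the tab-delimited Essbase-style format
# shown above, with every dimension member wrapped in double quotes and the
# data value left bare.
members = ["BaseData", "FY15", "Forecast", "Working", "410", "P_000", "4110", "Jul"]
value = 2000

# plan1_essbase_load.txt is a made-up name; call the file whatever you like.
with open("plan1_essbase_load.txt", "w") as f:
    f.write("\t".join(f'"{m}"' for m in members) + f"\t{value}\n")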

That bitterness?  Is it cyanide?

I might be resorting to a teensy-weensy bit of hyperbole, but this format just offends my Essbase, relational, and I’m-sure-something-else sensibilities.

In any case, this is what PBCS data looks like and it oughtn’t.

The Planning data format is…different.  There’s a concept of a row and a column, which in this case are all level 0 Products and July respectively.  The other data elements are in the Point of View, but they’re actually analogous to the dimension columns above.  In addition, the Plan Type is defined in the record as well.

NB – This record is comma-delimited, but it could have used a tab or some other character if desired.

The comma-delimited record:
P_000, 2000, "BaseData, FY15, Forecast, Working, 410, 4110", Plan1

Why would Oracle Hyperion do this?  The data ends up sandwiched between the row (which ought to be more than one dimension in just about anyone’s definition of a fact table row) and the column and the “POV”, which btw requires all of the other dimensions – thus fulfilling the role of the row columns except it’s got a different name – to be defined within double quotes.  Why oh why oh why is this considered “better” or even “necessary” or “not-put-out-of-its-miserable-life-by-a-bullet-to-the-back-of-its-head-whilst-kneeling-at-the-edge-of-a-lime-filled-pit”?  ←This last bit might be a tad over the top, but take it as read that I’m not a fan of seemingly needless complexity.
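
For what it’s worth, getting from the Essbase-style record above to this layout is at least mechanical.  Here’s a rough sketch, again Python and again purely my own illustration rather than anything Oracle provides, hard-coded to the Vision Plan1 dimensions used in this post.  One assumption to flag: I’ve put the column member (Jul) in a header line rather than in each record, per the exported file layout.

# Illustrative only: rebuild the Planning-format record from the Essbase-style
# record shown earlier.  Product is the row, Jul is the column (assumed to land
# in the header line rather than in each record), the remaining dimensions are
# jammed into the quoted Point-of-View, and the plan type is tacked on the end.
essbase_record = '"BaseData"\t"FY15"\t"Forecast"\t"Working"\t"410"\t"P_000"\t"4110"\t"Jul"\t2000'
plan_type = "Plan1"  # the Data Load Cube Name

fields = [f.strip('"') for f in essbase_record.split("\t")]
hsp_view, year, scenario, version, entity, product, account, period = fields[:-1]
value = fields[-1]

header = f"Product, {period}, Point-of-View, Data Load Cube Name"
pov = ", ".join([hsp_view, year, scenario, version, entity, account])
record = f'{product}, {value}, "{pov}", {plan_type}'

print(header)
print(record)
# Product, Jul, Point-of-View, Data Load Cube Name
# P_000, 2000, "BaseData, FY15, Forecast, Working, 410, 4110", Plan1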

Seriously, Gentle Reader, tell me why it’s so and I’ll write a panegyric about why you’re right and I’m not.

Just in case you think I’m telling porkies, that Planning file format comes out of PBCS’ own self:

No, one cannot put more than one dimension into the Row slice definition.  Believe me, I tried.  

So I call mea culpa on my mea culpa – the Planning data format is brain-dead.  I’m sure someone, somewhere, somehow uses it but I cannot for the life of me understand why.

Step-by-step-by-step

Putting aside that rant – and it was a good one, wasn’t it? – let’s play the how-does-a-Compleat-Idiot-do-this game.

Before and after the fact

NB – I’m only going to show this once as loading the data will have the same impact each time.

July is empty

Here’s my oh-so-simple load of data.  Note that member P_170 aka Tricorders is part of the form.  Yes, I do actually write these posts in serial.

July is loaded

There it is.  Rinse, lather, repeat through the three approaches to loading data that follow.

Workspace

As with metadata, Workspace is dead easy and fast.  Do you see a pattern here?  And no, it’s not because I’m used to Workspace; it’s because there are just so few steps compared to the SI.

Navigate to Administration->Import And Export->Import Data From File.

Pick your poison.

I’ve selected Native Essbase and pointed it to Plan1:
Ta-da, we’re done.

Did that in five (give or take) steps.  Will I match that with the SI?  When Sus scrofa domesticus becomes airborne should be your guess.

Simplified Interface

By now you should be familiar with the Console as this is where most processing occurs.

Get a job but not this time

One thing to note is that I’m not explicitly creating a job this go round as I’m directly loading the file.  Regardless, I’ll be viewing the success or failure of this load via the Job console.  Weird.

Putting aside weirdness, I’ll go to the Actions button and select Import Data.

You can see that I’ve already created jobs to import data.  No matter, I’m going to create a new one:

Oooh, I’m at the bit where I define the file I want to import.  Let’s go pick it.  

What could possibly go wrong?  I’m loading the Essbase format file.

As I am a bit on the cautious side, let’s validate that file before actually loading it.

Hmm, something isn’t working out.

The Plan Type name isn’t there.

The problem is I picked an Essbase file format where a Planning file is required.  Bugger.

Let’s try that again but this time select the right file type.  Duh.  

Sparing you the act of getting back to the file, this time click on Import, having selected Essbase as the Source Type.

Ahhh, that tastes better.

This time it’s all good although the message…
…is just wrong, Gentle Reader, as I checked to see if the data loaded; it did.  Oh well, what would life be without a scintilla of uncertainty?

So long as we’re counting steps, I make that seven.

Get a job you bum!

This time round we’re going to load the data file to the PBCS Inbox.

Our first stop naturally then is the Inbox/Outbox Explorer.

And then Upload the file to the Inbox:

Pick the file.

Actually upload it

There’s the data file:

It worked:

A job that doesn’t exist can’t be run.  Create it by beginning an Import Data action.

Then Create it:

Does this look familiar?  It should as it’s the same screen we used for the local file import.

Change the location to the Inbox, set the format to Essbase, and name the file with care.  It must match the file name in the Inbox.  Then click on Save as Job.

Give it a name

We now have an import data job:

Close out, back to the Console, select Jobs, and now click on Schedule Jobs.

Once in the scheduler, click on Import Data, Run Now, and then Next.

Pick the Inbox file and then click Next.

This is your very last chance.  Since you are that devil-may-care sort of a chap that I know you to be, throw caution to the wind and click on Finish.

Lo, the mighty Infernal Machines begin their terrible work:

It works!

Let’s see what happened by clicking on the title, “Load Forecast Data”.  Oh, PBCS has a sense of humor.  Zero records?  Really?  Really?  Really?  No, not really.  

In fact it loaded:

Whew.

What have we learnt?

Other than that I could keep this down to a mere 37 pages?

More importantly, there’s an interesting observation for all to see with regard to effort:  traditional Workspace requires less effort than a local file load in the Simplified Interface, which in turn requires less effort than a job in the Simplified Interface.

Using this post as a guide, loading a single data file into a single plan type takes, by environment:
  • Workspace – five steps
  • Simplified Interface Local – seven steps
  • Simplified Interface Job – 25 steps

Hmmm.

The other thing you have figured out is that I’m not fond of the Planning data file format, but in comparison to an “old-fashioned” user interface that is five times as easy (using the rough number of clicks as a measure of effort) as the super-duper new one, that dislike ain’t much cop.

We’re almost at the end of the follow-the-monkey approach.  All that’s left are the Business Rules that clear and aggregate the data.  I will note a handful of differences between on-premises and PBCS in that post.

And after that, I’ll finally tie this all together using both PBCS’ EPM Automate as well as traditional on-premises scripting.

Be seeing you.

5 comments:

Steve said...

Nice post, but I can't seem to load even the simplest data file!

I am trying to load a Native Essbase file to PBCS and am getting an error, so I assume that my file format is junk. The (currently) single-row input file looks like this (tab-delimited):

"KRO" "Actual" "Final" "D100" "W1" "Day7" "FY16" "S1101" "P5" "A91003" 0.00

The error I get is: Member "��"" not found in Essbase database. Error code: 3303 Record: ��"

All of the members above do exist in the DB. I have also tried space- and comma-delimited formats with the same result.

Any thoughts as to what about the format is bad?

Cameron Lackpour said...

Steve,

Have you tried using PBCS' data export to see if you get *exactly* the same members?

I know the format is ugly (and I am slowly coming round on it for text and smart lists, although I still think it's your-baby-is-ugly ugly).

Here's the official error message documentation:
https://docs.oracle.com/cd/E17236_01/epm.1112/eis_sysadmin/frameset.htm?apas06.html

There are a bunch of hits on google as well:
http://john-goodwin.blogspot.com/2009_12_01_archive.html
https://community.oracle.com/thread/2316035?tstart=0

I think you really don't have one of those members in the app.

Regards,

Cameron Lackpour

Steve said...

Hi Cameron,

After performing an export using the PBCS functionality, I got this file (now with two rows of data in it), which I am able to load using the Planning data format:

Account, Actual, Point-of-View, Data Load Cube Name
A91003, 4.25, "KRO, Final, FY16, P5, W1, Day7, D100, S1101", Ops
A91003, 5.75, "KRO, Final, FY16, P6, W1, Day7, D100, S1101", Ops


But it's still not Essbase format and yes, it's gawdugly. Although I can format the load in this way, I sure would like to use the Essbase format, but I cannot for the life of me figure out how!

Steve said...

One more thing: I absolutely do have all of those members in the app (which the above clearly shows). It ain't my first rodeo, my friend (and I mean that quite kindly)!

Steve said...

Ugh.

Never mind - found the problem.

The text file was encoded as UCS-2 LE BOM instead of ANSI.

I shall now proceed to skulk away in shame.

Thanks for your timely responses!