Want to know The Truth About CPM?

21 November 2013

Stupid Excel Linked Workbook Trick

Introduction

This post, I’m afraid, will be about frustration.  Specifically, how Excel ate a day of my life, a day of my project timeline, and a day of my project budget.  And also how Excel added yet more grey to my head (oh well, at least I have hair).  Read on, gentle reader, and yr. obdnt. srvnt. will show you how to keep Excel from driving you nuts.

Introduction to the introduction

My saga of frustration, wonder, and pain all came about because of a simple request to load data from an Excel workbook into Essbase.  A lock and send (this was to a BSO cube) is amongst the most basic tasks we can do with Essbase.  And I managed to get it wrong.  Sigh.  Some days are just like that, right?  But the strange thing was that I tried really, really, really hard not to get that send of data wrong.  In one of those cruel and ironic twists of fate, that very act of trying to get it right is what doomed me.  Like I wrote, it was just one of those days.

Some Excel functionality you may not be aware of

Just about everyone knows how linked Excel formulas work.  For the one or two of you who do not, here’s a recap of this supposedly straightforward process.  Don’t be put off by the simplicity of the examples – they illustrate the issue quite nicely.

Step 1 – create a source workbook

Easy peasy, lemon squeezy.  This is about as simple as a file can get with a single cell of data and a label to identify it.  Oh, Source.xls, I love thee.

Step 2 – Create a target workbook

It couldn’t be simpler, right?  Create a formula and link cell C2 in Target.xls to C2 in Source.xls.  Oh Target.xls, I love thee too.
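For reference, the linking formula in Target.xls cell C2 looks something like this (assuming the data lives on Sheet1 – the sheet name is my assumption; when Source.xls is closed, Excel expands the reference to the full file path):

```
=[Source.xls]Sheet1!$C$2
```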

Step 3 – Create a second target workbook

Huh?  Why would anyone do this?  Well, in my case because having multiple workbooks linked to the same source file is highly germane to this Frustrating Excel “feature”, so please humor me.  And really, it isn’t that unreasonable to have multiple target workbooks linked to the same file.  So Target_2.xls, love is still in the air.



Now we have three files, all with the 762 value.  What oh what oh what could the big deal be?

Step 4 – Close one of the linked workbooks

This couldn’t be easier – simply close Target_2.xls.  Now two files are open, Source.xls and Target.xls, both still with a value of 762 in cell C2.

Pay attention, pay attention, pay attention

This is important:  Target_2.xls also has a value of 762 in cell C2, because linked data in Excel is stored in two ways: as a formula link and as a locally cached value.  I know that’s confusing, but that feature is how Excel can display information in target workbooks when the source isn’t available.

Stick with me folks, it is about to get very interesting.

Step 5 – Change the value in the source workbook

In Source.xls, change cell C2’s value from 762 to 303.  Now both Source.xls and Target.xls have a value of 303 in cell C2.  

And of course Target.xls changes as well as the link is live between the two workbooks.  All is as it should be, right?


Step 6 – Close both workbooks

By closing both Source.xls and Target.xls I now have Excel with absolutely nothing open.  

If I were to think about the values in the workbooks, they would be as follows:

Workbook name      C2 value
Source.xls         303
Target.xls         303
Target_2.xls       762

Step 7 – Open Target_2.xls

When I open Target_2.xls, I see that the old value of 762 is still in C2.  And that an automatic update of links has been disabled.  Thank you, Excel 2010, for not exposing me to Bad Things.  Alas, not the Bad Thing I am about to (finally) demonstrate.

Step 8 – Open Target.xls

Before I do this, a quick quiz.  Yes, it’s Professor Lackpour and I’m here to make your life a living Excel-themed hell.  What value do you think will be in Target.xls?

Did you answer 303?  Good.  That is consistent with what we saw with Target_2.xls – the last linked value (note again that automatic link updates are disabled) displays.

Now the trick question:  What value do you think is in Target_2.xls?

Step 9 – The Stupid Excel Trick

Would you believe not the value we saw a moment ago?

Well, you should, because THE VALUE HAS CHANGED.  

Let me repeat the steps:
  1. I opened Target_2.xls.  Its link-disabled C2 cell value was 762.
  2. I opened Target.xls.  Its link-disabled C2 cell value was (and is) 303.
  3. I toggled back to Target_2.xls (both workbooks are in memory; I simply Ctrl-Tab switched between them).  What value did I find in cell C2?  Not 762, but 303.
  4. Insert three hours of dumbfounded, flustered, and frustrated experimentation trying to prove that what I saw couldn’t possibly be so.  

What I saw, and what you now see, in fact is possible.  Not only possible, but documented behavior.  And we geeks complain about Oracle not fixing bugs in a timely manner.  This one is a whopper and Oracle Corporation, Inc.’s fingerprints are nowhere to be found.

Excel, I HATE you.

Explaining the Stupid Excel Trick

A slight exercise of google-fu brings up this Microsoft support document:  Multiple workbooks with links to the same source update each other


At your own discretion, throw in a “You bastard!” interjection when you read the article; repeat as necessary.  Those were my exact words, at least.

Let me direct you to the really interesting bit with emphasis added by yr. obdnt. srvnt.:
If you open multiple Excel workbooks that all contain links to the same source workbook, and you choose not to update the links, the links in all of the workbooks will be updated to the values that are stored in the last workbook that was opened.

And this other interesting bit, again with a bit of emphasis:
This behavior occurs because link information is stored in an internal link table. This link table is shared among all open workbooks in that instance of Excel. The most recently opened workbook provides the link table that is in memory at any specific time.

There is a workaround, although not one that anyone who uses the Essbase Excel add-in is likely to use:
To work around this behavior, open the workbooks in different instances of Excel. As a result, each workbook maintains its own internal link table and stored values.

So just to summarize:  Multiple target Excel workbooks that link to the same closed source workbook will reflect the data value in the last opened target workbook.  
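If it helps, here’s a toy model of that shared link table.  This is purely an illustration of the documented behavior – it is emphatically not Excel’s actual implementation:

```python
# Toy model of Excel's per-instance link table (an illustration only,
# not Excel's actual code): one table is shared by every workbook
# open in a single Excel instance.
link_table = {}

def open_workbook(name, stored_link_values):
    """Opening a workbook loads its locally stored link values into the
    shared table, clobbering whatever any earlier workbook put there."""
    link_table.update(stored_link_values)

# Open Target_2.xls first: its stored value for Source.xls!C2 is 762
open_workbook("Target_2.xls", {("Source.xls", "C2"): 762})
print(link_table[("Source.xls", "C2")])  # 762

# Now open Target.xls: its stored value is 303 -- and because the table
# is shared, Target_2.xls now displays 303 as well
open_workbook("Target.xls", {("Source.xls", "C2"): 303})
print(link_table[("Source.xls", "C2")])  # 303
```

The last workbook opened wins, which is exactly the Stupid Excel Trick.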

Some would call this a data integrity problem.  

Tim Tow would tell you that’s why Excel is so dangerous.  I agree.

Explaining what happened to me, and potentially you

I received an updated data workbook via email.  This updated workbook had new values that were linked to multiple source workbooks that I did not have access to.

To be sure (oh, the irony) that I was dealing with the right data set, I opened an older, differently named, version of the workbook.

As both the first and the second workbook were linked to the same closed (and not even available) source files, the first workbook’s linked data values were updated by Excel to the second workbook’s linked data values.

I then sent the data values in the first workbook (which were now wrong because they reflected the values in the second workbook) into Essbase.

Thank you, Microsoft Excel.

Arrrgh.  But one must be philosophical about things like this lest they drive you mad.  It’s an Excel issue, not an Essbase issue, and not even a Cameron issue (well, mostly).

I don’t think I can say it better than my friend Dave Farnsworth, “Linked workbooks are nothing but trouble and always have been.  Shoot anyone who uses them and save yourself the pain.  No one who knows Excel will vote to convict.”

Be seeing you.

01 November 2013

ODTUG Board of Directors election results, 2013

Introduction

Most (Some?  None?  All?) of you know that I am on the ODTUG board of directors.  Of course you, the ODTUG membership, are responsible for that and I am obliged for your confidence in me.  It sounds like a cliché, but serving as a board member really is a tremendous privilege and responsibility.

Board members who care, a lot, are the hallmark of a successful user group and I am pleased to state that your ODTUG board of directors is filled with some of the most caring, passionate, and smart people I have ever met.  How can I drive this home?  Sometimes, when I meet with people professionally, I think to myself (and yes, I do have the conversations with myself and yes, it worries me too, as it should), “Hmm Cameron, this lot definitely isn’t top drawer, you know more than these bozos do combined.”  Other times, I have flashes of clarity that say, “Cameron, you’re the bozo.  By a long shot.  Try not to embarrass yourself too much.”  And so it is with the ODTUG board.

With that little bit of probably too close for comfort humor, you hopefully get the idea:  the ODTUG board is made up of really smart people whose service has helped produce the very best Oracle user group extant.  

Introducing…

With that, I hope you understand the difference that individuals can make to the board and, by extension, to ODTUG itself.

And that brings us to the point of this blog post – the just-announced results of the ODTUG board of directors election.

You haven’t kicked us out yet

Four of us did not run for election, although our moment of doom comes up next year.   Additionally, two incumbents were reelected.  I’m honored to count as my peers:
  • Tim Tow
  • Barbara Morris
  • Tim Gorman
  • Martin D’Souza (incumbent, reelected)
  • Monty Latiolais (incumbent, reelected)

And the new ones

But of course new people were elected as well.  They are:
  • Natalie Delemar
  • David Schleis
  • Mia Urman

Natalie is a fellow EPM geek whom I met for the first time at Kaleidoscope (as it was then called) 2008 – the very first ODTUG conference that included the EPM community.  Oddly, I had “conversed” with Natalie through web forums but didn’t even know her real name as she went by DaveJonJoshMom.  That ability to actually physically meet people is one of the things that makes ODTUG so special.  Natalie and I went on to work together on the Hyperion SIG, the EPM/BI content selection process, and of course as coauthors in Developing Essbase Applications.  I’ve been after Natalie for years to run for the board because I think she will bring great insight and energy to the job.  I am therefore beyond pleased to see that she made it and will help represent the not entirely small EPM/BI Kscope contingent at the board level.  

David has actually been on the board already, as an appointee filling in for a board member who did not complete his term, so I have worked with him before.  He is equally passionate about the technologies he represents and additionally has an atrocious taste in shirts (wait, is it me who commits the fashion faux pas?) that simply must be admired. David also has a sense of humor that embraces the ridiculous and is an engaging speaker.  In short, I am really looking forward to working with him again.

I’m afraid that I don’t know Mia, but my ignorance is in no way a reflection of her qualities.  A quick google-stalk of Mia shows that she is the president of her own firm, an Oracle ACE, and an expert in ADF and Forms.  In short, she is the typical highly accomplished ODTUG board member (I am overlooking yr. obdnt. srvnt.’s somewhat inexplicable inclusion in that august group of geeks).  I look forward, a lot, to working with her as a board member.

The ones that are going away

Lest you think that all of my Sammy Davis Jr.-like encomiums are mere window dressing, please know that I am beyond sorry to see Bambi Price, Jerry Ireland, and John King roll off the board.  All of them were (and are) fantastic contributors to ODTUG.  I know that they will remain active in ODTUG.

This is what democracy looks like

You, the ODTUG membership (only full ODTUG members can vote – if the above election fills you with joy, great; if instead you are filled with a rage inchoate, next year’s election is your chance to throw us bums out, but only if you are a fully paid member) are responsible for the results of the election.  The board represents your interests, your needs, and your technologies.  I hope you are pleased with the results of your voting – I know I am.

Be seeing you.

15 October 2013

Row-driven Focused Aggregations in Planning, finally

It couldn’t be done, but has

And apparently by someone quite a bit more inventive than me just playing around.  Who would have thunk it?  Not me.  In fact, quite explicitly not me, as I have told multiple clients (I think two, maybe three) that this very thing cannot be done.  Sigh.  But thanks to this thread on Network54, it most definitely has been done.  All thanks must go to Christian M. (who oh who oh who is Christian M.?  Christian, if you’re coming to Kscope14 I want to buy you a beer.  Several, in fact.) as he figured this out.  So this is most definitely not something I have invented – I am just expanding on Christian’s discovery.

Why this matters

Yes, I know, I haven’t exactly spilled the beans on what this is, but let’s set some background so you can understand why this is so important.  

The technique that I call Focused Aggregations is a way to read a Planning form and only aggregate the bits of the hierarchy that are germane to the form.  Calc Mgr/HBR can read POV and dropdown values from Run Time Prompt (RTP) variables and “fake” the aggregation process (really, it just aggregates as if a CALC ALL or AGG (dimnames) had been issued, but only for the relevant hierarchies, all the way to the top of the dimension).  This approach is fast because instead of aggregating an entire Entity or Product or Employee or whatever dimension, much of which is not going to be relevant to the data entered on the form, only the hierarchies that matter get aggregated to the dimension top.

This approach is fast, fast, fast (it isn’t magic, but only aggregating the bits you need versus an entire dimension can make “too big” BSO Planning applications perform acceptably) and I have written about it here in Hyperion Business Rules, Calculation Manager, and Dodeca.  The Dodeca post is (or was) an important one within the field of Focused Aggregations because focused row-based aggregations are possible as Dodeca can provide the row contents.  That’s what a Planning form cannot do.  Or at least it couldn’t, until now.  Hopefully you now understand why I am so excited about this.

Why Calculate Form isn’t good enough

Some (like Celvin Kattookaran, my coopetition on this subject) have suggested that maybe the in-built <Form Calculate> might do the same thing – and then why oh why oh why would anyone bother with focused aggregations?  Here’s why that approach doesn’t make the grade; Celvin mostly hit on the reason in his post.

A bit of review

Just in case you’ve forgotten what a <Form Calculate> Business Rule looks like, here it is in 11.1.2.3’s form designer.  Note that this is not a rule that you can look at (maybe it’s buried in the EAR file that makes up Hyperion Planning but it is not something you can casually examine).

Note that I’ve moved the <Calculate Form> rule over to the “Selected Business Rules” listbox and ticked the “Run on Save” box.

Going into Smart View, I can open up a form (note the dynamic user variable selection of E032 – that will become important a little later on), and enter in values of 100 and 200 into the January column.

When I click on Submit, I see the following in the form:

Very nice, right?  There’s 300 in E032.  Is this enough?  Nope.  

The problem with Form Calculate

The issue with Form Calculate is that it only aggregates the descendants of the topmost member in the form.  As E032 was the form’s top of the Entity dimension, all of the intermediate members were aggregated.  That is a Good Thing.  But what Form Calculate does not do is aggregate the ancestors of the topmost member in the form.  Note that E03 does not equal 400 (and it should).  And TotalGeography does not equal 1,173 (ditto).

Here’s what the totals ought to look like:

So unless you live in some weird Planning world where looking at top level numbers isn’t important, I’d say that Form Calculate is nice, but not enough.  Remember, for anyone looking at data above E032, it’s as if the Form Calculate didn’t happen.

The traditional way of aggregating the form dimension

It’s really simple:  

But remember, Entity could be a really big dimension with lots of lower level blocks and thus  pretty slow performance.  Are you sure you want to aggregate that entire dimension?

Enter Christian M.’s most awesome hack

What would be nice would be a way to read the dimension, or even more to the point the topmost form member and then aggregate the descendants (like Form Calculate) and the ancestors (which Form Calculate cannot do).  Until now, this is what could not be done.  

Christian’s hack still can’t read the Planning form’s row definition, but what it can do is read a Planning User Variable.  And if a form’s row definition is driven off of a Planning User Variable, then you have squared the circle because that Planning User Variable can (somehow) be read into Calculation Manager.  Huzzah!  Hurrah!  Hot diggity-dog, etc.

Here’s the technique in summary:
  1. Create a Planning User Variable
  2. Create a Planning form that uses the User Variable to drive the row contents
  3. Create a Calculation Manager Variable that is explicitly assigned the Planning User Variable
    1. The Calc Mgr Variable type must be Members, not Member, and no, I don’t know why, and yes, Members versus Member doesn’t make sense but why are we complaining about a great hack?
  4. Use the Calc Mgr Variable in the standard Focused Aggregation code approach
  5. Deploy the rule and assign it to run on save in the form.
  6. Hope that Oracle doesn’t “fix” this hack.

Let’s now go through this in detail.

Create a Planning User Variable

Simply go to Administration->Manage->Variables and create a new variable.  I like to sort of mnemonically name them, so my Entity dimension variable name will be varUVEntity.

Create a Planning form that uses a User Variable

I simply modified a form in the Planning sample application and in the row definition selected the descendants of the user variable:

The My Region variable comes with the sample application.  I don’t like spaces in variable names and I don’t like variables that aren’t immediately identifiable as such, hence the name “varUVEntity”.

I wanted the user to be able to change the row contents, so I made sure that the user variable was selected in the Other Options form design tab and I selected “Enable dynamic user variables”.  That last tick box means that the user can change the form rows as required so long as he stays within his Entity dimension metadata security.

So far all of this is bog-standard Planning.  We’re about to enter the undocumented hack area.

Create a Calculation Manager Variable that reflects the Planning User Variable

This is the bit where I really have to take my hat off to Christian.  I am pretty sure I would have come up with this approach in, oh, never.

Go into Calculation Manager’s Variable Designer and create a variable of type Members.  Assign (you are going to have to remember the name of your variable as it isn’t going to pop up in Calc Mgr) the name of the Planning User Variable to the Calc Mgr Members variable’s Default Value.  Be sure to prefix it with an “&” symbol.  And yes, that is sort of like an Essbase Substitution Variable, but isn’t.
You will also need to select a Dimension Name (in this case, Entity), make sure it’s a RTP type, and even enter in a RTP Text message.  Remember, it’s the Default Value with an ampersand before the Planning User Variable that does the passing of the Planning User Variable’s value to Calc Mgr.
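Pulled together, the variable settings look something like this – a sketch of the dialog’s fields, not a screenshot, and the RTP Text is whatever message you like:

```
Variable name:   varBRUVEntity
Type:            Members
Dimension:       Entity
RTP:             Yes
RTP Text:        Select an Entity
Default Value:   &varUVEntity
```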

NB – It seems logical to me to use a Calc Mgr variable type of “Member” instead of “Members” and in fact it works, but when using that variable type a web browser will force a dialog box.  Oddly, Smart View does not do this.  
Don’t try to find reason where I suspect none exists – none of this makes any sense.  I’ve done the pig-headedness bit for everyone, so just go with Members.

Write a Focused Aggregation Business Rule in Calculation Manager

Calc Mgr script


Please disregard the lack of variables in the FIX statement – you can add that in later.  The important bits are the @IDESCENDANTS and @ANCESTORS statements.  

In the first sub FIX statement, the code fixes on a Calc Mgr variable called varSegments.  This is just a standard RTP Calc Mgr variable.

What’s inside the FIX({varSegments}) is what’s important and heretofore impossible.  The code does an @IDESCENDANTS of the Planning user variable varUVEntity as passed to the Calc Mgr variable varBRUVEntity to mimic the Form Calculate functionality.  The next line issues an @ANCESTORS calculation of that Planning User Variable via a Calc Mgr variable to aggregate up to the top of the dimension.  Ta da!  We have just hit the super cool hack.

The second FIX statement uses that same Planning User Variable as expressed in a Calc Mgr Variable but now selects the relevant bits of the Entity hierarchy and then does an @ANCESTORS aggregation of the Page dimension Segments.
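Putting those two FIX statements together, the script looks roughly like this.  This is a sketch reconstructed from the description above (and as noted, the outer FIX still needs its Scenario/Version/Year variables added):

```
FIX ({varSegments})
   /* Mimic Form Calculate: aggregate the row Entity and its descendants */
   @IDESCENDANTS({varBRUVEntity});
   /* The hack: aggregate that Entity's ancestors to the top of the dimension */
   @ANCESTORS({varBRUVEntity});
ENDFIX

FIX (@IDESCENDANTS({varBRUVEntity}))
   /* Aggregate the Page dimension Segments to the top */
   @ANCESTORS({varSegments});
ENDFIX
```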

Double Ta-Da!  We have just squared the row-based Focused Aggregation circle.

Calc Mgr rule

Stick the script into a rule by dragging and dropping it into the rule itself.  

NB – If you do not have Flash 10.x installed on your machine (in my case, a Windows 2008 R2 VM) you won’t be able to do this.


When you do this, be sure to go to the Variables tab and tick the “Is Hidden” boxes for the variables.  This will ensure that neither RTP pops up when the rule is run on form save.

If you click on the Script tab you will see something kind of interesting:

Instead of varBRUVEntity, the default value, &varUVEntity – which does look awfully Essbase Substitution Variable-ish – shows up.

If in fact you were to go to the Errors & Warnings tab and have Calc Mgr analyze the script, you would see this bogus warning:

Interesting, isn’t it?  But it still works.

Deploy the rule

But don’t validate

As Celvin pointed out in his blog post, don’t bother trying to validate the rule.  That error message in the Errors & Warnings tab will surface:
So just deploy.  And fix the code yourself.  C’mon, if you’re writing code like this you can do syntax checking in your head, right?

Deploying is easy

Hook it up to the form

We are back to bog standard Planning.  Simply attach the deployed Calc Mgr rule to the form:
As always, make sure it runs on save, uses the members on the form to drive the POV/Page dimensions, and hide all RTP prompts.  That’s it.  Let’s now go prove that it works.

Hmmm, I love pudding

The proof of the pudding is in the eating.  

Of course we cannot eat a Planning form, but we can:
  1. Change a dynamic user variable
  2. Enter in data values
  3. Save the form
  4. Prove that it aggregates all the way up to the tippy-top of the dimensions

Let’s do just that.

Change a dynamic user variable

Select E01_101 as an Entity member.
Clicking on OK and then doing a form refresh will change the form selection:

Enter values and Submit Data



Review the fully aggregated data


And here’s the data aggregated all the way up to the top of the hierarchy in an Essbase ad-hoc retrieval:

Magic.  Thanks again, Christian.

So why is this faster?

It’s all in the amount of data that Essbase has to slog through in its aggregation.

I turned SET MSG DETAIL on in the script (and got rid of the CALC DIM of Accounts and Period as those should be fully dynamic – ask Oracle why the sample app isn’t set up that way, not me):
[Tue Oct 15 12:10:53 2013]Local/SampApp1/Consol/hypadmin@Native Directory/1792/Info(1012672)
Calculator Information Message:

Total Block Created: [0.0000e+000] Blocks
Sparse Calculations: [2.2500e+002] Writes and [8.5500e+002] Reads
Dense Calculations: [0.0000e+000] Writes and [0.0000e+000] Reads
Sparse Calculations: [1.0303e+006] Cells
Dense Calculations: [0.0000e+000] Cells

[Tue Oct 15 12:10:53 2013]Local/SampApp1/Consol/hypadmin@Native Directory/1792/Info(1012550)
Total Calc Elapsed Time : [0.059] seconds

I then created a calc script that used an AGG for Entity and Segments:
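For reference, that comparison script was something like this (a sketch – any FIX on Scenario/Version/Year is omitted here, just as in the focused rule):

```
SET MSG DETAIL;
AGG ("Entity", "Segments");
```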

And ran it with the following results:
[Tue Oct 15 12:11:38 2013]Local/SampApp1/Consol/hypadmin@Native Directory/360/Info(1012672)
Calculator Information Message:

Total Block Created: [0.0000e+000] Blocks
Sparse Calculations: [2.8690e+003] Writes and [1.0772e+004] Reads
Dense Calculations: [0.0000e+000] Writes and [0.0000e+000] Reads
Sparse Calculations: [1.3137e+007] Cells
Dense Calculations: [0.0000e+000] Cells

[Tue Oct 15 12:11:38 2013]Local/SampApp1/Consol/hypadmin@Native Directory/360/Info(1012579)
Total Calc Elapsed Time for [BRAggTst.csc] : [0.434] seconds

Some analysis

Agg              Focused          Variance    Variance Percent
0.434 seconds    0.059 seconds    -0.375      -86.406%

Nothing to sneeze at here – that’s a decrease of 86% in the amount of time the calculation took to run.  Worthwhile when applied to your real world application?  I think so.
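For the arithmetic-inclined, the variance above is simply:

```python
# Timings taken from the two SET MSG DETAIL runs above
agg_time = 0.434      # seconds, plain AGG of Entity and Segments
focused_time = 0.059  # seconds, focused aggregation

variance = focused_time - agg_time        # negative = focused is faster
variance_pct = variance / agg_time * 100  # as a percent of the AGG time

print(round(variance, 3))      # -0.375
print(round(variance_pct, 3))  # -86.406
```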

Let’s look at the number of transactions.
Approach                                      Sparse writes and reads       Sparse cells addressed
AGG of Entity and Segments                    2,869 writes, 10,772 reads    13,137,000
Planning User Variable Focused Aggregation    225 writes, 855 reads         1,030,300

There’s an enormous difference in the number of writes, reads, and overall sparse cells.  No wonder a Focused Aggregation is so much faster.

Where do we go from here?

Well, as far as this blog post is concerned, hopefully nowhere.  This is quite the post (19 pages in MS Word which is where I draft all of my blogs – that’s quite enough of the Cameron Effect for one day) but I think it was important to showcase how absolutely brilliant Christian’s hack is, and how important it can be for the Focused Aggregation approach which can now finally apply that technique to rows.

I am going to reach out to the Calculation Manager Product Manager (Sadly, I really only have the emails to a couple of people within Oracle – I lay that at the feet of being an independent with just about zero impact on sales.  Or maybe I am just lazy and don’t work the contacts the way others do.) and beg, beg, beg that this functionality not be removed.  Ever.  In fact, it would be super if this functionality was formally incorporated into the tool so we didn’t have to play games with default values and ampersands.  I am going to guess most of the functionality is already there within Calc Mgr so I am hoping we’ll see it soon, officially supported.

One last time – Christian, thanks man.  You have no idea how happy this has made me.

Be seeing you.