Want to know The Truth About CPM?

18 December 2015

The Compleat Idiot's Guide to PBCS, No. 2. – Philip Hulsebosch's blog

Yeah, I’m stealing the content

Well, not really, more like giving a little publicity to a blog that most of this particular blog’s readers do not read.  

It’s by Philip Hulsebosch.  For those of you who don’t know Philip, he’s an accomplished EPM practitioner who has presented multiple times at Kscope and is a prolific writer of white papers.

For some odd reason (Dear non-anglophone readers, please enjoy the following example of cultural arrogance) this blog is written in German instead of English.  Given that my German mostly consists of types of beers and descriptions of delicious meat dishes, I have been happy to turn to Google Translate to get a rather good account of what Philip has been up to.

For the record, I understand that people not in the States speak other languages.  Once upon a time, even yr. obt. svt. could speak Vlaams.  Disuse has ended whatever ability I had to communicate in anything other than English.  Although some would argue that even my native tongue isn't something I've really mastered.

For those of you who are not fully monolingual, here’s Philip’s post in the original German:

And here it is in the Queen’s English:

Or to shorten that:  PBCS for a week.

What’s it all about, Alfie?

Do I have to put in that Michael Caine reference again?  Just listen to Cilla Black sing.  What a fantastic movie.  But I digress.  Again.

Moving beyond British movies from the 1960s, Philip covers:
  • Data center location
  • Planning user roles
  • Reports
  • Data export (quite a bit on this)

What I particularly like about Philip’s approach is that he is coming at these functions as someone who has experience in on-premises Planning but is a Cloud n00b.  Sort of just like me and perhaps you as well.

I have high hopes for many more posts from Philip.  You should too.

Be seeing you.

07 May 2015

Stupid Programming Trick No. 24 – Sub Vars to tables via Calc Scripts

Introduction

This is another in a series of good ideas that have been shamelessly stolen from, er, inspired by Martin Slack.  I’m learning an awful lot from Martin; I’m not entirely sure that he is from me.  Such are the fortunes of war.

The problem

I needed a way to get the current month value from an Essbase substitution variable so I could do all sorts of weird and wonderful data integration things with it.  How to do it?


The idea that springs to mind for most of us (or at least yr. obt. svt.) is some way of getting it out of MaxL using the display variable syntax, piping it to a log file, and then facing the somewhat unpleasant task of reading in a text file, parsing it via some language (VBScript, Perl, PowerShell, etc.), and then getting the result into a table.

Ugh.

Stolen glory

Martin said something to the effect of, “What if you could just get it from a calc script?”  A calc script?  Hmmm.  Calc scripts, even using DATAEXPORT, write data, not metadata.  Or do they?

The Stupid Trick its own self

DATAEXPORT is an eponymous command – it exports data.  But what if there were no data to export?  Even more useless, right?  Here’s the Stupid Trick bit:  this is exactly what is needed – no data is fine because we’re going after metadata.  So how does metadata come from data?  And how does it get into a relational table?  Read on, Gentle Reader, and all will be revealed.

It’s all in your head in four parts

Part the first – the outline

Using my VFEDITWWW, I’ve added members that, at least in this blog, can never have data.  To wit, I have added the members No Measures, No Product, and No Market.  You will likely have different requirements unless you are The Beverage Company.


Part the second – the calc script

This member combination now gets exported to disk.  Ah, you say, yet another worthless bit of programming drivel from Cameron.  You’re wrong, for once, to think that.

Looking at the code below, note the structure of the export, with the DataExportColHeader option set to Year.  When that is combined with an export of the Members That Can Never Have Data, look at what gets exported.
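Here’s a minimal sketch of what such a calc script might look like, assuming the current month lives in a substitution variable I’m calling &CurrMonth – the variable name and the local export path are my placeholders, not gospel:

SET DATAEXPORTOPTIONS
{
    /* Put the Year dimension (the months) across the columns */
    DataExportColHeader "Year";
    DataExportOverwriteFile ON;
};

/* The Members That Can Never Have Data plus the current month sub var.
   No data rows exist for this combination, so only the column header
   - the evaluated current month - gets written to the file. */
FIX ("No Measures", "No Product", "No Market", &CurrMonth)
    DATAEXPORT "File" "\t" "C:\Tempdir\Current_Period.txt";
ENDFIX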

Part the third – the output
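Under the assumptions in the sketch above, the export file contains nothing but the column header row for the one Year member in the FIX:

"Oct"

No data rows, no other headers – just the evaluated substitution variable.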

Brilliant, isn’t it?  And I mention that because the brilliance is all Martin’s.

I now have an output file with just the current month and nothing else.  This is a lot easier than trying to run a MaxL script and then parse the output.  But how to get that into a relational data store?

Part the fourth – the SQL

The last problem now facing this approach is getting the value of Oct into a relational database.  Again, there are all sorts of complex and sophisticated ways of doing this – ODI or SSIS spring to mind – but there is a significantly easier way, one that can be done entirely within a SQL query.

Say hello to your new friend BULK INSERT

In SQL Server (in Oracle there is a concept called external tables that does much the same thing), there is a command called BULK INSERT.  While it is mostly used for doing just what it sounds like – a bulk insert of lots of data – in fact it can be used to bring any amount of data into a database from a file.  And if it can do that, and there is a file that has the current month, connecting the dots is pretty straightforward.

Here’s what my code looks like:
BULK INSERT Current_Period FROM
    '\\epm11123\C$\Tempdir\Current_Period.txt'
    WITH
    (
        CODEPAGE = 'ACP',
        DATAFILETYPE = 'char',
        FIRSTROW = 1,
        LASTROW = 1,
        FIELDTERMINATOR = '\t',
        ROWTERMINATOR = '\n'
    )
One consideration is that there’s no way to map the columns in the data source to the target.  Instead, BULK INSERT is positional, e.g., the first field in the source file is loaded into the first field in the target table.  This is different from INSERT INTO…SELECT FROM syntax, which allows the field order to be altered.

I used UNC naming not because I had to – C:\Tempdir would have worked just as well since the file sits on the SQL Server box – but because BULK INSERT reads the file from the server’s point of view, so a remote file must be referenced by a UNC path rather than a mapped drive letter; it’s best to just get in the habit.   Also, this continues the tradition of Martin + UNC so what’s not to like?

Lastly, I defined the file by specifying the character set, the file type, the first and last rows, the tab field terminator, and DATAEXPORT’s line feed row terminator.  For giggles I threw in the first and last row settings, but really everything other than the CODEPAGE and DATAFILETYPE settings is just belt and braces.

What does it look like?

Pretty good, actually, but there is a fly in the soup:

Do you see the issue?  The text file’s contents are "Oct" – double quotes included – and that is exactly what BULK INSERT imports.

There’s a pretty good chance that the double quotes are superfluous; the REPLACE function ought to take care of that:
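Here’s a minimal sketch of that cleanup, assuming the table’s single column is also named Current_Period (the real column name may well differ):

-- Strip the double quotes that DATAEXPORT wraps around the member name
SELECT REPLACE([Current_Period], '"', '') AS Current_Period
FROM Current_Period;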

There’s only one more issue to fix

This query will be run more than once, and that’s a problem as can be seen below when I ran the query twice:

The issue is that the query as it stands doesn’t clear out the table before inserting the contents of Current_Period.txt.  My brilliant data integration code expects one and only one record, so perhaps it is not all that brilliant and will thus go KABOOM when two records are encountered.  Even worse, this table could end up with multiple different values, e.g., if this were run in both September and October the first record would be Sep, which isn’t the current month.

The solution is to run a TRUNCATE TABLE first, then the BULK INSERT, and finally the SELECT REPLACE command as below:
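Put together, a sketch of the full sequence looks like this (again assuming the single-column table described above):

-- Clear out any prior run's value
TRUNCATE TABLE Current_Period;

-- Load the one-line export file produced by the calc script
BULK INSERT Current_Period FROM
    '\\epm11123\C$\Tempdir\Current_Period.txt'
    WITH
    (
        CODEPAGE = 'ACP',
        DATAFILETYPE = 'char',
        FIRSTROW = 1,
        LASTROW = 1,
        FIELDTERMINATOR = '\t',
        ROWTERMINATOR = '\n'
    );

-- Strip the double quotes and return the current month
SELECT REPLACE([Current_Period], '"', '') AS Current_Period
FROM Current_Period;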

Now no matter how many times I run this, I get one and only one result.  Huzzah!

So what do we have?

A combination Stupid Trick that pulls out metadata, not data, via DATAEXPORT and then painlessly imports an evaluated Essbase Substitution variable into a table with nary a hint of MaxL script output or complex data import to SQL.  Take this query and stick it into a stored procedure, an ODI variable or procedure, etc.  The sky is the limit with this and it is dead easy.
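For instance, a hypothetical stored procedure wrapper (the procedure name and, as before, the column name are mine, not sacred) might look something like this:

CREATE PROCEDURE dbo.Get_Current_Period
AS
BEGIN
    SET NOCOUNT ON;

    -- Same three steps as above: clear, load, clean
    TRUNCATE TABLE Current_Period;

    BULK INSERT Current_Period FROM
        '\\epm11123\C$\Tempdir\Current_Period.txt'
        WITH (CODEPAGE = 'ACP', DATAFILETYPE = 'char',
              FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');

    SELECT REPLACE([Current_Period], '"', '') AS Current_Period
    FROM Current_Period;
END;

A simple EXEC dbo.Get_Current_Period then hands the current month to whatever calls it – a SQL Agent job, an ODI procedure, you name it.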

Thanks again, Martin.  You keep thinking them up, I keep stealing them, and world+dog gets the benefit.  

Be seeing you.

06 June 2013

What Kscope13 sessions am I looking forward to, part three

Introduction

I’ve already covered the Essbase and Planning side of the house in parts one and two of this series. What about the foundation for these tools? You know, the data and metadata that make EPM applications, well, right, accurate, useful, etc. Without good data (and metadata) all we EPM practitioners have is a pretty design and bad numbers. Hmm, I may have written a book (or at least a chapter) about this.

Happily, ODTUG agrees with me (Or do I agree with them? Whatever) and they have an EPM Foundations and Data Management track. This is the third in the series of sessions I am looking forward to, and if history and culture are any guide, three matters: there is the Rule of Three, the Page Cavanaugh Trio’s version of The Three Bears, and perhaps most famously, “All Gaul is divided into three parts”. In other words, three is an important number. And so is this track.

Whoops, before I continue, I should mention that I am friends with, or at least acquainted with, most of the people below. Am I just shilling for them? Nope, these are good sessions. Of course you should pick what you want to attend – this is simply what I am interested in.

Note – you will notice that the name Cameron Lackpour is absent from the below sessions. This is not some kind of false modesty as I do think that the SQL session I am presenting is at least worth considering. I will cover that later in the week – this block is for everyone not named Cameron.

EPM Foundation Data Management sessions (stolen right off of Kscope13.com) with my comments

Integrating PeopleSoft with Planning -- How Amerigroup Replaced HAL with ERPi & FDM

Roger Balducci , Amerigroup
When: Jun 26, 2013, Session 14, 1:45 pm - 2:45 pm
Topic: EPM Foundations & Data Management - Subtopic: FDM
In this session discover how Amerigroup replaced a black box HAL process with FDM & ERPi to load their Planning application. During this session the presenter will review the decision to use ERPi in conjunction with FDM to enable drill through to PeopleSoft. The session will highlight the automation that provides flexibility to process data for the entire company or a single business unit. Finally the session will demo the drill-through capabilities that ERPi provides - not only to the ledger but also to the subledger.

A project that replaces HAL? Death to HAL, I say, death to HAL. That product caused me grief, pain, and psychic discomfort every time I brushed up against its mediocre spaghetti diagrams. Yes, yes, I know, it has its defenders, but they’re wrong. Proof? Come see this presentation. You’ll feel clean afterwards, like after a mountain hike whilst eating a York Peppermint Patty. Or am I confusing that with Irish Spring soap and cheesy faux-Irish dialogue? Anyway, see how HAL got the coup de grace. And cheer.

Stump the Experts - Hyperion EPM Panel

Natalie Delemar , Ernst & Young
When: Jun 24, 2013, Session 4, 1:45 pm - 2:45 pm
Topic: EPM Foundations & Data Management - Subtopic: No Subtopic
TBD

Intriguing content there, yes? :) I have no idea who is to be on this panel but Kscope always does these right with a good mix of freewheeling questions and lots of opinion. You know, the things consultants are afraid to say to their clients lest they be bounced out on their noggins. Ouch. But no clients (other than the punters in the seats) in this. I am looking forward to it.

ODI - Tips and Tricks to Build Better Integrations

Matthias Heilos , MindStream Analytics
When: Jun 25, 2013, Session 8, 11:15 am - 12:15 pm
Topic: EPM Foundations & Data Management - Subtopic: ODI
Oracle Data Integrator (ODI) is a powerful data integration suite which allows you to build integration processes with enormous productivity gains over conventional tools of the same breed. In this session you will gain insights into how ODI works and what you should consider to build master-class integrations. Learn about tricks and tips on architecture, Knowledge Module optimization, migration, flexible load processes, and many other areas that your organization should be aware of when working with ODI.

Matthias knows ODI. Really, really well. If he does a session on it, it’ll be good.

Think Outside the Box - New Ideas in Financial Data Management

Matthias Heilos , MindStream Analytics
When: Jun 26, 2013, Session 11, 8:30 am - 9:30 am
Topic: EPM Foundations & Data Management - Subtopic: No Subtopic
Are you wondering if you could manage your (financial) data more efficiently? Often, the answer is yes. In this session you will see how other organizations found unusual ways to improve their financial processes. Looking at the bigger picture often allows discovery of new solutions to either automate more effectively, increase transparency, or improve your ability to adapt to change faster. Join this session to learn about unconventional ways to use Hyperion products and OBIEE.

See the above on my opinion on Matthias’ knowledge level and presentation skills. Also, I get sort of obsessed about data, so this ought to be really interesting.

How to Turn New Recruits into Oracle EPM Support Gurus

Nicholas King , Google
When: Jun 26, 2013, Session 16, 4:15 pm - 5:15 pm
Topic: EPM Foundations & Data Management - Subtopic: Infrastructure & Essbase Exalytics
Oracle EPM requires a knowledgeable team to provide production support due to its criticality as a service. Typically skill levels vary in the team as resources are pulled from other areas or are required to support multiple services. Consequently, the need for infrastructure training is a recurring theme in an organization. This presentation covers how to explain Hyperion and its architecture in a way to fully engage new support staff. It includes getting started with EPM modules, logs, and troubleshooting.

This is an interesting session and more of an Organizational Psychology topic than a technical one – I find these fascinating. Some of my clients understand how to do this and some…do not. It’s not easy finding the right person or persons, and as the EPM stack becomes more and more sophisticated and complicated, the profile of the right EPM administrator has changed. And a bad admin = a bad system (yes, I have all too painfully experienced this), so this ought to be an informative session.

How Windstream Leverages Essbase Analytic Link to Increase Their Analytic Capabilities
Alexander Ladd , MindStream Analytics
Co-presenter(s): Jennifer Moline, Windstream Corporation
When: Jun 24, 2013, Session 5, 3:00 pm - 4:00 pm
Topic: EPM Foundations & Data Management - Subtopic: No Subtopic
Windstream Corporation utilizes Essbase and Essbase Analytic Link to unlock analytic value from their HFM application. This presentation will detail how Windstream implemented Essbase and Essbase Analytic Link with drill through to transactional detail via FDM. See the architecture, the data flows, and how this environment was built, and hear the lessons learned about Essbase Analytic Link.

Once upon a time at a mildly disagreeable client, I worked on an HFM to Extended Analytics project, which was somewhat amusing as I could, and still can, barely spell H-F-M, but it wasn’t my choice and I met some great people along the way; nothing builds teamwork like adversity. In any case, this was back in the System 9 (remember that?) days and the link was…crude. Wouldn’t it have been great if there had been a HyperRoll module that pulled data out of HFM in real time and then pushed it to BSO Essbase as a quasi-transparent partition via a CDF? Why yes, it would have, and I wish it had existed back then. And now it does, so come see how it works. Although, if I had had this, would I have made those friends? One of life’s imponderables.

Exalytics - An Apples to Apples Comparison

Daniel Pressman , nTuple
Co-presenter(s): Cameron Lackpour John Booth, Tim German
When: Jun 25, 2013, Session 6, 8:30 am - 9:30 am
Topic: EPM Foundations & Data Management - Subtopic: Infrastructure & Essbase Exalytics
This session will be a panel discussion highlighting the results of our apples to apples test comparing an Exalytics-based solution to a comparable machine in the Amazon Cloud. These tests encompassed ASO and BSO; they covered data loads, BSO calculation, and ASO Aggregation; and finally multi-user performance tests of BSO Planning Rules and ASO Queries. Given the breadth of this testing, some of the results are applicable to non-Exalytics solutions (assuming you have "lots" of CPU and/or memory).

Okay, full disclosure here – I am involved in this one but I am but a supporting player. This is a really interesting session. And yes, I broke my own rule but if I can’t do that on my own blog, where can I?

FDM to ERPi - Upgrade/Migration Strategies & Considerations

Anthony Scalese , Edgewater Ranzal
When: Jun 24, 2013, Session 2, 9:45 am - 10:45 am
Topic: EPM Foundations & Data Management - Subtopic: FDM
Not the fish, anything but the fish! The FDM product is nearing the end of its life. This session will introduce you to FDM 2.0, aka ERP Integrator (ERPi). The session will begin with a technology overview of the new product - architecturally and functionally. The session will continue on to explore key features/changes from FDM. The session will explore strategies, techniques, and key watch-outs for migration from your existing FDM application. Finally the session will discuss best/leading practices for ERPi implementation and go-forward maintenance.

The fish, the fish, oh the humanity! Hmm, something about that doesn’t make sense. Anyway, FDM has always struck me as somewhat old fashioned. And we all know that ODI is all kinds of awesome. And now we see FDM replaced with…ODI in a wrapper. This ought to be interesting. And I’m glad I never learnt how to be an FDM consultant. :)

The New and Improved FDM -- Financial Data Quality Management Enterprise Edition 11.1.2.3

Richard Wilkie , Oracle Corporation
When: Jun 24, 2013, Session 3, 11:30 am - 12:30 pm
Topic: EPM Foundations & Data Management - Subtopic: FDM
The FDM EE 11.1.2.3 release combines the deep functional flows of classic FDM with the deeply integrated technical aspects of ERP Integrator. This new solution allows customers to build deep integrations directly against popular ERP's like E-Business Suite, Peoplesoft and SAP while taking advantage of the functional workflow required in any end user driven data quality process. This session will deep dive into the changes that were made, how they benefit new and existing customers, and typical use cases across the EPM product family.

Whoops, there I go slagging off Oracle’s (well, Hyperion’s which should actually be Upstream’s) fine products and yet I suggest that you attend an Oracle session on FDM EE. If you want to know where the product is going, and what it’s all about, I can’t think of a better person to listen to.

Are you crying Uncle yet?

That is nine, count ‘em, nine (you will note that this is divisible by three; I stick to my themes come Hell or high water) different EPM Foundations and Data Management sessions. Is that the sum total of these sessions at Kscope? Absolutely not. In fact there are 17 on offer. And that is just a subset of all the EPM sessions. Kscope has content, content, content.

The next blog post will be The Truth About EPM Reporting at Kscope.

Be seeing you at Kscope13.