Want to know The Truth About CPM?

24 June 2013

Kscope13, day 0

Introduction

Sundays at Kscope are always Symposium day.  I understand that there are multiple other Symposiums, one for each track.  Of course, being an Essbase geek, the only thing I care about is EPM, and here I am, sitting in the EPM Symposium, listening to Oracle management talk about the latest and greatest in the EPM space.

And the content is…

Sorry, can’t tell you.  That’s the deal – come to the symposium, do NOT blog about what you hear.  Safe harbor statements abound (so they’re going to tell us the future, but reserve the right to change their collective Oracle mind) as well as requests to NOT take photographs, NOT blog about what we learn.  Those are the rules.


So that means if you aren’t at Kscope13, you don’t know what is coming in the EPM space.  And I’m not going to (indeed, I cannot) tell you.  Stinks, doesn’t it?  The way to solve this is to come to Kscope14, and every year thereafter.  That’s my plan for career futures.  :)


What I can tell you is there is some very interesting news about Essbase.  It’s stuff we have all wanted for a long time.  Again, sorry if you are not at Kscope but we attendees are not allowed to tell you more about it per Oracle’s request.  So yes, a big, big tease.


And some very interesting news about Planning.  I have wanted this functionality for approximately forever, or at least since 2002 (ah, Planning 1.5, or maybe 1.1 – I no longer remember but oh my goodness you were buggy).  Alas, I again cannot tell you much of anything.  In fact nothing.

Conclusion

Are you gathering that Kscope gives you information that you cannot get anywhere else?  This is important stuff that defines the future of what we do and no other user conference delivers this information.


The brutal sadist in me sort of enjoys telling you that there is all sorts of cool stuff on offer at the Sunday symposium.  The caring nurturing inner Cameron wishes you were here.  Square the circle, bind the wound, cut the Gordian knot, for goodness’ sakes stop me from tortured metaphors and just make sure you are here next year at Kscope14 so I don’t have to keep on telling you about all the cool things I (and everyone else at Kscope13) know, and you don’t.  See, I really am the caring, nurturing sort.


Be seeing you at Kscope13 (oh, we happy few) and Kscope14.

28 August 2012

Bringing the Cloud down to the Ground and no, the result is not fog, part 2


Getting it onto your local machine

Still with me?  You must be, else you wouldn’t be reading this.  I think.  Anyway, you’ve dutifully read part 1 of this two-part series and converted an AWS EC2 instance to a VMWare Workstation (in my case, at least) instance.  So now the question is – how oh how oh how do you get a BIG file off of the Cloud?

This is not a hard step but it takes a long time because of the size of the files.  Note that if you get rid of those media files or if you have a faster connection than my DSL line it isn’t quite so painful.

Compress it to make it fast(er)

Although I suppose you could do this without using the installed 7-Zip compression program, I can’t see why you would.  

Dan did a bunch of experiments with getting the best performance out of 7-Zip, and found that the Lzma2 method with a 24-bit word, a 256 MB block, and 8 threads was the fastest combination of settings.

I am not going to show the individual steps for doing this.  You can just take the defaults on the compression, but then it will take longer, be bigger, and be slower on the download, so I suggest trying Dan’s settings.
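If you would rather script it than click through the 7-Zip GUI, my reading of Dan’s settings maps to roughly this command line.  The archive and folder names here are mine, not anything official, so treat this as a sketch:

```shell
# Sketch of Dan's settings on the 7z command line (names are made up):
#   -m0=lzma2   LZMA2 method
#   -mfb=24     word size of 24
#   -ms=256m    256 MB solid block size
#   -mmt=8      8 threads
#   -v4480m     split into DVD-sized volumes (metavero.7z.001, .002, ...)
7z a -t7z -m0=lzma2 -mfb=24 -ms=256m -mmt=8 -v4480m metavero.7z "C:\VMs\Metavero\*"
```

The `-v4480m` switch does the DVD-sized splitting discussed below in one pass, which saves a separate step.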

Downloading the compressed files

I’ve done this four different ways (What, you think I know all of this stuff before I write it down?  If only.  Nope, I have to blunder through the options until I get to the right answer.):
  • Transferring the file(s) from the AWS instance to an FTP server and then downloading them (this got me a nastygram from my internet provider because of size and download threads which ate up the box for everyone – whoops, so firmly rejected on my part).
  • Using Terminal Services to transfer the files.  Just follow the three steps below in the TS client.  You must set this before you connect to your instance.  Your local drives will then look like mapped drives from AWS.
  • Setting up an FTP server on your AWS instance and downloading from there.  Note that you will need to open up the default port of 21 in your AWS Security Group/firewall.
  • Using AWS’ S3 – this is the way I did it.  It’s a little confusing at first, but Cloudberry Explorer makes it dead simple.
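For what it’s worth, the S3 round trip can also be scripted with Amazon’s command line tools rather than a GUI like Cloudberry.  A sketch – the bucket and file names are placeholders I made up, not anything from my actual setup:

```shell
# Make a bucket, push the split archive volumes up from the AWS instance,
# then pull them down on the laptop.  All names here are placeholders.
aws s3 mb s3://vmware-metavero
aws s3 cp . s3://vmware-metavero/ --recursive --exclude "*" --include "metavero.7z.*"
aws s3 cp s3://vmware-metavero/ . --recursive
```

Remember that S3 charges for storage and downloads either way, GUI or command line.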

Using Simple Storage Service (S3)

Given the other three approaches (two really – I would avoid the first approach of sending the file, or files if you split them up, to an external FTP source), why use S3?

I used 7-Zip both to compress the VMWare files and to split them up into DVD-sized (4.7 gigabyte) files.  I have had (Oracle e-delivery is where I’ve experienced this before) issues with my wonderful (can you tell it annoys me?) DSL connection.  What happens is that the files get downloaded, look like they’re valid, but in fact are corrupt.

S3 allows me to redownload parts of my VM that fail.  It’s a pain to do that, and slow, and it costs (S3 charges you for downloads – pray that you have a better internet connection than me) but it is better than downloading everything over and over again.  Also, it gives me (and you, too) a chance to learn a new technology.  I should mention that John Booth mentioned S3 as an approach – as always, he has some really great ideas and I am at least smart enough to listen to them.  :)

I am not going to provide a detailed tutorial on S3 but suggest that you read here.  I essentially treat S3 as a super-easy-to-set-up Cloud FTP server that does not require me to configure the AWS instance’s IIS settings.  Note that most Cloud-based services such as OpenDrive or Box limit the size of uploaded files.  My provider, OpenDrive, has a limit of 100 megabytes per file, so with my 4.7 gigabyte files, I really had to come up with another approach.

If you are interested in tools other than Cloudberry, have a read of this thread.

If you are interested in using Cloudberry, read this very nice tutorial.

NB – You can also use the AWS console’s S3 component to move the data around – that’s what Dan did.  It is as simple as opening up the AWS console on your AWS instance (sort of like a mirror facing a mirror) and then right clicking inside your bucket like the below:

I did the same thing via Cloudberry to a S3 bucket I called VMWareMetavero.

To get it onto my laptop, I installed Cloudberry again and then downloaded it to my external hard drive.  

Alternatively, I could have just gone into S3 via the AWS console and done this:

Cloudberry made it a little less painful so I went that way.

That’s it.  

Unzipping the files (or even combining them)

Once you have the download to the Ground completed, install 7-Zip on your laptop if you haven’t already.  And once that is done, decompress.  Again, this takes a while.
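If you split the archive into volumes as described above, note that you only point 7-Zip at the first one.  A sketch, with my made-up file names again:

```shell
# Extract starting from the first volume; 7-Zip finds the .002, .003, ...
# pieces automatically and recombines them as it extracts.
7z x metavero.7z.001 -oC:\VMs\Metavero
```

There is no separate “combine” step – the recombining happens as part of the extract.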


Avoiding the Blue Screen of Death (BSOD)

At this point you have:
  1. Removed the media files from c:\media unless you really, really, really want them.  You might, but probably not.
  2. Converted the 11.1.2.2 AMI to a VM Workstation VM.
  3. Compressed that VM using 7-Zip’s Lzma2 method with a 24-bit word, 256 MB block, and 8 threads.
  4. Downloaded it through S3 (or whatever method you prefer, but that’s the easiest) to your laptop.

So you’re all ready to go, right?  Uh, no, because here’s what happens when you try to fire up that VM in Workstation.  Auuuuuuuuugggggggghhhhhhhh!!!!!!!!!!   It’s the Blue Screen of Death!!!!!!!!!!!

And not just the BSOD, but a BSOD that will immediately reboot Windows so that you have a lovely endless loop of BSODs.  Fun times, fun times.

At least it’s fast – it took me about five tries (so we are talking 30 minutes of reboots) to get that screenshot with Snagit.  It will flash very, very, very quickly on your screen.  Is there a cure?  You betcha.

The cure for the Blues

If you want it all in one succinct (but not terribly well explained or at least I couldn’t follow the directions until I did it three times) thread, read this on the VMware support forum.  I’m going to show it to you step by step and will make it a tiny bit less painful.

Just to be completely up front, I am taking everything I read in that thread and putting pictures to it – the brains behind figuring this out belong solely to ivivanov and leonardw who figured all of this out.

The issue is the RedHat SCSI Bus Driver (really all of the Red Hat services, all of which start with “rhel”) despite the storport.sys message in the BSOD.  Who would believe that an error message is misleading or doesn’t give all of the information you need?  Why I would, and so should you.  The RedHat services are part of the EC2 Amazon offering and simply don’t work (why I know not as I am no hardware expert, but I can certainly attest to their super-duper not working).  It blows up Windows 2008 R2 on VMWare real good.

Richard Philipson tried out part 1 and pointed out (yep, some people actually read this blog, thankfully otherwise this is the most involved echo chamber ever) that the ec2config service is superfluous (and causes a wallpaper error on startup) and that those RedHat services are “a set of drivers to permit access to the Xen virtualized hardware presented by Amazon EC2 to the guest operating system.”  It makes sense that without Xen under the covers, there is no Xen virtualized hardware.

NB -- There is a separate intermittent error in VMWare if you have more than two cores to your laptop.  If you are on a machine with more than two cores, you may get a multiple processor error (a different BSOD).  If that is the case, you should set the number of processors to 1 and the number of cores to 2 in VMWare.
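If you would rather make that processor change in the VM’s .vmx configuration file directly instead of through the Workstation settings dialog, the one-processor, two-cores arrangement looks something like this (my sketch – verify against your own .vmx before editing):

```
numvcpus = "2"
cpuid.coresPerSocket = "2"
```

Two virtual CPUs total, two cores per socket, works out to the single processor with two cores described above.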

Step 1 – Getting into Boot Options

Open up your nifty new VM in VMWorkstation and start it.

As it starts up hover your mouse over the VMWorkstation window and press Ctrl+G.  You need the VM to get control of the keyboard/mouse as you are going to be holding down the F8 key.  If your Windows host has control, F8 will toggle a selector bar in the VMWorkstation application and you will not be able to get into Advanced Boot Options.  It’s a total Pain In The You Know What.

No worries if you don’t get it the first time as the Metavero VM will crash very quickly indeed.  :)

A VMWorkstation bar will pop up telling you to install VMWare Tools.  Ignore that for now, but you will need to install the Tools later.

Step 2 – Go through the System Recovery Options
First select a keyboard.

Then log in.  The cool thing is this is just like the AMI – username Administrator, password epmtestdrive.

Step 3 – Run a Command Prompt and then Regedit
Select Command Prompt.  

You will then run Regedit from x:\windows\system32.

Here it is:

Click on HKEY_LOCAL_MACHINE and then File->Load Hive.

Navigate to c:\windows\system32\config and select the SYSTEM file and click on Open.

NB – This must be on the C: drive, not the X: drive (that’s the repair drive).  Here’s what x:\windows\system32\config looks like.  Note the two SYSTEM files.  You do NOT want this one as it does not contain the services.

What you do want to see is this and it’s only available off of the C drive:

Type in “p2v” into the Key Name field and click OK.

Navigate to HKEY_LOCAL_MACHINE\p2v\ControlSet001\services.

For each of the rhelfltr, rhelnet, rhelscsi, and rhelsvc services, click on the service in question and select the Start parameter.


In the below screenshot I have selected rhelscsi.  Note that there is some discussion on that VMWare thread that only rhelscsi needs to be disabled.  I’ve tried that and sometimes it works and sometimes it doesn’t.  My suggestion is to disable all four.

Right click on Start, select Modify, and change the value from 0 (or whatever) to 4 which disables the service.

From this:
 

To this and click OK:
 

Note the value of 4:

Do it again for rhelfltr, rhelnet, and rhelsvc.  All four of these services need to be disabled.

With all four services disabled, select the key again.
 

Then select Unload Hive.
 

Select Yes in the confirmation dialog box.
 

Minimize Regedit, and then select Shut Down.  You will then restart the VM.
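As an aside, the whole hive edit can also be scripted from that recovery Command Prompt with reg.exe instead of clicking through Regedit.  A sketch of the equivalent commands – same caveat as above about using the C: copy of SYSTEM, and I clicked through Regedit myself rather than running exactly this:

```bat
rem Load the offline SYSTEM hive from the real Windows drive (C:, not X:)
reg load HKLM\p2v c:\windows\system32\config\SYSTEM

rem Set Start=4 (disabled) for all four Red Hat services
for %s in (rhelfltr rhelnet rhelscsi rhelsvc) do reg add HKLM\p2v\ControlSet001\services\%s /v Start /t REG_DWORD /d 4 /f

rem Write the changes back and release the hive before rebooting
reg unload HKLM\p2v
```

The unload step matters either way – it is what flushes the changed values back to the SYSTEM file on disk.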



Start the Metavero (that’s what I named it) VM back up.

Ta da, you are now running (and not BSODing) Windows 2008 R2:

In VMWorkstation, VM->Send Ctrl+Alt+Del to get the login.  

NB – You can also hit Ctrl+Alt+Insert to get the same thing.

And there you are:

And finally (at least on my laptop, there is a fair amount of time before this all boots up):

Don’t forget to activate Windows

You have three days to apply that valid key for Windows 2008 R2 Datacenter.

No, I am not going to give you a valid license key.  But I have given you a lot of ways to get one, all of them legal.

Interestingly, Microsoft Action Pack (to which both Dan and I belong) does not cover 2008 R2 Datacenter (bummer for those of us with a MAPS subscription), but with MAPS you get TechNet for free (after signing up for it), and through TechNet one can get a valid 2008 R2 Datacenter key.  Whew.  Thanks to Dan for figuring this out as it was not exactly straightforward.

If you are on the fence between MAPS and TechNet, note that TechNet Professional only costs $349 for the first year, but MAPS (you do have to qualify as a partner) gives multiple internal-use licenses.  You decide.

So what do we have?

Well, in the case of both Dan and me, slightly different outcomes.

In Dan’s environment, running on a 24 gigabyte laptop, he has a pretty awesome EPM installation.

In my world, running on an 8 gigabyte laptop, I pretty much have an unusable EPM installation because my host laptop simply doesn’t have enough horsepower.  Although I do have a nice blog post.  :)

Based on our tests, you simply must have a 16 gigabyte laptop to make this work acceptably.

What’s the right choice – the Cloud or the Ground?

As I wrote above, if you don’t have a multiprocessor, 16+ gigabyte laptop, with plenty of disk space, you can pretty much forget this approach.  A valid Windows 2008 R2 Datacenter key would be nice as well.

Assuming that you do have the above, is the Ground worth it?  I think the answer, despite the pain, effort, and time (I’m pretty sure this must be a world record for me for the length of a single blog post) is, “Yes, absolutely!”

You get a professionally installed EPM instance that is right there on your laptop/PC without the AWS charges.  That’s pretty cool.  And you (and Dan and I) got to use, and learn, a whole bunch of tools that are pretty darn useful.  All I can say is that I will be getting a Dell Precision 4600 or 4700 in the near future.  That’s putting money where my mouth is.

I hope you enjoyed the multiple hacks.

And a big thanks to Dan Pressman and Richard Philipson for helping out with this monster of a post.

04 August 2012

11.1.2.2 is here, with a twist


With very little thanks to me

Anyone who has read this blog knows I am infrastructure challenged.  Thankfully, there are at least a few others who are better/smarter/more determined than I.  In this case, the glory (is that damning with faint praise given the previous sentence?  Nope, what he’s done is pretty glorious.) is John Booth’s as he did all the hard work making everything in 11.1.2.2 function on Amazon Web Services – I was merely his test monkey.  Go check out John’s excellent blog here to get his take on it.


Yup, that’s right, first he gave us 11.1.2.1 and now 11.1.2.2.  Pretty awesome, eh?  I will note that there are many infrastructure consultants out there and no one, and I mean no one, with the exception of the two Johns (Booth and Goodwin), gives anything – be that AMIs, knowledge, tips, suggestions, or whatever – away.  Think about it – there are lots of blogs and posts from application developers but those two are the only infrastructure-related contributors to the EPM community.  Given how difficult the EPM installs are (John and I went around and around on a few niggling issues with his AMI so yes, even he gets a little challenged by the install process) we owe them both, but in this case particularly John Booth, a big round of thanks.

Beyond a new version, what’s on offer?

Oh, a few things such as:
  • Essbase
  • Essbase Studio
  • Planning
  • HFM
  • FDM
  • ODI (my favorite)
  • ERPi
  • Financial Reports

In other words, just about everything that is commonly known as the Oracle EPM core.  Like I wrote above, pretty awesome, eh?  This is an expansion of what was there in his 11.1.2.1 AMI.



The only thing that isn’t there is Smart View, and that’s because it requires a copy of Excel.  I’ve already posted how to use your local copy of Excel to go against Essbase – I will be installing Excel itself on the server as it is easier but note that you need a license to do that.

One big difference

Whereas the 11.1.2.1 AMI ran as a normal set of services, this AMI uses the compact deployment option.  That means that EPM takes up way less memory than before and in turn that leads to some interesting options.  See the last section of this post for a teaser.

So where do you go?

Back to business – the AMI is ami-ef933886.  Read the 11.1.2.1 AWS tutorial post on how to launch it – everything is the same as before except of course the AMI id.
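If you prefer scripting to clicking through the AWS console, launching it can be sketched with Amazon’s command line tools.  Everything here except the AMI id is a placeholder – pick the instance type, key pair, and security group per the 11.1.2.1 tutorial:

```shell
# Launch John's 11.1.2.2 AMI; everything but the image id is made up.
aws ec2 run-instances --image-id ami-ef933886 --instance-type m1.xlarge \
    --key-name my-epm-key --security-groups epm-security-group
```

The console route works just as well – this is only for the automation-minded.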

What’s the twist?

Merely this:
Take a good hard look at that.  What do you see?  Intrigued?  :)  I will be posting about my experiences with this in a little bit.

22 May 2012

KScope12 session highlight No. 2

Introduction

Ah, Oracle Data Integrator.  From the first time I opened up your nearly incomprehensible documentation (much better today, oh, you should have seen it with “SUNOPSIS” sprinkled all over it and nary a mention of Essbase, Planning, or HFM to be found) I knew you were special.  Powerful, obscure, chock full of technologies I had (and still have) almost no idea about, and with really unhelpful error messages.  How could I not love it?  And I do, although it sometimes drives me almost to tears of frustration.  When I get it (whatever “it” is) working, I want to dance a reel.  That is pretty sad, isn’t it?  No matter.  Excelsior!

ODTUG’s KScope is here to help you dance

I’m not the only one who has seen the power and possibility of ODI.  Kscope12 has 10 presentations on the subject including one from yr. obdnt. srvnt. and if you have any interest (and if you don’t, you should) in this powerful and exciting tool, I strongly suggest you make time in your convention schedule to attend a few of the sessions.

Here’s a list of the ODI sessions that I consider important:
Extending ODI - Hyperion Automation and Error Trapping
Opal Alapat, TopDown Consulting
When: Tuesday June 26, Session 11, 5:00 pm - 6:00 pm
Topic: Essbase - Subtopic: Data Management

ODI 11g for OWB Developers
Holger Friedrich, sumIT AG
When: Wednesday June 27, Session 15, 1:45 pm - 2:45 pm
Topic: Business Intelligence - Subtopic: Business Intelligence

Financial Data Quality On-demand - Seamlessly Integrate Hyperion FDM and Oracle Data Integrator
Matthias Heilos, MindStream Analytics
When: Tuesday June 26, Session 10, 3:45 pm - 4:45 pm
Topic: Essbase - Subtopic: Data Management

Oracle Data Integrator - Best Practices That You Should Be Aware Of
Matthias Heilos, MindStream Analytics
When: Wednesday June 27, Session 14, 11:15 am - 12:15 pm
Topic: Essbase - Subtopic: Data Management

Implementing ODI and OGG 11g to Maximise Performance and Scalability in a BI Enterprise
John Jeffries, Spirotek Limited
When: Tuesday June 26, Session 10, 3:45 pm - 4:45 pm
Topic: Business Intelligence - Subtopic: Business Intelligence

Slay the Evil of Bad Data in Essbase with ODI
Cameron Lackpour, CL Solve
When: Monday June 25, Session 2, 10:00 am - 11:00 am
Topic: Essbase - Subtopic: Data Management

Essbase/Planning Metadata Management with ODI Across Environments
Terry Ledet,
When: Wednesday June 27, Session 15, 1:45 pm - 2:45 pm
Topic: Essbase - Subtopic: Data Management

Hand Free HFM Automation
Alex Mathew, Oil and Gas
When: Tuesday June 26, Session 6, 8:30 am - 9:30 am
Topic: Hyperion Applications - Subtopic: HFM

Beginners Guide to Oracle Data Integrator for Oracle EPM Developers
Markus Shipley, interRel Consulting
When: Wednesday June 27, Session 12, 8:30 am - 9:30 am
Topic: Essbase - Subtopic: Data Management

Moving Beyond the P&L: Essbase for Bank Planning and Technology Reporting
Evan Thayer, E*TRADE Financial
When: Monday June 25, Session 4, 2:45 pm - 3:45 pm
Topic: EPM Business Content - Subtopic: Case Studies/Panels

Isn’t that awesome?

10 sessions for ODI – where else are you going to find that many presentations with a scope as wide as shown above?  Not anywhere else is where.  And that is why Kscope is the single best place to go for all of your EPM/BI education.  See you there.