
Thursday, December 31, 2009

Optimization of Established Planning Applications – Part 1

A common question that I am asked when first arriving at a client with established Planning applications is, "How can I improve my application's performance?" Two short-term initiatives that can often provide more robust performance for established Planning environments are the following:
1. Remove unnecessary history
2. Reorder outlines

Remove Unnecessary History

Often, an inordinate amount of history is maintained within Planning applications to facilitate year-over-year reporting. Maintaining excessive history in a Planning application creates unneeded blocks, and the greater the number of blocks, the greater the processing time for calculations. While year-over-year analysis capabilities undoubtedly have to be maintained, I often leverage the practice of creating a reporting application to facilitate Planning optimization.

Native Essbase reporting databases are developed to archive historical data. These new reporting databases are based on the existing databases within the Planning applications. All data not associated with generating future budgets or forecasts is moved from each of the Planning applications to the reporting database through partitioning, @XREF retrievals, or data exports and loads. Planning data (i.e., Budgets, Forecasts, and Plans) would then be moved into the reporting databases at scheduled intervals, preserving year-over-year analysis while keeping the Planning applications configured for optimal performance.
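For the export-and-load approach, a level-0 extract of the historical years can be produced with a DATAEXPORT calculation script. The sketch below is only illustrative; the year members (FY07, FY08), the Actual scenario, and the file path are placeholders that would need to match your own outline and server.

    SET DATAEXPORTOPTIONS
    {
      DataExportLevel "LEVEL0";      /* export stored level-0 data only */
      DataExportOverwriteFile ON;    /* replace any previous extract file */
    };
    FIX ("FY07", "FY08", "Actual")   /* hypothetical historical years and scenario */
      DATAEXPORT "File" "," "/exports/history_level0.txt" "#MI";
    ENDFIX

The resulting flat file can then be loaded into the reporting cube with a standard load rule.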

Basic steps for moving history out of Planning:

  1. Create native Essbase reporting cubes to archive historical data
  2. These new reporting cubes should be based on the existing cubes in the Planning applications
  3. Remove all data not associated with generating future budgets or forecasts from each of the Planning applications (see the sketch after this list)
  4. Load all historical data to the new reporting cubes
  5. Remove dimension members pertaining to historical data and alternate hierarchies from the Planning applications
  6. Integrate current-year data from the Planning applications to the reporting cubes
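As a hedged sketch of step 3, once the export has been validated and the history safely archived, the historical blocks can be removed from the Planning database with a FIX on the archived years and CLEARBLOCK. The year members are placeholders, and the example assumes the year dimension is sparse.

    FIX ("FY07", "FY08")   /* hypothetical historical years already archived */
      CLEARBLOCK ALL;      /* remove the blocks these years occupy */
    ENDFIX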

Reordering of Outlines

Industry-wide standards recommend that outlines be structured in an hourglass shape: from top to bottom, the dense dimension with the most stored members down to the dense dimension with the fewest stored members, followed by the sparse dimension with the fewest stored members down to the sparse dimension with the most stored members. A hypothetical ordering is sketched below.
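As an illustration only, with entirely made-up dimensions and member counts, an hourglass-ordered outline might look like this:

    Accounts  (dense,  300 stored members)   <- largest dense dimension first
    Period    (dense,   13 stored members)
    Scenario  (sparse,   3 stored members)   <- smallest sparse dimension next
    Version   (sparse,   4 stored members)
    Entity    (sparse, 200 stored members)
    Product   (sparse, 900 stored members)   <- largest sparse dimension last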

I acknowledge that this rule doesn't apply to every model; however, I do suggest that application administrators run multiple iterations of a baseline calculation, reordering the dimensions each time, to ascertain the optimal outline order for calculation processing. Reordering an outline to increase the chance of achieving a single anchoring dimension with multiple bitmaps often results in the optimal calculation performance.

Administrators can simulate calculations using SET MSG ONLY in a calculation script. A simulated calculation produces results that help you analyze the performance of a real calculation that is based on the same data and outline.

By running a simulated calculation with a command like SET NOTICE HIGH, you can mark the relative amount of time each sparse dimension takes to complete. Then, by performing a real calculation on one or more dimensions, you can estimate how long the full calculation will take, because the time a simulated calculation takes to run is proportional to the time that the actual calculation takes to run.
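A minimal simulated-calculation script combining the two commands mentioned above might look like the following; the CALC ALL could equally be the baseline calculation logic you are tuning.

    SET MSG ONLY;      /* simulate the calculation; no data is changed */
    SET NOTICE HIGH;   /* write completion notices for each sparse dimension */
    CALC ALL;          /* the baseline calculation being timed */

The notices are written to the application log; comparing them across outline-order iterations shows which arrangement calculates fastest.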

While these steps aren't the be-all and end-all of optimization, they will help get you started.

In the next blog, we'll address leveraging multiple plan types and calculations running in top-down and serial modes.
