One of the questions I encounter most often is “How do I get my data into Planning?” The answer is: it depends.
For existing Planning users, the old standby for loading data was the Hyperion Application Link (HAL) utility. However, HAL has been phased out and is no longer supported by Oracle for use with Planning, so customers on older versions of Planning who want to upgrade face a brand new question: what is the HAL replacement? In most cases Oracle Data Integrator is the replacement for HAL; however, there are situations where a different tool is a better fit.
For new Planning implementations the answer is a little more difficult because of the number of options available. There are multiple tools capable of loading data to Planning, each with its own pros and cons. The options are:
- Oracle Data Integrator (ODI)
- Data Integration Manager (DIM)
- Financial Data Quality Management (FDM)
- Enterprise Performance Management Architect (EPMA)
- Spreadsheet lock and send
I have worked with each of the tools listed and found a lot of overlap between them, so I have provided a brief description of each tool along with the strengths and weaknesses I have found.
Oracle Data Integrator (ODI)
Oracle Data Integrator Enterprise Edition is a comprehensive data integration platform that covers all data integration requirements: from high-volume, high-performance batches, to event-driven, trickle-feed integration processes, to SOA-enabled data services.
Strengths
- High-performance E-LT architecture
- Does not require new servers/hardware; scales with existing source and target hardware
- Bundled with Hyperion Planning - No Extra Cost
- Oracle Strategic Tool for Data Integration
- Works with data and metadata
Weaknesses
- Usually IT managed, though a web application enables business users to execute routines as needed
- Hyperion knowledge modules are still at a 1.0 release
- Not fully integrated with all Hyperion Products
- Smaller pool of experienced resources to draw from
Data Integration Manager (DIM)
Strengths
- Full scale ETL tool
- Flexible and Powerful
- Time-proven track record for ETL excellence
- Based on Informatica – lots of experience in the marketplace
- Works with data and metadata
Weaknesses
- IT Managed
- Additional Licensing Required
- Additional hardware required (unless PowerCenter already exists)
Financial Data Quality Management (FDM)
Strengths
- Provides Business Mapping Layer
- Provides Drill back to transactional detail
- Provides drill through to select ERP systems
- Business Managed
- Auditable
Weaknesses
- Does not handle complex transformations
- Limited Functionality
- Very limited source and target capabilities
- Limited Automation capabilities
- Additional Licensing Required
- Only works with data
Enterprise Performance Management Architect (EPMA)
Strengths
- Manages meta data
- Allows for data sharing between applications
Weaknesses
- Not a direct connection to source systems – still requires a separate data process to pull data from the source
- Very limited source and target capabilities
- Limited automation capabilities
So the big question is: which tool do I use? As I mentioned earlier, it depends. Each tool will load data to Planning, but each has its strengths and weaknesses. When determining which tool is the right one, I consider the questions below.
- What licensing is already in place?
- What are the data sources – flat files or direct database connections?
- What level of automation does the load process require?
- What are the data mapping requirements?
- Does the load process require auditing capabilities?
- Who will own the data load process – IT or the business analysts?
These questions will give you a start in selecting the right tool. Each tool has its own set of integration options, so I have summarized the tools and their data load options in the chart below to help determine which tool is the right one to use.
I usually lead with Oracle Data Integrator in new environment architectures, and in most cases use ODI as the tool for loading data into Planning. However, FDM fits well in environments where there is less IT involvement and the business will manage the entire load process. FDM is also the only tool that provides drill-through from Planning to transactional detail or to the ERP system. DIM is a great alternative if you have in-house Informatica expertise and don’t mind the extra licensing costs to enable it.
Comments
Dennis, excellent information. Thank you for sharing!
Can I know the process of loading data into Planning using ODI or the DIM tool? It’s urgent – can you reply quickly?
There are two ways to load data to Planning through ODI. The first is to load data directly to the underlying Planning Essbase application using the Hyperion Essbase data load knowledge modules; I have found this to be the most efficient way to load data via ODI or DIM. By default, the out-of-the-box ODI knowledge modules will overwrite any intersection for which the incoming data set has data. If you need to append to existing data in Planning, you will need to use an Essbase load rule in conjunction with the ODI knowledge modules.
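One practical consequence of the overwrite behavior: if the source extract contains more than one row for the same intersection, only the last row survives the load. Below is a minimal Python sketch of pre-summing duplicate intersections before the load – the file names and layout are hypothetical, and it assumes a headerless CSV extract where the last column is the data value:

```python
# Minimal sketch (hypothetical file names): sum duplicate intersections in the
# source extract before loading, since the out-of-the-box knowledge module
# overwrites an intersection with the last value it sees rather than adding.
import csv
from collections import defaultdict

totals = defaultdict(float)
with open('plan_data.csv') as src:              # assumed extract, no header row
    for row in csv.reader(src):
        *intersection, value = row              # leading columns identify the intersection
        totals[tuple(intersection)] += float(value)

with open('plan_data_summed.csv', 'w', newline='') as out:
    writer = csv.writer(out)
    for intersection, value in sorted(totals.items()):
        writer.writerow(list(intersection) + [value])
```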
The other way is to use the ODI Hyperion Planning knowledge modules. I would use this approach when loading data that needs to come in via a Planning smartlist or as textual information. I have found that loading via the Planning knowledge modules is not as efficient as loading directly to the Planning Essbase application.
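As a rough illustration only, a source record for a Planning data load carries the load-dimension member, any driver member columns, a point of view, and the data load cube name. The column names and members below are hypothetical; the actual layout is defined by what you map in the ODI interface:

```
Account, Point-of-View, Data Load Cube Name, Budget, Justification
Travel, "Jan, FY24, Working, Dept100", Plan1, 15000, "Conference travel"
```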
Loading via DIM works much the same way, using the DIM adapters.
Note that when using the ODI knowledge modules to load data, if you run into errors such as a member name not found in Essbase, performance will suffer because of the way the knowledge modules handle errors: ODI drops to a row-by-row commit if an error is found anywhere in the commit batch. To help minimize the performance lag, you can tune knowledge module properties such as the commit interval and the number of errors allowed. I also highly suggest building error trapping into the package to catch bad records before trying to load the data.
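As an example of the kind of error trapping I mean, a package step along these lines can fail the load loudly instead of letting it crawl through row-by-row commits. This is a minimal Jython/Python sketch; the log path is hypothetical and should match the error log property you set on the knowledge module:

```python
# Jython/Python sketch (hypothetical path): fail the package step when the
# knowledge module's error log contains rejected rows, so bad records are
# caught up front rather than degrading the load.
import os

error_log = '/odi/logs/planning_load.err'   # should match the KM's error log setting

if os.path.exists(error_log) and os.path.getsize(error_log) > 0:
    raise Exception('Rejected rows during Essbase load; see ' + error_log)
```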