A DAC ETL load for an OBI Apps data warehouse implementation can fail due to source system data issues or improperly maintained configuration files. As much as we hope that the business users and/or the OLTP system itself remain error-free, this is not always the case, especially when you take into account human error (manual updates to key .csv configuration files) or OLTP data integrity issues (an unstable source system, multiple external feeds into the source system, or improperly coded or outdated OLTP processes). Normally, if a DAC ETL load is scheduled nightly, the earliest notice is the success/failure email the DAC sends after the fact. No corrective action can be taken until the logs are traced and the business is notified to correct the OLTP or .csv file error(s); the load most often needs to be restarted, and the business will have lost that time.
To mitigate this risk, the DAC and Informatica can also be used to create a proactive email alert that allows the business or OLTP support team to correct issues before they break the nightly ETL load. One common example of an avoidable OBI Apps ETL error is the notorious ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found. After a target table is loaded, the DAC issues CREATE UNIQUE INDEX commands to rebuild the table's unique indexes, which enforce the uniqueness of the natural key; if the load has introduced duplicate keys, the index rebuild fails with this error.
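As an illustration (the index name and columns are a sketch; the exact definition varies by table, but a typical slowly changing dimension is keyed on INTEGRATION_ID, DATASOURCE_NUM_ID, and EFFECTIVE_FROM_DT), the statement the DAC issues looks roughly like this and fails as soon as duplicate keys exist:

CREATE UNIQUE INDEX W_XXXX_D_U1 ON W_XXXX_D
  (INTEGRATION_ID, DATASOURCE_NUM_ID, EFFECTIVE_FROM_DT);
-- fails with ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found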
Step 1: Find the INTEGRATION_ID(s) that caused the UNIQUE INDEX error by running the following query, replacing W_XXXX_D with the appropriate target table from the error log:
SELECT INTEGRATION_ID, DATASOURCE_NUM_ID, EFFECTIVE_FROM_DT, COUNT(*)
FROM W_XXXX_D
GROUP BY INTEGRATION_ID, DATASOURCE_NUM_ID, EFFECTIVE_FROM_DT
HAVING COUNT(*) > 1;
Step 2: Find the Source Qualifier (SQ) extract SQL used by the mapping/mapplet that populates the target table (make sure to check for any SQ overrides at the session level). For example, for a PeopleSoft source the relevant SQ might belong to the mapplet C_mplt_BC_EmployeeDimension_Biography1. Apply the same concept as in Step 1 to identify the exact source records causing the downstream UNIQUE INDEX error, concatenating the fields that together form the INTEGRATION_ID downstream, as shown in the sketch below.
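A minimal sketch of such a query follows. The table PS_PERSONAL_DATA and the columns EMPLID and EFFDT are hypothetical placeholders; substitute the actual FROM/JOIN clause and key columns from your SQ SQL. OBI Apps typically builds INTEGRATION_ID by concatenating the natural key columns with a '~' separator:

SELECT EMPLID || '~' || TO_CHAR(EFFDT, 'YYYY-MM-DD') AS SRC_INTEGRATION_ID, COUNT(*)
FROM PS_PERSONAL_DATA
GROUP BY EMPLID || '~' || TO_CHAR(EFFDT, 'YYYY-MM-DD')
HAVING COUNT(*) > 1;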
Using the above method should allow you to pinpoint the .csv or OLTP data anomaly that is causing the downstream unique index constraint violation.
Step 3: Create a simple Informatica mapping/session that runs the query formulated above against the data source (a .csv file or a set of OLTP tables), depending on the origin of the data issue. As a target, it is best to create a .csv file in the SrcFiles directory that is already defined on the INFA/DAC server; the mapping should populate this .csv file with the problem records.
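For instance, if the Integration Service variable $PMSourceFileDir already points to the standard SrcFiles directory, the flat-file target could be configured as follows (the file name is a hypothetical example):

Output file directory: $PMSourceFileDir
Output filename:       employee_dim_dup_alert.csv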
Step 4: Create a session based on the mapping in Step 3 that populates the .csv target with the problematic rows and conditionally sends an email to the appropriate business users or OLTP system administrators, with the problem records attached.
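One way to make the email conditional is a sketch using standard PowerCenter workflow features (the session name, file path, and recipients are hypothetical): in the Workflow Manager, add a link condition after the session so the Email task fires only when duplicate rows were actually written, and use the built-in %a email variable to attach the target file:

Link condition:  $s_m_DupAlert.TgtSuccessRows > 0
Email text:      Duplicate source records detected; see the attached file.
                 %a</infa/server/SrcFiles/employee_dim_dup_alert.csv>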
Step 5: Create the associated DAC components (tables, tasks, execution plan, etc.) and schedule this as a separate execution plan run at the same frequency as the actual load. Preferably, the alert should go to the business users who maintain the configuration file or to the OLTP system contacts who can correct the detected source system issue.
The above approach can be useful for OBI Apps implementations that suffer from source system data and/or configuration file errors. It proactively notifies the business to take corrective action on data anomalies that would otherwise cause execution plan failures due to the ORA-01452 UNIQUE INDEX error.