Tuesday, 6 January 2026

Troubleshooting Oracle Fusion HCM HDL Load Failures Using ESS Logs – A Step-by-Step Guide

 

Troubleshooting HDL Load Failures Using ESS Logs

Description

Learn how to troubleshoot Oracle Fusion HCM HDL load failures using ESS logs. This guide explains ESS job flow, key log files, common errors, and practical debugging techniques for Fusion HCM technical consultants.


Introduction

HCM Data Loader (HDL) issues are common in Oracle Fusion HCM projects, especially during data migration, integrations, and post–go-live support. Most HDL failures can be diagnosed effectively by understanding ESS jobs and log files.

This blog provides a practical, step-by-step approach to troubleshooting HDL load failures using ESS logs, based on real implementation and support experience.


Understanding the HDL ESS Job Flow

When an HDL load is submitted, Fusion HCM internally runs multiple ESS jobs. Understanding this sequence is the first step in troubleshooting.

Typical HDL Job Flow

  1. Load HCM Data (Parent ESS Job)

  2. Import and Load Data

  3. HCM Data Loader

  4. Object-specific processing jobs (Worker, Assignment, etc.)

⚠️ Even if the parent job shows Succeeded, child jobs may still have errors.


Where to Find ESS Logs

Navigation Path

Tools → Scheduled Processes → Search → Enter Request ID

Click on the job and navigate to:

  • Log Files

  • Output Files

These files are the primary sources for HDL troubleshooting.


Key HDL Log and Output Files Explained

1️⃣ .log File

  • Technical execution details

  • Object processing sequence

  • Database-level validations

  • Best for identifying root cause

2️⃣ .out File

  • Business-friendly error messages

  • Validation and data issues

  • Often references line numbers from HDL files

3️⃣ .err File (if generated)

  • Summarized critical errors

  • Usually points to blocking failures


Step-by-Step HDL Troubleshooting Approach

Step 1: Check ESS Job Status

  • Identify whether failure occurred at parent or child job level

  • Drill down to the lowest failed child job


Step 2: Review the .out File First

Look for messages like:

  • Record rejected

  • Invalid value

  • Effective date error

Example:

No Assignment record exists from 2018-10-27

This usually indicates data or effective dating issues.


Step 3: Analyze the .log File for Root Cause

Search keywords:

  • ERROR

  • SEVERE

  • ORA-

The .log file explains why the system rejected the record.


Step 4: Validate HDL File Data

Check for the following (a sketch follows this list):

  • Correct object sequencing

  • SourceSystemOwner consistency

  • Effective start and end dates

  • Operation type (CREATE vs MERGE)
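
To illustrate these checks, here is a minimal Worker.dat sketch with a consistent SourceSystemOwner, aligned effective dates, and MERGE as the operation. The attribute lists are abbreviated, and values such as VISION, EMP_1001, WR_1001, and the legal employer name are assumed examples; always generate the current template from Data Exchange before building a real file.

COMMENT Abbreviated illustration only; identifiers and values are examples
METADATA|Worker|SourceSystemOwner|SourceSystemId|PersonNumber|StartDate|EffectiveStartDate|EffectiveEndDate
MERGE|Worker|VISION|EMP_1001|1001|2018/10/27|2018/10/27|4712/12/31
METADATA|WorkRelationship|SourceSystemOwner|SourceSystemId|PersonId(SourceSystemId)|DateStart|WorkerType|LegalEmployerName|PrimaryFlag|ActionCode
MERGE|WorkRelationship|VISION|WR_1001|EMP_1001|2018/10/27|E|XX Legal Employer|Y|HIRE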


Step 5: Correct and Reload

  • Fix only failed records where possible

  • Use MERGE to avoid duplicates

  • Re-run HDL with corrected files


Common HDL Errors and Log Indicators

❌ Missing Assignment Record

Log Indicator:

No Assignment record exists

Fix:

  • Ensure an Assignment record exists for the effective date, as shown in the sketch below
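
For example, if the .out file reports that no Assignment record exists from 2018-10-27, the Worker.dat file needs an Assignment record effective from that date. This is an abbreviated sketch; VISION, ASG_1001, and WT_1001 are assumed example identifiers.

METADATA|Assignment|SourceSystemOwner|SourceSystemId|ActionCode|EffectiveStartDate|EffectiveEndDate|EffectiveSequence|EffectiveLatestChange|WorkTermsAssignmentId(SourceSystemId)|PrimaryAssignmentFlag
MERGE|Assignment|VISION|ASG_1001|HIRE|2018/10/27|4712/12/31|1|Y|WT_1001|Y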


❌ Source System Owner Errors

Log Indicator:

SourceSystemOwner is invalid

Fix:

  • Validate lookup under HRC_SOURCE_SYSTEM_OWNER


❌ Effective Date Validation Errors

Log Indicator:

Effective start date must be on or after Period of Service start date

Fix:

  • Align Worker, PeriodOfService, and Assignment dates


❌ Duplicate Records

Log Indicator:

Duplicate record found

Fix:

  • Use MERGE instead of CREATE

  • Verify SourceSystemId uniqueness (see the sketch below)
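
As a hedged illustration of the difference, re-sending an existing record with CREATE attempts to insert it again and can trigger the duplicate error, while MERGE updates the existing record when the SourceSystemOwner/SourceSystemId key matches. The Job object, VISION, and JOB_0100 below are assumed examples with an abbreviated attribute list.

COMMENT MERGE updates the existing row for the same source key instead of inserting a duplicate
METADATA|Job|SourceSystemOwner|SourceSystemId|JobCode|SetCode|Name|EffectiveStartDate|EffectiveEndDate
MERGE|Job|VISION|JOB_0100|ANALYST|COMMON|Analyst|2018/01/01|4712/12/31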


Best Practices for HDL Troubleshooting

  • Always start with the .out file, then deep dive into .log

  • Track Request IDs for every HDL load

  • Maintain a common HDL error resolution document

  • Test fixes in lower environments

  • Avoid Full Loads for minor corrections


Pro Tips from Real Projects

  • One ESS job can generate multiple log files—check all

  • Do not trust parent job status alone

  • HDL errors are mostly data-related, not system bugs

  • Keep HDL files small for faster troubleshooting



Incremental vs Full Loads in HDL – When and Why

 

Incremental vs Full Loads in Oracle Fusion HCM HDL – When to Use Which and Why

Description

Understand the difference between Incremental and Full Loads in Oracle Fusion HCM HDL. Learn when to use each approach, with real project scenarios, best practices, and common pitfalls.


Introduction

Oracle Fusion HCM Data Loader (HDL) supports two primary data loading strategies: Incremental Loads and Full Loads. Choosing the wrong approach can lead to data corruption, missing records, or unnecessary rework.

This blog explains what incremental and full loads are, how they work in HDL, and when you should use each, based on real-world Fusion HCM implementation and support scenarios.


What Is a Full Load in HDL?

A Full Load means loading the entire dataset for an object, regardless of whether the data already exists in Fusion HCM.

Key Characteristics

  • Loads all records (existing + new)

  • Often uses CREATE or MERGE operations

  • Commonly used during initial data migration

  • Higher processing time and system impact

Example Scenarios

  • Initial employee migration during implementation

  • POD refresh followed by complete data reload

  • Rebuilding corrupted or missing data

Sample Use Case

Loading all employees, assignments, jobs, and positions into a new Fusion environment before go-live.


What Is an Incremental Load in HDL?

An Incremental Load updates or inserts only changed or new records since the last successful load.

Key Characteristics

  • Loads delta data only

  • Typically uses MERGE operation

  • Faster execution

  • Lower risk when handled correctly

Example Scenarios

  • Daily employee hires, updates, or terminations

  • Compensation changes

  • Assignment changes


Key Differences: Incremental vs Full Load

Aspect | Full Load | Incremental Load
Data scope | All records (existing + new) | Only changed or new (delta) records
Typical operation | CREATE or MERGE | MERGE
Typical usage | Initial migration, post-refresh reloads | Daily updates and ongoing integrations
Processing time and impact | Higher | Lower
Risk | Higher (unintended overwrites, duplicates) | Lower when handled correctly

When Should You Use a Full Load?

Use Full Load when:

  • Performing initial data migration

  • Reloading data after environment refresh

  • Correcting major data inconsistencies

  • Migrating large master data like jobs, grades, or positions

⚠️ Caution

Full loads can:

  • Overwrite data unintentionally

  • Impact system performance

  • Cause duplicate or effective-dating issues

Always test in a lower environment first.


When Should You Use Incremental Load?

Use Incremental Load when:

  • Making day-to-day HR updates

  • Loading new hires or assignment changes

  • Updating compensation or manager changes

  • Handling regular integrations from source systems

Best Practice

  • Track changes using Last Updated Date from source systems

  • Maintain strict SourceSystemOwner consistency (see the sketch below)
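
A sketch of what a small incremental (delta) file could look like: only the changed assignment is sent, keyed by the same SourceSystemOwner/SourceSystemId used in the original load. VISION, ASG_1001, the TRANSFER action, and the department name are assumed examples, and the attribute list is abbreviated.

COMMENT Delta file: only the changed record is included
METADATA|Assignment|SourceSystemOwner|SourceSystemId|ActionCode|EffectiveStartDate|EffectiveSequence|EffectiveLatestChange|DepartmentName
MERGE|Assignment|VISION|ASG_1001|TRANSFER|2026/01/01|1|Y|XX New Department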


Common Mistakes to Avoid

❌ Using Full Load for Daily Updates

This increases risk and processing time.

❌ Incorrect Operation Type

Using CREATE instead of MERGE can cause duplicate records.

❌ Poor Effective Date Management

Misaligned dates can lead to missing assignment or period of service errors.


Best Practices for HDL Load Strategy

  • Use Full Load only when absolutely necessary

  • Default to Incremental Loads for production support

  • Validate effective dates before every load

  • Maintain a load control and audit mechanism

  • Review HDL .log and .out files after each run


Real Project Recommendation

Implementation Phase: Full Load

Post Go-Live: Incremental Load only

Post Refresh: Full Load (selective objects)

This hybrid approach minimizes risk and ensures data integrity.


Conclusion

Understanding when to use Incremental vs Full Loads in HDL is critical for a successful Fusion HCM implementation and support model. The right strategy improves performance, reduces errors, and protects production data.

A disciplined approach to HDL loading separates an average consultant from a strong Fusion HCM Technical expert.


Monday, 5 January 2026

Error during Balance Adjustment HDL load



Error Message1 : You must submit an entry value for input value amount.
Solution: The Input Value Amount is blank; provide a value for it.

Error Message2 : A previous process that was run for this payroll relationship couldn't be completed or marked to be run again.
Solution: Check whether any process or flow has errored out for the person in Person Results.

Thursday, 1 January 2026

Error during Worker HDL load

 



Error Message1 : You must provide a primary value
Solution: Add the PrimaryWorkTermsFlag attribute to the WorkTerms METADATA line (see the sketch below).
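
A minimal sketch of a WorkTerms METADATA line with the flag added (attribute list abbreviated; VISION, WT_1001, and WR_1001 are assumed example identifiers):

METADATA|WorkTerms|SourceSystemOwner|SourceSystemId|ActionCode|EffectiveStartDate|EffectiveSequence|EffectiveLatestChange|PeriodOfServiceId(SourceSystemId)|PrimaryWorkTermsFlag
MERGE|WorkTerms|VISION|WT_1001|HIRE|2018/10/27|1|Y|WR_1001|Y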

Error Message2 : Person can have only 1 active Primary WorkRelationship 
Solution: Update the termination date on the previously terminated work relationship.

Error Message3 : SourceSystemOwner is unknown 
Solution: Update the HRC_SOURCE_SYSTEM_OWNER lookup by removing FUSION from the custom lookup code, then update the same value in the Worker (or relevant) HDL file.
Zip the updated file and load it via My Client Groups → Data Exchange → Import and Load Data, then refresh to confirm successful completion.


Error Message4 : JBO-FND:::FND_FLEX_SEG_VAL_NOT_IN_LIST: xxx is not in the list
Solution: Resolve the issue by either updating the Site Code value for the affected Position record and removing the end date, or by extending the end date of the Site Code.
After making the correction, reload the Worker HDL file to apply the changes.


Error Message5 : You must enter a valid value for the GradeId field. 
Solution: Provide the correct GradeId/GradeCode for the new Position in the DAT file, or alternatively pass #NULL (with PER_ENFORCE_VALID_GRADE = Y) or set PER_ENFORCE_VALID_GRADE = N to avoid passing grade values.
In both scenarios, run the ESS job – Synchronize Person Assignments from Position to sync grade, job, and other position-related fields.
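
Two hedged examples of how the grade could be passed on the Assignment line; the attribute list is abbreviated, and ASG_CHANGE, GR_05, and the identifiers are assumed example values.

COMMENT First line passes a grade code; second line passes #NULL to clear the grade
METADATA|Assignment|SourceSystemOwner|SourceSystemId|ActionCode|EffectiveStartDate|EffectiveSequence|EffectiveLatestChange|GradeCode
MERGE|Assignment|VISION|ASG_1001|ASG_CHANGE|2026/01/01|1|Y|GR_05
MERGE|Assignment|VISION|ASG_1002|ASG_CHANGE|2026/01/01|1|Y|#NULL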

Error Message6 : The values xxxx aren't valid for the attribute LegislationCode. 
Solution: Update the existing Legal Entity with a unique Registration Number via Setup & Maintenance → Legal Structures → Manage Legal Reporting Unit Registrations, then save the changes.
After the update, reload the Worker HDL file to complete the process.

Error Message7 : METADATA line for the {BO_NAME} business object is invalid (when attempting to load a Worker HDL file to add Person Extra Information)
Solution: Refresh the Worker HDL object, download the latest template, and update the Worker HDL file with the required PEI EFF, EFF_CATEGORY_CODE, and FLEX attributes by following the Oracle documentation for loading extensible flexfields.
Save, zip, and reload the file via My Client Groups → Data Exchange → Import and Load Data to confirm the error is resolved.

Error Message8 : You cannot update this record because the SourceSystemId <SOURCE_SYSTEM_ID_1> and the SourceSystemOwner ABC are invalid when attempting Assignment Supervisor. 
Solution: The user can either remove Source System ID/Owner and use surrogate or user keys for Assignment Supervisor, or align the Source System ID/Owner with Fusion-supported keys, ensuring parent/child objects do not mix different key types.
Alternatively, identify and update source key mappings via HRC_INTEGRATION_KEY_MAP using SourceKeys.dat, after which the original AssignmentSupervisor data can be loaded successfully; this approach is best suited for existing Fusion integrations.
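
The exact attribute names for SourceKeys.dat should be taken from the template generated in your own environment; the sketch below is hypothetical and only shows the general METADATA/MERGE shape of such a file, reusing the owner ABC and the source system ID placeholder from the error message. The ObjectName and SurrogateId attributes, the PersonAssignment object name, and the surrogate ID value are assumptions.

COMMENT Hypothetical sketch only; generate the actual SourceKey template from Data Exchange for the real attribute names
METADATA|SourceKey|ObjectName|SurrogateId|SourceSystemOwner|SourceSystemId
MERGE|SourceKey|PersonAssignment|300000123456789|ABC|<SOURCE_SYSTEM_ID_1>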

Error Message9 : You must provide only one parent record Worker and it must start on the earliest effective start date and not have an end date. 
Solution: Pass the same date for both StartDate and EffectiveStartDate, ensuring it matches the first effective start date from PER_ALL_PEOPLE_F for the person.
If only creating a new work relationship or assignment, remove the METADATA Worker and MERGE Worker records from the data file.
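
A hedged sketch of a data file that creates only a new work relationship (no Worker METADATA or MERGE lines); the attribute list is abbreviated, and VISION, WR_2001, EMP_1001, and the legal employer name are assumed examples.

COMMENT No Worker component lines when only a new work relationship is being created
METADATA|WorkRelationship|SourceSystemOwner|SourceSystemId|PersonId(SourceSystemId)|DateStart|WorkerType|LegalEmployerName|PrimaryFlag|ActionCode
MERGE|WorkRelationship|VISION|WR_2001|EMP_1001|2026/01/01|E|XX Legal Employer|Y|HIRE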

Error Message10 : No Assignment record exists from 2018-10-27
Solution: Assignment data is missing in Worker.dat from the effective date 2018-10-27; include Assignment records covering that date and reload.



Not able to End date the Person Contact Relationship using HDL




Description : 
Attempting to end date the Person Contact Relationship using HDL.

Resolution :
You cannot end date a contact relationship using HDL, as this feature is not supported by Oracle.
In this case, you can delete the Person Contact Relationship instead.

You can't delete the contact because the person is designated as a benefit dependent or beneficiary

 


Error Message : 

You cannot delete the contact because the person is designated as a benefit dependent or beneficiary, or an attempt was made to process the benefit designation

Or

Error removing a duplicate beneficiary/contact


Solution : 

1. Check whether the dependent is added as a beneficiary; if yes, remove it.

2. Check whether the dependent is elected as a beneficiary in any plan; if yes, remove the election.

3. If the dependent is covered in any plan, void/purge the life event and run Purge Backed-Out or Voided Life Event Data for the employee.


After this, you can delete or end date the Person Contact Relationship using either HDL or the UI.


 



Wednesday, 1 October 2025

HDL Template for Benefit Participant Enrollment



For this, use the ParticipantEnrollment.dat file.


Please find the sample template below:

METADATA|ParticipantEnrollment|PersonNumber|ParticipantLastName|ParticipantFirstName|BenefitRelationship|LifeEvent|LifeEventOccuredDate|EffectiveDate

MERGE|ParticipantEnrollment|XXTEST_PER1|Adam|Baro|Default|NewHire|2016/11/05|2016/11/05

METADATA|CompensationObject|Program|OriginalEnrollmentDate|PersonNumber|LineNumber

MERGE|CompensationObject|XXTEST Benefits Program|2016/11/05|XXTEST_PER1|1

