Wednesday, 7 January 2026



Error While Loading PersonalPaymentMethod Using HCM Data Loader

Introduction

In Oracle Fusion HCM, HCM Data Loader (HDL) is commonly used to load payroll-related data such as Personal Payment Methods. However, while loading PersonalPaymentMethod.dat, you may encounter validation errors related to bank account resolution.

One of the most common errors is:

HRC-1035539 – The values {attributes} are not valid for the attribute BankAccountId

This blog explains why this error occurs, the root causes, and multiple solutions to fix it using real project scenarios.


Cause Analysis

Cause 1: Missing BankAccountType

Consider a scenario where two bank accounts exist for the same person:

  • Bank Account 1 → No BankAccountType

  • Bank Account 2 → Has BankAccountType

In this case:

  • If you refer to Bank Account 1, HDL can resolve it without BankAccountType

  • If you refer to Bank Account 2, BankAccountType becomes mandatory

❗ If the existing ExternalBankAccount has a value for BankAccountType, it must be provided in PersonalPaymentMethod.dat.


Cause 2: Invalid Bank Account Attribute Combination

The system is unable to identify a valid BankAccountId because the values provided for:

  • BankName

  • BankCountryCode

  • BankAccountNumber

  • BankAccountType

  • BankBranchName

  • BankBranchNumber

do not match any existing External Bank Account in Oracle Fusion.


Solution 1: Provide BankAccountType When Required

If the External Bank Account already has a BankAccountType, then it must be passed in PersonalPaymentMethod.dat.

Best Practice

✔ Always provide BankAccountType when multiple bank accounts exist
✔ Ensure the value exactly matches what exists in Fusion

This ensures unique resolution of ExternalBankAccount.
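For illustration, a PersonalPaymentMethod.dat line that supplies BankAccountType alongside the other bank attributes might look like the sketch below. The attribute names and sample values are illustrative, not a complete definition; verify the exact attribute list against the HDL business-object reference for your release.

METADATA|PersonalPaymentMethod|EffectiveStartDate|LegislativeDataGroupName|PayrollRelationshipNumber|PersonalPaymentMethodCode|OrganizationPaymentMethodCode|ProcessingOrder|BankName|BankBranchName|BankAccountNumber|BankCountryCode|BankAccountType
MERGE|PersonalPaymentMethod|2026/01/01|US Legislative Data Group|955160008191234|SAVINGS_PPM|Direct Deposit|1|Chase|New York Main|123456789|US|SAVINGS

Here BankAccountType carries the same value (SAVINGS) that exists on the ExternalBankAccount, allowing HDL to resolve a unique BankAccountId.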


Solution 2: Validate Bank Account Details in Fusion

Ensure that the following attributes in PersonalPaymentMethod.dat match an existing bank account:

  • Bank Name

  • Bank Country Code

  • Bank Account Number

  • Bank Account Type

  • Bank Branch Name

  • Bank Branch Number

If any of these values are incorrect, Fusion will fail to resolve the BankAccountId.


Query to Validate Bank Account Data

Use the SQL query below to verify whether the bank account exists and matches the provided details:

SELECT ppr.payroll_relationship_id payrollrelationshipid,
       ppr.person_id personid,
       h.party_id partyid,
       eba.bank_account_id bankaccountid,
       eba.bank_account_num bankaccountnumber,
       eba.bank_id bankid,
       eba.bank_name bankname,
       eba.bank_number banknumber,
       eba.branch_number branchnumber,
       eba.branch_id branchid,
       eba.bank_branch_name branchname,
       eba.eft_swift_code eftswiftcode,
       eba.bank_home_country homecountry,
       eba.bank_account_type bankaccounttype
FROM   pay_bank_accounts eba,
       iby_account_owners ebao,
       hz_parties h,
       pay_pay_relationships_dn ppr,
       per_persons p,
       hz_orig_sys_references hosp
WHERE  eba.bank_account_id = ebao.ext_bank_account_id
AND    ebao.account_owner_party_id = h.party_id
AND    hosp.owner_table_id = h.party_id
AND    hosp.orig_system_reference = TO_CHAR(ppr.person_id)
AND    hosp.owner_table_name = 'HZ_PARTIES'
AND    hosp.orig_system = 'FUSION_HCM'
AND    ppr.person_id = p.person_id
AND    h.status = 'A'
AND    eba.bank_account_num = <enter bank account num>
AND    eba.bank_name = <enter Bank Name>
AND    eba.bank_branch_name = <bank branch name>;

🔎 Replace data with values from your PersonalPaymentMethod.dat file.


Final Root Causes Summary

Cause 1

  • BankAccountType not provided even though it exists in the system

Cause 2

  • No matching BankAccountId exists for the provided bank details

Conclusion

The error occurs when Oracle Fusion HCM cannot uniquely identify the External Bank Account for a Personal Payment Method. By ensuring accurate bank details and providing BankAccountType when required, this issue can be resolved effectively.

This approach is widely used in payroll implementations and data migration projects.

Tuesday, 6 January 2026

HDL Element Entry Error JBO-27014: Attribute RelationshipId Required

 

Resolving JBO-27014: Attribute RelationshipId in ElementEntryDEO Required Error in HDL

Description

Learn how to fix the HDL error 'JBO-27014: Attribute RelationshipId in ElementEntryDEO is required' in Oracle Fusion HCM. Step-by-step solution includes work relationship cancellation, Source Key updates, and historical Element Entry reload.


Introduction

During Element Entry record loads using HCM Data Loader (HDL) in Oracle Fusion HCM (version 11.13.22.10.0), you may encounter the following error:

An error occurred. To review details of the error run the HCM Data Loader Error Analysis Report diagnostic test.

Message details: JBO-27014: Attribute RelationshipId in ElementEntryDEO is required

This error occurs because existing Element Entry records reference Source Keys associated with Payroll Relationships that start before Work Relationship and Assignment records, causing conflicts in date-effective loads.

This blog provides a step-by-step solution, SQL queries for verification, and best practices for handling historical Element Entry records.


Cause of the Error

  • Existing Element Entry records on Fusion HCM use the same Source Keys as the new HDL file.

  • Payroll Relationship start dates are earlier than the Work Relationship and Assignment records.

  • The Attribute RelationshipId is mandatory in ElementEntryDEO to link Element Entries with valid employment records.

Conflicts arise when the existing date-effective records overlap or reference the same keys.


Steps to Reproduce the Error

  1. Attempt to load new Element Entry records via HDL.

  2. Observe the JBO-27014 error.


Step-by-Step Solution

1. Extract Existing Employment Details

If no HDL integration exists with a third-party system (e.g., PeopleSoft):

  • Navigate to Run Diagnostics Tests via the login icon.

  • Add the role Application Diagnostics Administrator if the link is unavailable.

  • Run the Worker Data Extract diagnostic test:

    • Input parameters: Person Number, Exclude Highly Restricted Columns = false

    • Execute and download the WorkerDataExtract.zip file

If an HDL integration exists, update the Hire or Start Date in the third-party system to match the Payroll Relationship's Start Date.

2. Cancel the Work Relationship

  • Option A – Source Keys

SET PURGE_FUTURE_CHANGES N
METADATA|WorkRelationship|DateStart|LegalEmployerName|CancelWorkRelationshipFlag|SourceSystemOwner|SourceSystemId|PersonId(SourceSystemId)
MERGE|WorkRelationship|<DateStart>|<LegalEmployerName>|Y|<SourceSystemOwner>|<SourceSystemId>|<PersonId(SourceSystemId)>
  • Option B – User Keys

SET PURGE_FUTURE_CHANGES N
METADATA|WorkRelationship|DateStart|PersonNumber|LegalEmployerName|PeriodType|CancelWorkRelationshipFlag
MERGE|WorkRelationship|<DateStart>|<PersonNumber>|<LegalEmployerName>|<PeriodType>|Y

3. Reload or Recreate Work Relationship, Work Terms, and Assignments

  • If HDL integration exists, submit historical Worker, WorkTerms, Assignment and child Assignment objects in HDL.

  • If no integration, use Worker Data Extract or manual UI updates.

  • Use User Keys for object references; surrogate keys may be used for attributes (Business Unit ID, Organization ID, Job ID).

4. Reload the Original Element Entry HDL File

  • Update Source Keys for existing Element Entry records to avoid conflicts.

  • SQL queries for parent and child records:

    • Parent Element Entry

select hikm.GUID EE_GUID, hikm.SOURCE_SYSTEM_ID EE_SSID, hikm.SOURCE_SYSTEM_OWNER EE_SSO,
peef.ELEMENT_ENTRY_ID EE_ID, TO_CHAR(peef.EFFECTIVE_START_DATE,'YYYY-MM-DD') EE_ESD,
TO_CHAR(peef.EFFECTIVE_END_DATE,'YYYY-MM-DD') EE_EED, peef.ENTRY_TYPE ENTRY_TYPE
from fusion.HRC_INTEGRATION_KEY_MAP hikm, fusion.PAY_ELEMENT_ENTRIES_F peef
where hikm.OBJECT_NAME = 'ElementEntry'
and hikm.SURROGATE_ID = peef.ELEMENT_ENTRY_ID;
  • Child Element Entry Value

select hikm.GUID EEV_GUID, hikm.SOURCE_SYSTEM_ID EEV_SSID, hikm.SOURCE_SYSTEM_OWNER EEV_SSO,
peevf.ELEMENT_ENTRY_VALUE_ID EE_VAL_ID, peevf.ELEMENT_ENTRY_ID EE_ID,
TO_CHAR(peevf.EFFECTIVE_START_DATE,'YYYY-MM-DD') EE_ESD,
TO_CHAR(peevf.EFFECTIVE_END_DATE,'YYYY-MM-DD') EE_EED
from fusion.HRC_INTEGRATION_KEY_MAP hikm,
fusion.PAY_ELEMENT_ENTRY_VALUES_F peevf
where hikm.OBJECT_NAME = 'ElementEntryValue'
and hikm.SURROGATE_ID = peevf.ELEMENT_ENTRY_VALUE_ID;
  • Use these queries to generate a SourceKey.dat HDL file:

METADATA|SourceKey|FusionGUID|NewSourceSystemOwner|NewSourceSystemId
MERGE|SourceKey|<FusionGUID1>|<SourceSystemOwner>|<SourceSystemId1DONOTUSE>
...
  • Reload original Element Entry HDL file, referencing updated Source Keys.

  • Include historical date-effective records to maintain continuity.
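As a sketch, the subsequent Element Entry reload referencing the updated Source Keys could take the following shape. The attribute names and values are illustrative; check the ElementEntry business-object reference for your release.

METADATA|ElementEntry|SourceSystemOwner|SourceSystemId|AssignmentNumber|ElementName|LegislativeDataGroupName|EffectiveStartDate|EffectiveEndDate|EntryType
MERGE|ElementEntry|LEGACY|EE_NEW_0001|E1001|Car Allowance|US Legislative Data Group|2020/01/01|2020/12/31|E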


Verification Queries

  • Work Relationship

select papf.PERSON_NUMBER PER_NUM, TO_CHAR(ppos.DATE_START,'YYYY-MM-DD') DATE_START,
ppos.PERIOD_TYPE POS_TYPE, houftl.NAME LE_NAME
from fusion.PER_ALL_PEOPLE_F papf, fusion.PER_PERIODS_OF_SERVICE ppos,
fusion.HR_ORGANIZATION_UNITS_F_TL houftl
where papf.PERSON_NUMBER = '<PersonNumber>'
and ppos.PERSON_ID = papf.PERSON_ID
and houftl.ORGANIZATION_ID = ppos.LEGAL_ENTITY_ID
and houftl.LANGUAGE = USERENV('LANG')
and trunc(sysdate) between houftl.EFFECTIVE_START_DATE and houftl.EFFECTIVE_END_DATE
and trunc(sysdate) between papf.EFFECTIVE_START_DATE and papf.EFFECTIVE_END_DATE;
  • Payroll Relationship

select papf.PERSON_NUMBER PER_NUM, TO_CHAR(pprdn.START_DATE,'YYYY-MM-DD') START_DATE,
pprdn.PAYROLL_RELATIONSHIP_NUMBER PR_NUMBER, prttl.RELATIONSHIP_TYPE_NAME REL_NAME
from fusion.PER_ALL_PEOPLE_F papf, fusion.PAY_PAY_RELATIONSHIPS_DN pprdn,
fusion.PAY_RELATIONSHIP_TYPES_TL prttl
where papf.PERSON_NUMBER = '<PersonNumber>'
and pprdn.PERSON_ID = papf.PERSON_ID
and prttl.RELATIONSHIP_TYPE_ID = pprdn.RELATIONSHIP_TYPE_ID
and prttl.LANGUAGE = USERENV('LANG')
and trunc(sysdate) between papf.EFFECTIVE_START_DATE and papf.EFFECTIVE_END_DATE;

Ensure Work Relationship Period Type matches Payroll Relationship Type Name.


Best Practices

  • Maintain consistent Source Keys for Element Entry records.

  • Align Hire/Start Dates between Payroll Relationship and Work Relationship.

  • Use User Keys wherever possible; surrogate keys only for attributes.

  • Run diagnostic Worker Data Extracts before HDL reloads.

  • Test in lower environments before production loads.



HDL Benefits Enrollment Error: Payroll Relationship or Assignment Isn't Eligible

 

Resolving Payroll Relationship Eligibility Error in HDL Benefits Enrollment

Description

Learn how to fix the HDL error 'The payroll relationship or assignment isn't eligible for the element' in Oracle Fusion HCM Benefits Enrollment. Understand causes, steps to create new elements, and best practices for successful enrollments.


Introduction

When loading participant files or performing enrollments through HDL in Oracle Fusion HCM, you may encounter the following error:

Error occurred processing election information.

ORA-20001: The payroll relationship or assignment isn't eligible for the element XYZ on the date YYYY-MM-DD.

This error occurs because the employment record for the employee does not meet the eligibility criteria defined for the benefits element. It often happens when elements are incorrectly configured at the Assignment level instead of the Payroll Relationship level.

This blog provides a detailed explanation of the cause and a step-by-step solution to resolve this error.


Cause of the Error

The root cause is related to the element setup for benefits:

  • Elements have been configured with Employment Level = Assignment Level.

  • Benefits elements must always use Employment Level = Payroll Relationship.

  • Once an element is created, the Employment Level cannot be changed.

Incorrect configuration prevents the employee's payroll relationship from being eligible for the element, causing the HDL enrollment error.


Steps to Reproduce the Issue

  1. Navigate to Benefits Administration > Enrollment.

  2. Search for and select an employee.

  3. Change the Effective As-of Date.

  4. Click the Enroll button and make a selection.

  5. The system throws the eligibility error.

This reproduction confirms the setup problem is at the element configuration level.


Step-by-Step Solution

1. Create New Elements for Benefits

  • Create new benefit elements with Employment Level = Payroll Relationship.

  • Use consistent naming conventions to differentiate from old elements.

2. Verify Employment Level

  • Ensure that Employment Level for new elements is correctly set to Payroll Relationship.

  • This is critical because the value cannot be changed once the element is created.

3. Replace Incorrect Elements

  • Identify all activity rates or dependent records using the old elements.

  • Replace them with the newly created elements to maintain correct configuration.

4. Retest the Enrollment

  • Perform the enrollment again in the system.

  • Confirm that the ORA-20001 error no longer appears.

5. Migrate to Other Environments

  • Once validated in the development environment, migrate new elements and updates to QA and production environments using HDL or migration tools.


Best Practices

  • Always set Employment Level = Payroll Relationship when creating new benefits elements.

  • Validate eligibility rules before using elements in HDL loads.

  • Maintain a list of incorrectly configured elements for cleanup in legacy systems.

  • Use incremental testing for HDL participant file loads to catch errors early.

  • Document element creation steps for repeatable migration.


Resolving payroll eligibility error in HDL Flow







"The Party Identified by the Specified Party ID Does Not Exist" Error in HDL

 

Resolving "The Party Identified by the Specified Party ID Does Not Exist" Error in HDL

Description

Learn why the "The party identified by the specified party ID does not exist" error occurs in Oracle Fusion HCM HDL when loading Personal Payment Methods, and how to fix it using Party and Location maintenance and Person synchronization.


Introduction

While loading Personal Payment Methods in Oracle Fusion HCM using HCM Data Loader (HDL), you may encounter the following error:

"The party identified by the specified party ID does not exist."

This error usually occurs for workers whose Party record has not been created. Without a valid Party ID, HDL cannot link the Personal Payment Method to the worker.

This blog explains the cause of this error and provides a step-by-step solution to resolve it efficiently.


Reason for the Error

The primary reason for this error is:

  • Party ID not created for the worker: In Fusion HCM, every worker requires a Party record to store personal information. If the Party record is missing, HDL cannot load dependent objects like Personal Payment Methods.

Other potential causes include:

  • Missing or incomplete Location current records

  • Delayed or incomplete Person synchronization processes

Note: Party IDs are crucial for linking workers to all personal data in Fusion HCM.


Step-by-Step Solution

To resolve this issue, follow these two scheduled processes in Fusion HCM:

1. Maintain Party and Location Current Record Information

This process ensures that all workers have valid Party and Location current records.

Steps:

  1. Navigate to Scheduled Processes in Fusion HCM.

  2. Search for Maintain Party and Location Current Record Information.

  3. Submit the process for affected workers.

  4. Verify the process completes successfully.

Tip: Run this process for all workers missing Party records to avoid partial issues.

2. Synchronize Person Records

After maintaining Party and Location records, synchronize Person records to update the Party IDs for workers.

Steps:

  1. Navigate to Scheduled Processes.

  2. Search for Synchronize Person Records.

  3. Submit the process for all affected workers.

  4. Verify successful completion and check that Party IDs are now created.


Post-Process Verification

After completing both scheduled processes:

  • Retry loading the Personal Payment Method HDL file.

  • Check ESS request logs to confirm the error is resolved.

  • Validate that Personal Payment Methods are correctly assigned in Person Management → Payments.

The error should no longer appear if Party IDs exist for all workers.
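To double-check that a Party record now exists for a worker, a query along the following lines can be used. It reuses the HZ_ORIG_SYS_REFERENCES mapping pattern common in Fusion HCM; treat it as a sketch and adjust table and column names to your release.

select p.person_id, h.party_id, h.party_name, h.status
from per_persons p, hz_orig_sys_references hosp, hz_parties h
where hosp.orig_system = 'FUSION_HCM'
and hosp.owner_table_name = 'HZ_PARTIES'
and hosp.orig_system_reference = TO_CHAR(p.person_id)
and hosp.owner_table_id = h.party_id
and p.person_id = <enter person id>;

A worker for whom no row is returned (or whose h.status is not 'A') is a candidate for the two scheduled processes above.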


Best Practices

  • Always pre-check Party and Location records before loading dependent objects in HDL.

  • Schedule Person synchronization regularly in mass load scenarios.

  • Validate HDL files with a small test load before full-scale processing.

  • Maintain a list of workers who have missing Party IDs to proactively fix issues.


Conclusion

The "The party identified by the specified party ID does not exist" error occurs because workers lack Party records. Running the scheduled processes Maintain Party and Location Current Record Information and Synchronize Person Records resolves this issue and ensures successful Personal Payment Method loads in HDL.

By following these steps and best practices, Fusion HCM consultants can avoid this common HDL error and maintain clean, accurate payroll and payment data.



HCM Conversion Strategy in Oracle Fusion HCM – Best Practices for Successful Data Migration

 

Conversion Strategy for HCM Conversions in Oracle Fusion HCM

Description

Learn the best practices and strategies for successful HCM conversions in Oracle Fusion HCM. This blog covers HDL, data cleansing, mapping, testing, and migration techniques for seamless employee data transition.


Introduction

Data conversion is a critical part of Oracle Fusion HCM implementation. Accurate and efficient conversion ensures that historical and current employee data is migrated seamlessly from legacy systems to Fusion HCM. Poorly executed conversions lead to errors, failed HDL loads, and downstream payroll or reporting issues.

This blog provides a comprehensive HCM conversion strategy, including HDL usage, data cleansing, mapping, testing, and best practices.


Step 1: Data Assessment and Inventory

Before starting a conversion, perform a detailed data assessment:

  • Identify all legacy HR objects (Employee, Assignment, Compensation, Payroll, etc.)

  • Determine data quality and completeness

  • Check for duplicate records or missing information

  • Document object relationships and dependencies

Tip: Maintain a data dictionary for each object to streamline mapping.


Step 2: Data Cleansing

Clean your legacy data to avoid errors during HDL loads:

  • Remove duplicate employees and assignments

  • Correct invalid dates (hire, termination, assignment start/end)

  • Standardize codes for Job, Position, Department, Legal Employer

  • Validate mandatory fields required by Fusion HCM

Data quality at this stage significantly reduces load errors.


Step 3: Data Mapping

Map legacy system fields to Fusion HCM objects:

  • Worker → Worker.dat

  • WorkRelationship → WorkRelationship.dat

  • WorkTerms → WorkTerms.dat

  • Assignment → Assignment.dat

  • Payroll & Compensation → respective HDL objects

Mapping Tips:

  • Maintain source system codes for reference

  • Handle value conversions (e.g., department codes, job codes)

  • Document default values for missing fields


Step 4: HDL Load Design

Design HDL files carefully for conversions:

  • Use incremental or batch loads based on data volume

  • Split files object-wise (Worker, WorkRelationship, WorkTerms, Assignment)

  • Maintain consistent SourceSystemOwner

  • Use MERGE for existing employees, CREATE for new ones

Tip: Maintain clear file naming (e.g., Worker_0001_to_5000.dat).
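As a sketch, a conversion Worker.dat using source keys and the MERGE operation could start like this. The attributes shown are a small illustrative subset; the full attribute list is in the Worker business-object reference for your release.

METADATA|Worker|SourceSystemOwner|SourceSystemId|EffectiveStartDate|PersonNumber|StartDate|DateOfBirth|ActionCode
MERGE|Worker|LEGACY|W_0001|2010/05/01|E1001|2010/05/01|1985/03/15|HIRE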


Step 5: Load Sequencing

Proper sequencing avoids dependency errors:

  1. Worker.dat → create or merge employees

  2. WorkRelationship.dat → create or end-date relationships

  3. WorkTerms.dat → create or update terms

  4. Assignment.dat → create or update assignments

  5. Payroll, Compensation, and Benefits → load after core HR data


Step 6: Validation and Testing

Test all HDL loads in lower environments:

  • Validate successful load of all employees and assignments

  • Check for duplicate records and missing dependencies

  • Verify effective dates and assignment categories

  • Test edge cases like rehires, global transfers, multiple assignments

Tip: Use sample audit reports to cross-check conversion accuracy.


Step 7: Go-Live Strategy

  • Schedule staged loads during off-peak hours

  • Maintain backup of legacy data before final load

  • Monitor ESS requests and capture logs for failures

  • Communicate with HR and Payroll teams for verification

Post go-live, perform parallel verification before decommissioning legacy systems.


Common Challenges and Solutions

Challenge → Solution

  • Data inconsistency → Pre-cleanse and validate legacy data

  • Missing dependencies → Correct sequencing of HDL objects

  • Large volume load failures → Split files, incremental loads, schedule ESS jobs off-peak

  • Incorrect mappings → Maintain mapping documents and value conversion tables

Best Practices for HCM Conversions

  • Document all steps and mappings for repeatability

  • Use incremental HDL loads for testing and full loads for go-live

  • Pre-validate and cleanse data before conversion

  • Maintain audit logs and ESS request IDs

  • Include functional and technical consultants in validation

  • Run mock conversions to simulate real data loads


HCM Conversion strategy workflow




Conclusion

A well-planned HCM conversion strategy is key to a successful Fusion HCM implementation. Focus on data quality, mapping accuracy, HDL sequencing, and thorough validation to minimize errors and ensure seamless migration.

Mastering these steps not only improves conversion success but also builds trust with clients and HR teams.

Local Transfer vs Global Transfer in Oracle Fusion HCM HDL – Technical Differences, Use Cases, and Best Practices

 

Local Transfer vs Global Transfer (HDL)


Description

Understand the difference between Local and Global Transfers in Oracle Fusion HCM HDL. Learn object-level changes, effective dating rules, HDL sequencing, and common errors with a clear technical comparison.


Introduction

Employee transfers are common in enterprise organizations, but in Oracle Fusion HCM, not all transfers are the same. From an HDL (HCM Data Loader) perspective, Local Transfers and Global Transfers behave very differently and impact different Core HR objects.

Misunderstanding this difference often leads to:

  • Assignment failures

  • WorkRelationship errors

  • Incorrect legal employer data

This blog provides a complete technical comparison of Local vs Global Transfers using HDL, including object structure, effective dating rules, HDL sequencing, and real project guidance.



High-Level Difference

Local Vs Global Transfer Comparison Flow 




Local Transfer – Technical Overview

What Is a Local Transfer?

A Local Transfer occurs when an employee moves within the same Legal Employer, such as:

  • Department change

  • Job or position change

  • Location or manager change

The WorkRelationship remains unchanged.


Local Transfer – Data Model Impact

Worker
 └─ WorkRelationship (Same LE)
     └─ WorkTerms (Same)
         └─ Assignment (Updated)

Local Transfer – HDL Approach

  • Use MERGE on Assignment

  • No need to create new WorkRelationship or WorkTerms

  • Effective date reflects the transfer date

Objects Used

  • Worker (optional)

  • Assignment (MERGE)
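A minimal Assignment.dat for a local transfer might look like the following sketch (attribute names and values are illustrative; confirm against the Assignment business-object reference for your release):

METADATA|Assignment|AssignmentNumber|EffectiveStartDate|EffectiveSequence|EffectiveLatestChange|ActionCode|DepartmentName
MERGE|Assignment|E1001-2|2026/02/01|1|Y|ASG_CHANGE|Finance Operations

Note the MERGE operation and the transfer date as EffectiveStartDate; no WorkRelationship or WorkTerms lines are needed.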


Common Local Transfer Errors

  • Incorrect effective start date

  • Using CREATE instead of MERGE

  • Overwriting historical assignment data


Global Transfer – Technical Overview

What Is a Global Transfer?

A Global Transfer occurs when an employee moves from one Legal Employer to another, often across countries.

This requires creation of a new WorkRelationship.


Global Transfer – Data Model Impact

Worker
 ├─ WorkRelationship (Old LE) → End-dated
 └─ WorkRelationship (New LE)
     └─ WorkTerms
         └─ Assignment

Global Transfer – HDL Approach

  • End-date old WorkRelationship

  • Create new WorkRelationship

  • Create new WorkTerms

  • Create new Assignment

Objects Used

  • Worker (MERGE – optional)

  • WorkRelationship (MERGE + CREATE)

  • WorkTerms (CREATE)

  • Assignment (CREATE)


Effective Dating Comparison

  • WorkRelationship → Local Transfer: No change | Global Transfer: End-date old, create new

  • WorkTerms → Local Transfer: No change | Global Transfer: Start with new WR

  • Assignment Start Date → Local Transfer: Transfer date | Global Transfer: Same as new WR

  • Historical Data → Local Transfer: Preserved | Global Transfer: Preserved

HDL Sequencing Comparison

Local Transfer

  1. Assignment.dat (MERGE)

Global Transfer

  1. WorkRelationship.dat (End-date old)

  2. WorkRelationship.dat (Create new)

  3. WorkTerms.dat (Create)

  4. Assignment.dat (Create)


How to Decide: Local vs Global Transfer

Ask these questions:

  1. Is the Legal Employer changing?

  2. Is the payroll or country changing?

  3. Is a new WorkRelationship required?

If the answer to #1 is YES, it is a Global Transfer.


Common Mistakes to Avoid

❌ Treating Global Transfer as Local Transfer
❌ Reusing Assignment across Legal Employers
❌ Missing WorkTerms during Global Transfer
❌ Incorrect effective dating across objects


Best Practices

  • Always confirm Legal Employer change first

  • Use incremental HDL loads

  • Keep separate files for each object

  • Test transfer scenarios in lower environments

  • Validate results in Person Management UI


Conclusion

Understanding the technical difference between Local and Global Transfers in HDL is essential for accurate data management in Oracle Fusion HCM. While Local Transfers are assignment-level changes, Global Transfers require precise handling of WorkRelationship, WorkTerms, and Assignment objects.

Mastering this distinction is a key skill for senior Fusion HCM Technical Consultants.



HDL for Global Transfers in Oracle Fusion HCM – Complete Technical Walkthrough

 

HDL for Global Transfers: Technical Walkthrough

Description

Learn how to perform Global Transfers using Oracle Fusion HCM HDL with the WorkRelationship data model. This technical walkthrough explains required objects, effective dating, HDL structure, common errors, and best practices.


Introduction

A Global Transfer in Oracle Fusion HCM occurs when an employee moves from one legal employer to another while retaining the same person record. From an HDL perspective, global transfers are one of the most complex Core HR transactions due to effective dating, object relationships, and sequencing rules.

With the current HCM data model, Global Transfers are handled using the WorkRelationship, WorkTerms, and Assignment objects.

This blog provides a step-by-step technical walkthrough of handling Global Transfers using HCM Data Loader (HDL) based on the correct object structure.


Global Transfer – Correct Data Model Structure

Visual Diagram – Before vs After Global Transfer

BEFORE Global Transfer

Worker (PersonNumber: E1001)
 └─ WorkRelationship (India Legal Employer)
     └─ WorkTerms
         └─ Assignment (Primary)

AFTER Global Transfer

Worker (PersonNumber: E1001)
 ├─ WorkRelationship (India Legal Employer) → End-dated
 │   └─ WorkTerms
 │       └─ Assignment (End-dated)
 │
 └─ WorkRelationship (US Legal Employer)
     └─ WorkTerms
         └─ Assignment (Primary)

This visual clearly shows that during a Global Transfer:

  • The Person record remains unchanged

  • A new WorkRelationship is created for the new Legal Employer

  • WorkTerms and Assignment are never reused across Legal Employers



Key HDL Objects Involved

To process a Global Transfer using HDL, the following objects are required:

  1. Worker

  2. WorkRelationship (End-date old)

  3. WorkRelationship (Create new)

  4. WorkTerms (Create new)

  5. Assignment (Create new)

⚠️ WorkTerms and Assignments cannot be reused across Legal Employers.


Important Pre-Requisites

Before loading HDL:

  • New Legal Employer must exist

  • Business Unit must be correctly mapped

  • Job, Position, Grade, Location must be valid for the new LE

  • SourceSystemOwner must remain consistent


Effective Dating Rules for Global Transfers

Follow these critical rules:

  • New WorkRelationship start date = Global transfer date

  • Old WorkRelationship end date = Day before new WorkRelationship start date

  • WorkTerms start date = New WorkRelationship start date

  • Assignment start date = WorkTerms start date

Violating these rules will cause HDL failures.


HDL Load Sequencing (Mandatory)

Load HDL files in this exact order:

  1. Worker.dat (MERGE – optional updates)

  2. WorkRelationship.dat (End-date old WR)

  3. WorkRelationship.dat (Create new WR)

  4. WorkTerms.dat (Create new WT)

  5. Assignment.dat (Create new Assignment)
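Putting steps 2 and 3 together, the two WorkRelationship.dat loads might be sketched as below using user keys. The attribute names and values are illustrative; validate them against the WorkRelationship business-object reference for your release.

End-date the old WorkRelationship (old Legal Employer):

METADATA|WorkRelationship|PersonNumber|DateStart|LegalEmployerName|ActualTerminationDate
MERGE|WorkRelationship|E1001|2015/06/01|India Legal Employer|2026/03/31

Create the new WorkRelationship (new Legal Employer):

METADATA|WorkRelationship|PersonNumber|DateStart|LegalEmployerName|WorkerType|PrimaryFlag
MERGE|WorkRelationship|E1001|2026/04/01|US Legal Employer|E|Y

Note the end date (31-Mar) is the day before the new start date (01-Apr), matching the effective dating rules above.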


Common HDL Errors and Fixes

❌ Assignment Exists Under Old Legal Employer

Error:

Assignment already exists

Fix:

  • Do not reuse old AssignmentNumber

  • Create a new Assignment under the new WorkRelationship


❌ Effective Date Validation Error

Error:

Effective start date must be on or after WorkRelationship start date

Fix:

  • Align WorkRelationship, WorkTerms, and Assignment start dates


❌ Missing WorkRelationship

Error:

No WorkRelationship exists

Fix:

  • Ensure new WorkRelationship load completed successfully


Post-Load Validation Checklist

After HDL load:

  • Verify old WorkRelationship is end-dated

  • Confirm new Legal Employer on WorkRelationship

  • Validate WorkTerms and Assignment details

  • Check Person Management UI


Best Practices for Global Transfers via HDL

  • Always test in lower environments

  • Use incremental loads only

  • Keep separate HDL files per object

  • Maintain clear SourceSystemId strategy

  • Capture ESS Request IDs


Real Project Scenario

Scenario: Employee transferred from India LE to US LE

Approach:

  • End-date India WorkRelationship on 31-Mar

  • Create US WorkRelationship on 01-Apr

  • Create WorkTerms and Assignment under US LE

Result: Clean Global Transfer without data corruption.


Conclusion

Global Transfers using HDL require precise control over WorkRelationship, WorkTerms, and Assignment objects, along with strict effective dating and sequencing. When executed correctly, HDL provides a reliable and auditable way to manage cross–legal employer movements.

This knowledge is a key differentiator for senior Fusion HCM Technical Consultants.


How to Roll Back a Failed HDL Load Safely in Oracle Fusion HCM – Best Practices & Scenarios

 

How to Roll Back a Failed HDL Load Safely

Description

Learn how to safely roll back a failed HDL load in Oracle Fusion HCM. This blog explains rollback strategies, corrective approaches, common mistakes, and best practices for Fusion HCM technical consultants.


Introduction

Failures during HCM Data Loader (HDL) runs are common in Oracle Fusion HCM projects. However, HDL does not provide a one-click rollback option like traditional database transactions.

This makes it critical for Fusion HCM Technical Consultants to understand safe rollback and recovery strategies after a failed HDL load.

This blog explains:

  • What rollback means in HDL

  • Safe ways to reverse or correct failed loads

  • Real project scenarios and best practices


Understanding HDL Load Behaviour

Before discussing rollback, it is important to understand how HDL works.

Key Characteristics of HDL

  • HDL commits data object by object, not as a single transaction

  • Partial data may be loaded even if the job fails

  • Parent records may succeed while child records fail

⚠️ This is why careful rollback planning is essential.


Common HDL Failure Scenarios

  • Worker created but Assignment failed

  • Assignment loaded with wrong effective date

  • Duplicate records due to incorrect operation type

  • Invalid SourceSystemOwner or lookup values

Each scenario requires a different rollback approach.


What “Rollback” Means in HDL

In Fusion HCM HDL, rollback usually means:

  • Correcting incorrect data

  • Reversing records using effective dating

  • Deleting test or invalid records (when allowed)

  • Native rollback is provided only for a limited set of objects, such as Element Entry

It rarely means a full data wipe.


Safe Rollback Strategies in HDL

1️⃣ Correct-and-Reprocess (Most Recommended)

When to Use

  • Partial load success

  • Data exists but contains errors

Approach

  • Fix incorrect HDL data

  • Use MERGE operation

  • Re-load only failed or incorrect records

Example

Correcting assignment effective start date and reloading Assignment.dat.
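As an illustrative sketch, a corrected Assignment.dat reload might look like the following. All attribute names and values here are hypothetical; always generate the METADATA line from your own downloaded HDL template, as the required attributes vary by configuration.

```
METADATA|Assignment|AssignmentNumber|EffectiveStartDate|EffectiveEndDate|ActionCode|SourceSystemOwner|SourceSystemId
MERGE|Assignment|E1001-2|2024/01/01|4712/12/31|ASG_CHANGE|HRC_SQLLOADER|ASG_E1001_2
```

Because the operation is MERGE, reloading only the corrected rows updates the existing records instead of creating duplicates.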


2️⃣ End-Date Incorrect Records

When to Use

  • Data loaded with wrong effective dates

  • Historical correction required

Approach

  • End-date incorrect records

  • Reload correct data with new effective dates

This approach maintains audit history.
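For example, an incorrectly dated element entry could be closed off by supplying an EffectiveEndDate. The object, attributes, and values below are purely illustrative; check your own downloaded template for the exact attribute set.

```
METADATA|ElementEntry|AssignmentNumber|ElementName|EffectiveStartDate|EffectiveEndDate|SourceSystemOwner|SourceSystemId
MERGE|ElementEntry|E1001-2|Car Allowance|2024/01/01|2024/06/30|HRC_SQLLOADER|EE_E1001_CAR
```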


3️⃣ Delete Records (Use with Extreme Caution)

When to Use

  • Test data in lower environments

  • Duplicate or invalid records created

Approach

  • Use HDL DELETE operation (only where supported)

  • Validate object supports deletion

⚠️ Avoid DELETE in Production unless approved.
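Where the object supports deletion (Element Entry is one example), the DELETE operation uses the same file layout as MERGE; only the operation keyword changes. The values below are hypothetical; verify in the HDL business-object documentation that your object supports DELETE before using it.

```
METADATA|ElementEntry|AssignmentNumber|ElementName|EffectiveStartDate|SourceSystemOwner|SourceSystemId
DELETE|ElementEntry|E1001-2|Car Allowance|2024/01/01|HRC_SQLLOADER|EE_E1001_CAR
```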


4️⃣ Full Reload After Environment Refresh

When to Use

  • Massive data corruption

  • Implementation or test environments

Approach

  • Refresh environment

  • Perform controlled Full Load

Not recommended for Production environments.



HDL Rollback Decision Flow

[Diagram: HDL rollback decision flow]

What NOT to Do During Rollback

❌ Do not reload the same file blindly
❌ Do not switch CREATE to MERGE without analysis
❌ Do not perform Full Load for small issues
❌ Do not delete Production data casually


Best Practices for Safe HDL Rollback

  • Always capture Request IDs

  • Keep HDL files version-controlled

  • Use incremental loads wherever possible

  • Validate data using reports before and after reload

  • Test rollback strategy in lower PODs


Real Project Rollback Example

Scenario: Assignment load failed due to incorrect effective date

Solution:

  1. Identify failed records from .out file

  2. Correct effective dates

  3. Reload Assignment.dat using MERGE

  4. Validate using Person Management UI


Preventing Rollback Situations

  • Validate HDL files before load

  • Use smaller file batches

  • Perform dry-run validation in lower environments

  • Follow strict SourceSystemOwner governance


Conclusion

Rollback in HDL is about controlled correction, not reversal. A well-planned recovery strategy ensures data integrity while minimizing business impact.

Understanding rollback techniques is a key skill that differentiates a confident Fusion HCM Technical Consultant from an average one.


Troubleshooting Oracle Fusion HCM HDL Load Failures Using ESS Logs – A Step-by-Step Guide

 

Troubleshooting HDL Load Failures Using ESS Logs

Description

Learn how to troubleshoot Oracle Fusion HCM HDL load failures using ESS logs. This guide explains ESS job flow, key log files, common errors, and practical debugging techniques for Fusion HCM technical consultants.


Introduction

HCM Data Loader (HDL) issues are common in Oracle Fusion HCM projects, especially during data migration, integrations, and post–go-live support. Most HDL failures can be diagnosed effectively by understanding ESS jobs and log files.

This blog provides a practical, step-by-step approach to troubleshooting HDL load failures using ESS logs, based on real implementation and support experience.


Understanding the HDL ESS Job Flow

When an HDL load is submitted, Fusion HCM internally runs multiple ESS jobs. Understanding this sequence is the first step in troubleshooting.

Typical HDL Job Flow

  1. Load HCM Data (Parent ESS Job)

  2. Import and Load Data

  3. HCM Data Loader

  4. Object-specific processing jobs (Worker, Assignment, etc.)

⚠️ Even if the parent job shows Succeeded, child jobs may still have errors.


Where to Find ESS Logs

Navigation Path

Tools → Scheduled Processes → Search → Enter Request ID

Click on the job and navigate to:

  • Log Files

  • Output Files

These files are the primary sources for HDL troubleshooting.


Key HDL Log and Output Files Explained

1️⃣ .log File

  • Technical execution details

  • Object processing sequence

  • Database-level validations

  • Best for identifying root cause

2️⃣ .out File

  • Business-friendly error messages

  • Validation and data issues

  • Often references line numbers from HDL files

3️⃣ .err File (if generated)

  • Summarized critical errors

  • Usually points to blocking failures


Step-by-Step HDL Troubleshooting Approach

Step 1: Check ESS Job Status

  • Identify whether failure occurred at parent or child job level

  • Drill down to the lowest failed child job


Step 2: Review the .out File First

Look for messages like:

  • Record rejected

  • Invalid value

  • Effective date error

Example:

No Assignment record exists from 2018-10-27

This usually indicates data or effective dating issues.


Step 3: Analyze the .log File for Root Cause

Search keywords:

  • ERROR

  • SEVERE

  • ORA-

The .log file explains why the system rejected the record.


Step 4: Validate HDL File Data

Check for:

  • Correct object sequencing

  • SourceSystemOwner consistency

  • Effective start and end dates

  • Operation type (CREATE vs MERGE)


Step 5: Correct and Reload

  • Fix only failed records where possible

  • Use MERGE to avoid duplicates

  • Re-run HDL with corrected files


Common HDL Errors and Log Indicators

❌ Missing Assignment Record

Log Indicator:

No Assignment record exists

Fix:

  • Ensure Assignment exists for the effective date


❌ Source System Owner Errors

Log Indicator:

SourceSystemOwner is invalid

Fix:

  • Validate lookup under HRC_SOURCE_SYSTEM_OWNER
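The configured lookup codes can be checked with a quick query (for example, from a BI Publisher data model). This is a sketch; it assumes the standard FND_LOOKUP_VALUES view is accessible in your reporting tool.

```sql
-- List the enabled SourceSystemOwner codes configured in this environment.
SELECT lookup_code,
       meaning,
       enabled_flag
FROM   fnd_lookup_values
WHERE  lookup_type = 'HRC_SOURCE_SYSTEM_OWNER'
AND    language    = 'US'
ORDER  BY lookup_code
```

The SourceSystemOwner value in your HDL file must match one of the returned lookup codes exactly.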


❌ Effective Date Validation Errors

Log Indicator:

Effective start date must be on or after Period of Service start date

Fix:

  • Align Worker, PeriodOfService, and Assignment dates


❌ Duplicate Records

Log Indicator:

Duplicate record found

Fix:

  • Use MERGE instead of CREATE

  • Verify SourceSystemId uniqueness


Best Practices for HDL Troubleshooting

  • Always start with the .out file, then deep dive into .log

  • Track Request IDs for every HDL load

  • Maintain a common HDL error resolution document

  • Test fixes in lower environments

  • Avoid Full Loads for minor corrections


Pro Tips from Real Projects

  • One ESS job can generate multiple log files—check all

  • Do not trust parent job status alone

  • HDL errors are mostly data-related, not system bugs

  • Keep HDL files small for faster troubleshooting



Incremental vs Full Loads in HDL – When and Why

 

Incremental vs Full Loads in Oracle Fusion HCM HDL – When to Use Which and Why

Description

Understand the difference between Incremental and Full Loads in Oracle Fusion HCM HDL. Learn when to use each approach, with real project scenarios, best practices, and common pitfalls.


Introduction

Oracle Fusion HCM Data Loader (HDL) supports two primary data loading strategies: Incremental Loads and Full Loads. Choosing the wrong approach can lead to data corruption, missing records, or unnecessary rework.

This blog explains what incremental and full loads are, how they work in HDL, and when you should use each, based on real-world Fusion HCM implementation and support scenarios.


What Is a Full Load in HDL?

A Full Load means loading the entire dataset for an object, regardless of whether the data already exists in Fusion HCM.

Key Characteristics

  • Loads all records (existing + new)

  • Often uses CREATE or MERGE operations

  • Commonly used during initial data migration

  • Higher processing time and system impact

Example Scenarios

  • Initial employee migration during implementation

  • POD refresh followed by complete data reload

  • Rebuilding corrupted or missing data

Sample Use Case

Loading all employees, assignments, jobs, and positions into a new Fusion environment before go-live.


What Is an Incremental Load in HDL?

An Incremental Load updates or inserts only changed or new records since the last successful load.

Key Characteristics

  • Loads delta data only

  • Typically uses MERGE operation

  • Faster execution

  • Lower risk when handled correctly

Example Scenarios

  • Daily employee hires, updates, or terminations

  • Compensation changes

  • Assignment changes


Key Differences: Incremental vs Full Load

Aspect             | Full Load                         | Incremental Load
Data scope         | All records (existing + new)      | Changed or new records only
Typical operation  | CREATE or MERGE                   | MERGE
Processing time    | High                              | Low
Typical usage      | Initial migration, POD refresh    | Daily updates, ongoing integrations
Risk               | Higher (unintended overwrites)    | Lower when handled correctly

When Should You Use a Full Load?

Use Full Load when:

  • Performing initial data migration

  • Reloading data after environment refresh

  • Correcting major data inconsistencies

  • Migrating large master data like jobs, grades, or positions

⚠️ Caution

Full loads can:

  • Overwrite data unintentionally

  • Impact system performance

  • Cause duplicate or effective-dating issues

Always test in a lower environment first.


When Should You Use Incremental Load?

Use Incremental Load when:

  • Making day-to-day HR updates

  • Loading new hires or assignment changes

  • Updating compensation or manager changes

  • Handling regular integrations from source systems

Best Practice

  • Track changes using Last Updated Date from source systems

  • Maintain strict SourceSystemOwner consistency
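The change-tracking bullet above can be sketched as a delta extract on the source side. The staging table and column names here are hypothetical; adapt them to your actual source system.

```sql
-- Extract only rows changed since the last successful HDL load.
-- src_worker_stage and :last_successful_load_date are illustrative names.
SELECT person_number,
       first_name,
       last_name,
       last_update_date
FROM   src_worker_stage
WHERE  last_update_date > :last_successful_load_date
```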


Common Mistakes to Avoid

❌ Using Full Load for Daily Updates

This increases risk and processing time.

❌ Incorrect Operation Type

Using CREATE instead of MERGE can cause duplicate records.

❌ Poor Effective Date Management

Misaligned dates can lead to missing assignment or period of service errors.


Best Practices for HDL Load Strategy

  • Use Full Load only when absolutely necessary

  • Default to Incremental Loads for production support

  • Validate effective dates before every load

  • Maintain a load control and audit mechanism

  • Review HDL .log and .out files after each run


Real Project Recommendation

Implementation Phase: Full Load

Post Go-Live: Incremental Load only

Post Refresh: Full Load (selective objects)

This hybrid approach minimizes risk and ensures data integrity.


Conclusion

Understanding when to use Incremental vs Full Loads in HDL is critical for a successful Fusion HCM implementation and support model. The right strategy improves performance, reduces errors, and protects production data.

A disciplined approach to HDL loading separates an average consultant from a strong Fusion HCM Technical expert.


Monday, 5 January 2026

Error during Balance Adjustment HDL load

Error during Balance Adjustment HDL load


Error Message1 : You must submit an entry value for input value Amount.
Solution: The input value Amount is blank. Provide a value for it.

Error Message2 : A previous process that was run for this payroll relationship couldn't be completed or marked to be run again.
Solution: Check in Person Results whether any process or flow has errored out for the person, and resolve it before reloading.

Thursday, 1 January 2026

Error during Worker HDL load

 

Error during Worker HDL load


Error Message1 : You must provide a Primary value.
Solution: Add the PrimaryWorkTermsFlag attribute to the WorkTerms METADATA line.

Error Message2 : A person can have only one active primary work relationship.
Solution: Update the termination date on the previously terminated work relationship.

Error Message3 : SourceSystemOwner is unknown 
Solution: Update the HRC_SOURCE_SYSTEM_OWNER lookup by removing FUSION from the custom lookup code, then update the same value in the Worker (or relevant) HDL file.
Zip the updated file and load it via My Client Groups → Data Exchange → Import and Load File, and refresh to confirm successful completion.


Error Message4 : JBO-FND:::FND_FLEX_SEG_VAL_NOT_IN_LIST: xxx is not in the list
Solution: Resolve the issue by either updating the Site Code value for the affected Position record and removing the end date, or by extending the end date of the Site Code.
After making the correction, reload the Worker HDL file to apply the changes.


Error Message5 : You must enter a valid value for the GradeId field. 
Solution: Provide the correct GradeId/GradeCode for the new Position in the DAT file, or alternatively pass #NULL (with PER_ENFORCE_VALID_GRADE = Y) or set PER_ENFORCE_VALID_GRADE = N to avoid passing grade values.
In both scenarios, run the ESS job – Synchronize Person Assignment from Position to sync grade, job, and other position-related fields.

Error Message6 : The values xxxx aren't valid for the attribute LegislationCode. 
Solution: Update the existing Legal Entity with a unique Registration Number via Setup & Maintenance → Legal Structures → Manage Legal Reporting Unit Registrations, then save the changes.
After the update, reload the Worker HDL file to complete the process.

Error Message7 : The METADATA line for the {BO_NAME} business object is invalid, when attempting to load a Worker HDL file to add Person Extra Information.
Solution: Refresh the Worker HDL object, download the latest template, and update the Worker HDL file with the required PEI EFF, EFF_CATEGORY_CODE, and FLEX attributes by following the Oracle documentation for loading extensible flexfields.
Save, zip, and reload the file via My Client Groups → Data Exchange → Import and Load Data to confirm the error is resolved.

Error Message8 : You cannot update this record because the SourceSystemId <SOURCE_SYSTEM_ID_1> and the SourceSystemOwner ABC are invalid when attempting Assignment Supervisor. 
Solution: The user can either remove Source System ID/Owner and use surrogate or user keys for Assignment Supervisor, or align the Source System ID/Owner with Fusion-supported keys, ensuring parent/child objects do not mix different key types.
Alternatively, identify and update source key mappings via HRC_INTEGRATION_KEY_MAP using SourceKeys.dat, after which the original AssignmentSupervisor data can be successfully loaded—this approach is best suited for existing Fusion integrations.

Error Message9 : You must provide only one parent record Worker and it must start on the earliest effective start date and not have an end date. 
Solution: Pass the same date for both StartDate and EffectiveStartDate, ensuring it matches the first effective start date from PER_ALL_PEOPLE_F for the person.
If only creating a new work relationship or assignment, remove the METADATA Worker and MERGE Worker records from the data file.

Error Message10 : No Assignment record exists from 2018-10-27
Solution: Assignment data is missing in Worker.dat from the effective date 2018-10-27. Include Assignment records covering that date.



Not able to End date the Person Contact Relationship using HDL

Not able to End date the Person Contact Relationship using HDL



Description : 
Using HDL, you are trying to end date a Person Contact Relationship.

Resolution :
You cannot end date a contact relationship using HDL, as this feature is not supported by Oracle.
In this case, you can delete the Person Contact Relationship instead.

You can't delete the contact because the person is designated as a benefit dependent or beneficiary

 

You can't delete the contact because the person is designated as a benefit dependent or beneficiary

Error Message : 

You cannot delete the contact because the person is designated as a benefit dependent or beneficiary, or an attempt was made to process the benefit designation

Or

Error removing a duplicate beneficiary/contact


Solution : 

1. Check whether the dependent is added as a beneficiary; if yes, remove the designation.

2. Check whether the dependent is elected as a beneficiary in any plan; if yes, remove the election.

3. If the dependent is covered in any plan, void/purge the life event and run Purge Backed-Out or Voided Life Event Data for the employee.


After this, you can delete or end date the Person Contact Relationship using either HDL or the UI.


 



Wednesday, 1 October 2025

HDL Template for Benefit Participant Enrollment

HDL Template for Benefit Participant Enrollment


For this use ParticipantEnrollment.dat file.


Please find the below sample template :

METADATA|ParticipantEnrollment|PersonNumber|ParticipantLastName|ParticipantFirstName|BenefitRelationship|LifeEvent|LifeEventOccuredDate|EffectiveDate

MERGE|ParticipantEnrollment|XXTEST_PER1|Adam|Baro|Default|NewHire|2016/11/05|2016/11/05

METADATA|CompensationObject|Program|OriginalEnrollmentDate|PersonNumber|LineNumber

MERGE|CompensationObject|XXTEST Benefits Program|2016/11/05|XXTEST_PER1|1


Tuesday, 2 July 2024

How to sort data in rtf template

 How to sort data in rtf template


Sorting the data in the Data Model is the recommended approach.
However, if you want to sort the data in the RTF template itself, use the following XML tag:

<?sort:fieldname?><?fieldname?>  

This sorts the data in ascending order by default, since no order is specified.

To sort in descending order, use:

<?sort:fieldname;'descending'?><?fieldname?>  
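In a repeating group, the sort tag is typically placed immediately after the for-each tag. The group and field names below (G_EMPLOYEE, LAST_NAME) are illustrative; substitute the names from your own data model XML.

```
<?for-each:G_EMPLOYEE?><?sort:LAST_NAME?>
<?LAST_NAME?>
<?end for-each?>
```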

Prenotification Status of Personal Payment Method in Oracle HCM

 

Prenotification Status of Personal Payment Method in Oracle HCM


The prenote status is not stored directly in a single column; the underlying details are in the PAY_BANK_ACCOUNT_PRENOTES table.

To derive the prenote status, use logic similar to the following:

CASE
WHEN pay_bank_account_prenotes.PRENOTE_DATE=TO_DATE('31/12/4712','DD/MM/YYYY') THEN 'ORA_R'
WHEN pay_bank_account_prenotes.PRENOTE_DATE=TO_DATE('01/01/0001','DD/MM/YYYY') THEN 'ORA_SK'
WHEN NVL((pay_bank_account_prenotes.PRENOTE_DATE),TO_DATE('31/12/4712','DD/MM/YYYY'))=TO_DATE('31/12/4712','DD/MM/YYYY') THEN 'ORA_N_S'
WHEN TRUNC(pay_bank_account_prenotes.PRENOTE_DATE+NVL(OrganizationPaymentMethodDEO.VALIDATION_DAYS,0)) <= TRUNC(SYSDATE) THEN 'ORA_C'
ELSE 'ORA_S'
END AS PRENOTE_STATUS



Monday, 1 July 2024

Customize the Seeded Job Offer Letter Template

 Steps to Customize Oracle Seeded Job Offer Letter Template

Click here for step by step video to create job offer letter template customization

Below are the steps to customize the seeded job offer letter.

First, download the seeded job offer template from the path below.

      Path - Shared Folders > Human Capital Management > Recruiting > Job Offer

Select Job Offer Letter Report and click on Edit



Click on Edit to download the seeded template.



Save the downloaded .rtf template.

To modify the template, we need sample XML data. The Job Offer Letter Data Model requires an Offer ID to generate the XML, so first find the Offer ID using the SQL below.


SELECT offer.OFFER_ID
FROM   IRC_OFFERS         offer,
       IRC_CANDIDATES     candidate,
       IRC_REQUISITIONS_B req,
       IRC_SUBMISSIONS    sub
WHERE  req.REQUISITION_ID       = sub.REQUISITION_ID
AND    candidate.PERSON_ID      = sub.PERSON_ID
AND    sub.SUBMISSION_ID        = offer.SUBMISSION_ID
AND    req.REQUISITION_NUMBER   = 'Enter the Requisition Number'
AND    candidate.CANDIDATE_NUMBER = 'Enter the Candidate Number'


Once we have the Offer ID, go to the path below to get the XML.

  Path - Shared Folders > Human Capital Management > Recruiting > Job Offer
 
Open the JobOfferLetterDM


Pass the Job Offer ID as shown below and click View.



Once the data is fetched by the Data Model, export the XML.




Then load the same XML into the .rtf template as shown below.


Make the necessary changes as per the requirement, then save the RTF template and add it to a .zip file.

Now upload the zip file to Recruiting Content Library.
Go to Setup and Maintenance.


 
Select Recruiting and Candidate Experience Management, then click Recruiting Content Library.









Click Create to create the content Item.







Enter the details.







Name: User defined

Code: User defined

Category: Job Offer Letter Template

Visibility: Internal, External







Select the Start on Activation check box and upload the zip file.

Then click the Save and Activate button.










Now create or edit the job offer with the new job offer template.

Path - My Client Group > Hiring

Select the requisition and click the active application.

Click Actions to create a job offer, or if the offer is already created, click Edit Job Offer.

A new dialog opens. Select the appropriate check box and click Continue.

Enter the details in each section of the job offer creation page.

In the Offer section, select the newly created job offer letter template and click Submit.

Click Preview to verify the customized offer letter.




