Copy Merchant Data, Error: Failure happened on 'Sink' side. ErrorCode=UserErrorDocumentDBWriteError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Documents failed to import. Error message:Missing partition key header for point

Venkat Pendyala - Azure 5 Reputation points
2026-03-16T07:21:35.49+00:00

Operation on target ForEach failed: Activity failed because an inner activity failed; Inner activity name: Copy Merchant Data, Error: Failure happened on 'Sink' side. ErrorCode=UserErrorDocumentDBWriteError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Documents failed to import. Error message:Missing partition key header for point operation

ActivityId: 9aa439d6-dadc-4e61-b423-779167166aba, Windows/10.0.20348 cosmos-netstandard-sdk/3.18.0, documentdb-dotnet-sdk/2.5.1 Host/64-bit MicrosoftWindowsNT/6.2.9200.0.,Source=Microsoft.DataTransfer.DocumentDbManagement,''Type=Microsoft.Azure.Documents.DocumentClientException,Message=Missing partition key header for point operation

ActivityId: 9aa439d6-dadc-4e61-b423-779167166aba, Windows/10.0.20348 cosmos-netstandard-sdk/3.18.0, documentdb-dotnet-sdk/2.5.1 Host/64-bit MicrosoftWindowsNT/6.2.9200.0,Source=Microsoft.Azure.Documents.Client,'

Azure Data Factory

An Azure service for ingesting, preparing, and transforming data at scale.

1 answer

  1. Smaran Thoomu 34,155 Reputation points Microsoft External Staff Moderator
    2026-03-17T06:41:32.9066667+00:00

    @Venkat Pendyala - Azure Hey Venkat, this “Missing partition key header for point operation” error means that when ADF’s Copy activity writes to Cosmos DB, it isn’t sending the required partition-key header on the request. Even if your source JSON contains the key field, ADF needs to know which property to use so it can populate that header on each write.
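    For context, every point write against the Cosmos DB REST API carries an `x-ms-documentdb-partitionkey` header whose value is a JSON array containing the document’s partition-key value. A minimal sketch of how that header is derived (here `customerId` is a hypothetical partition-key path, not anything from your pipeline):

    ```python
    import json

    # Hypothetical document; "customerId" stands in for whatever your
    # container's partition-key path points at.
    doc = {"id": "order-1", "customerId": "c-42", "total": 19.99}

    pk_path = "/customerId"  # exactly as shown in the container's Settings blade
    pk_field = pk_path.lstrip("/")

    # Cosmos DB expects the header value to be a JSON array holding the key value.
    header = {"x-ms-documentdb-partitionkey": json.dumps([doc[pk_field]])}

    print(header["x-ms-documentdb-partitionkey"])  # ["c-42"]
    ```

    If the sink cannot resolve that value for a document, the write fails with exactly the error you’re seeing.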

    Here’s what to check and update:

    1. Validate your Cosmos container’s partition-key path
       - In the Azure portal, go to your Cosmos DB account → Data Explorer → select your database/container → Settings → “Partition key”.
       - Note the exact path (e.g. /customerId, /region), including the leading slash and correct casing.
    2. Configure the partition key in your ADF sink dataset
       - Edit the Cosmos DB (DocumentDB) dataset you’re using as the sink.
       - Under Connection → DocumentDB partition key, enter the same path you saw in the portal (e.g. /customerId).
    3. Map the source field to that partition key
       - In your Copy activity’s Sink tab → Mapping, ensure the source column that holds the partition-key value maps to the partition-key property in Cosmos.
       - If you don’t map it explicitly, ADF assumes your JSON already contains the field, but you still must define the path in the dataset.
    4. Enable upsert (optional but often helpful)
       - Still in the Sink tab, toggle “Enable upsert”.
       - Upsert operations still require the partition-key header, but this setting can reduce failures when items already exist.
    5. Double-check your data
       - Make sure every row/document in your source has a non-null, non-empty value in the partition-key field; any missing value will trigger the same error.
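    The step-5 check can be automated before you rerun the pipeline. A small sketch (assuming your source rows are already parsed into dicts, and using a hypothetical /customerId partition-key path) that flags records which would trigger this exact error:

    ```python
    def resolve_pk(doc: dict, pk_path: str):
        """Walk a Cosmos-style partition-key path such as '/customerId'
        (nested paths like '/address/region' also work)."""
        value = doc
        for part in pk_path.strip("/").split("/"):
            if not isinstance(value, dict) or part not in value:
                return None
            value = value[part]
        return value

    def find_bad_records(docs, pk_path):
        """Return (index, doc) pairs whose partition-key value is missing,
        null, or empty -- each of these would fail the point operation."""
        return [(i, d) for i, d in enumerate(docs)
                if resolve_pk(d, pk_path) in (None, "")]

    docs = [
        {"id": "1", "customerId": "c-42"},
        {"id": "2", "customerId": None},   # null PK -> would fail
        {"id": "3"},                       # missing PK -> would fail
    ]
    bad = find_bad_records(docs, "/customerId")
    print([i for i, _ in bad])  # [1, 2]
    ```

    Running something like this over a sample of your source data tells you quickly whether the problem is configuration (steps 1–3) or the data itself (step 5).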

    After you’re sure the path in the dataset matches your container and the mapping is correct, run the pipeline again.

    —Cheers,

    Your Azure Support Team

    References

    1. Copy data to Azure Cosmos DB with Azure Data Factory https://docs.microsoft.com/azure/data-factory/connector-azure-cosmos-db
    2. Partition key design for Azure Cosmos DB https://docs.microsoft.com/azure/cosmos-db/partitioning-overview
    3. Troubleshoot common Azure Data Factory copy activity errors https://docs.microsoft.com/azure/data-factory/copy-activity-error-codes

    If you still see the error, can you share:

    • The exact partition-key path your Cosmos container uses?
    • A snippet of your sink dataset JSON showing the “documentDBPartitionKey” setting?
    • A sample source record (with the PK field) that’s failing?

    This will help us pinpoint what’s missing.

    Note: This content was drafted with the help of an AI system. Please verify the information before relying on it for decision-making.

