@Venkat Pendyala Hi Venkat, the “Missing partition key header for point operation” error means that when ADF’s Copy activity writes to Cosmos DB, it isn’t sending the required partition-key header with each request. Even if your source JSON contains the key field, ADF needs to know which property to use so it can populate that header on every write.
Here’s what to check and update:
- Validate your Cosmos container’s partition-key path
  - In the Azure portal, go to your Cosmos DB account → Data Explorer → select your database/container → Settings → “Partition key”.
  - Note the exact path (e.g. /customerId, /region), including the leading slash and correct casing.
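If you prefer the command line, the same check can be done with the Azure CLI. This is a sketch; the account, resource-group, database, and container names below are placeholders you’d replace with your own:

```shell
# Print the container's partition-key path(s), e.g. ["/customerId"].
# myaccount, mygroup, mydb, mycontainer are placeholders.
az cosmosdb sql container show \
  --account-name myaccount \
  --resource-group mygroup \
  --database-name mydb \
  --name mycontainer \
  --query "resource.partitionKey.paths"
```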
- Configure the partition key in your ADF sink dataset
  - Edit the Cosmos DB (DocumentDB) dataset you’re using as the sink.
  - Under Connection → DocumentDB partition key, enter the same path you saw in the portal (e.g. /customerId).
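For reference, a minimal sketch of what that sink dataset JSON might look like, assuming the legacy DocumentDbCollection dataset type; the linked-service name (AzureCosmosDbLS), collection name (orders), and key path (/customerId) are placeholders, and your dataset may expose the partition-key setting under a slightly different property name depending on connector version:

```json
{
  "name": "CosmosDbSinkDataset",
  "properties": {
    "type": "DocumentDbCollection",
    "linkedServiceName": {
      "referenceName": "AzureCosmosDbLS",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "collectionName": "orders",
      "documentDBPartitionKey": "/customerId"
    }
  }
}
```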
- Map the source field to that partition key
  - In your Copy activity’s Sink tab → Mapping, make sure the source column holding the partition-key value maps to the partition-key property in Cosmos.
  - If you skip explicit mapping, ADF relies on the field being present in your JSON as-is, but you still must define the path in the dataset.
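In the Copy activity JSON, an explicit mapping looks roughly like the fragment below (a TabularTranslator sketch; $.customerId is a placeholder for whatever field actually carries your partition-key value):

```json
"translator": {
  "type": "TabularTranslator",
  "mappings": [
    {
      "source": { "path": "$.customerId" },
      "sink": { "path": "$.customerId" }
    }
  ]
}
```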
- Enable Upsert (optional but often helpful)
  - Still in the Sink tab, toggle “Enable upsert.”
  - Upsert operations still require the partition-key header, but this setting can reduce failures when items already exist.
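In the pipeline JSON, that toggle corresponds to the sink’s write behavior. A sketch, assuming the legacy DocumentDbCollectionSink type:

```json
"sink": {
  "type": "DocumentDbCollectionSink",
  "writeBehavior": "upsert"
}
```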
- Double-check your data
  - Make sure every row/document in your source has a non-null, non-empty value in that partition-key field. Any missing value will trigger the same error.
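If your source is a JSON file or export you can inspect locally, a quick script can flag offending records before you rerun the pipeline. A minimal sketch in Python; the field name customerId and the sample rows are placeholders:

```python
def find_missing_partition_keys(records, pk_field):
    """Return indexes of records whose partition-key field is absent, None, or empty."""
    bad = []
    for i, rec in enumerate(records):
        value = rec.get(pk_field)
        if value is None or (isinstance(value, str) and value.strip() == ""):
            bad.append(i)
    return bad

# Sample source rows; 'customerId' stands in for your partition-key field.
rows = [
    {"customerId": "C001", "amount": 10},
    {"customerId": "", "amount": 5},   # empty -> would fail the copy
    {"amount": 7},                     # missing -> would fail the copy
]
print(find_missing_partition_keys(rows, "customerId"))  # → [1, 2]
```

Any index this prints is a record that would trigger the same “Missing partition key” failure on write.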
Once the path in the dataset matches your container’s partition key and the mapping is correct, rerun the pipeline.
—Cheers,
Your Azure Support Team
References
- Copy data to Azure Cosmos DB with Azure Data Factory https://docs.microsoft.com/azure/data-factory/connector-azure-cosmos-db
- Partition key design for Azure Cosmos DB https://docs.microsoft.com/azure/cosmos-db/partitioning-overview
- Troubleshoot common Azure Data Factory copy activity errors https://docs.microsoft.com/azure/data-factory/copy-activity-error-codes
—
If you still see the error, can you share:
- The exact partition-key path your Cosmos container uses?
- A snippet of your sink dataset JSON showing the “documentDBPartitionKey” setting?
- A sample source record (with the PK field) that’s failing?
This will help us pinpoint what’s missing.
Note: This content was drafted with the help of an AI system. Please verify the information before relying on it for decision-making.