Dynamic Mapping of Source to JSON Sink in Azure Data Factory

Apoorva Belsare 0 Reputation points
2024-09-06T12:01:43.23+00:00

We are able to preview the source data from the Salesforce linked service in ADF, and I can also view the sink data using the REST linked service for Business Central in ADF. However, I'm facing an issue where importing the schema is not supported because my sink data is in JSON format. Could you suggest an approach using 'Add Dynamic Content' to map the source data to the sink?


1 answer

  1. Amira Bedhiafi 26,186 Reputation points
    2024-09-06T15:53:19.94+00:00
    1. Create a Copy Activity:
      • In your ADF pipeline, add a Copy Data activity. The source is a dataset built on your Salesforce linked service, and the sink is a dataset built on your REST linked service (the Business Central JSON API).
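      • As a rough sketch, the activity definition might look like the following (the dataset names are placeholders, and the exact source/sink type names depend on your connector versions):

        {
            "name": "CopySalesforceToBC",
            "type": "Copy",
            "inputs": [ { "referenceName": "SalesforceAccounts", "type": "DatasetReference" } ],
            "outputs": [ { "referenceName": "BusinessCentralJson", "type": "DatasetReference" } ],
            "typeProperties": {
                "source": { "type": "SalesforceSource" },
                "sink": { "type": "RestSink", "requestMethod": "POST" }
            }
        }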
    2. Define Source:
      • Configure the Salesforce source dataset, specifying the object (e.g., Account, Contact, etc.). You can also add filters in the source query if needed.
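      • For example, a filter can be pushed down as a SOQL query in the source definition (the object and field names below are just standard Salesforce examples):

        "source": {
            "type": "SalesforceSource",
            "query": "SELECT Id, Name, AnnualRevenue FROM Account WHERE AnnualRevenue > 0"
        }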
    3. Define Sink:
      • Create or configure the sink dataset to handle the JSON format. Since the schema cannot be imported, define it manually or dynamically via "Add Dynamic Content."
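      • A minimal REST sink dataset might look like this (the relativeUrl is a placeholder; substitute your actual Business Central API endpoint and company ID):

        {
            "name": "BusinessCentralJson",
            "properties": {
                "type": "RestResource",
                "linkedServiceName": { "referenceName": "BusinessCentralRest", "type": "LinkedServiceReference" },
                "typeProperties": { "relativeUrl": "api/v2.0/companies(<companyId>)/accounts" }
            }
        }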
    4. Use Mapping Tab:
      • Go to the Mapping tab of the Copy Activity.
      • If schema import is not supported, the Copy Activity maps columns by name by default; if that does not produce the mapping you need, add each column mapping manually, pairing every source field with its JSON sink field (a translator sketch follows below).
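      • When the schema cannot be imported, the mapping can also be written out explicitly as a translator object on the Copy Activity (the Salesforce field names here are assumptions based on the standard Account object):

        "translator": {
            "type": "TabularTranslator",
            "mappings": [
                { "source": { "name": "Name" },          "sink": { "path": "$['AccountName']" } },
                { "source": { "name": "Id" },            "sink": { "path": "$['AccountId']" } },
                { "source": { "name": "AnnualRevenue" }, "sink": { "path": "$['Revenue']" } }
            ]
        }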
    5. Add Dynamic Content in Mapping:
      • To dynamically map fields, select the target field in the JSON sink, then use the Add Dynamic Content feature.
      • In the Add Dynamic Content window, select expressions or parameters to map source fields dynamically to sink fields. For example:
        
        {
            "AccountName": "@{item().Name}",
            "AccountId": "@{item().Id}",
            "Revenue": "@{item().AnnualRevenue}"
        }
      • This assumes Name, Id, and AnnualRevenue are fields from your Salesforce source. Inside a JSON body, each expression must be wrapped in @{...} string interpolation so ADF resolves it before the payload is sent; note that interpolation always yields string values.
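      • A common pattern for fully dynamic mapping is to pass the whole translator as a pipeline parameter (here named columnMapping as an example) and reference it in the Mapping tab via Add Dynamic Content with:

        @json(pipeline().parameters.columnMapping)

      • The parameter value would then be a JSON string such as:

        {
            "type": "TabularTranslator",
            "mappings": [
                { "source": { "name": "Name" }, "sink": { "path": "$['AccountName']" } },
                { "source": { "name": "Id" },   "sink": { "path": "$['AccountId']" } }
            ]
        }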
    6. Handle Nested JSON (if applicable):
      • If your JSON sink requires nested structures, use expressions to create nested JSON objects. For instance:
             
        {
            "AccountDetails": {
                "AccountId": "@{item().Id}",
                "AccountName": "@{item().Name}"
            },
            "FinancialInfo": {
                "Revenue": "@{item().AnnualRevenue}"
            }
        }
      • The @item() function refers to the current item of a ForEach activity's iteration, so these expressions apply when the activity that posts the body runs inside a ForEach loop over the source records.
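      • As a sketch of the same idea in translator form, nested sink fields can also be targeted with JSON paths (whether the Copy Activity can produce the desired nested shape depends on the sink; for heavier reshaping, a Mapping Data Flow may be the better tool):

        "translator": {
            "type": "TabularTranslator",
            "mappings": [
                { "source": { "name": "Id" },            "sink": { "path": "$['AccountDetails']['AccountId']" } },
                { "source": { "name": "Name" },          "sink": { "path": "$['AccountDetails']['AccountName']" } },
                { "source": { "name": "AnnualRevenue" }, "sink": { "path": "$['FinancialInfo']['Revenue']" } }
            ]
        }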
