Copy a pipeline from one Data Factory to another

Question

Sunday, October 13, 2019 5:47 PM

Hi,

Is it possible to copy a Pipeline with all its Datasets and connections from one Data Factory to another Data Factory? If so, what is the best process to do this?

Thanks

All replies (10)

Sunday, October 13, 2019 5:56 PM

You should be able to use an ARM template of the source data factory to create a replica factory. This link might be helpful - https://docs.microsoft.com/en-us/azure/data-factory/quickstart-create-data-factory-resource-manager-template

- Vaibhav Gujral


Sunday, October 13, 2019 6:16 PM

Hi,

Thanks for your reply. Yes, I have looked at the ARM template, but my Data Factory has quite a few pipelines and there doesn't seem to be an option to select just one particular pipeline.

I tried copying the JSON code for the pipeline and pasting it into a new pipeline in my new data factory, but I am getting errors which I'm not sure what they mean.

Thanks


Sunday, October 13, 2019 8:12 PM

I would suggest exporting the ARM template of the source data factory and removing the JSON for the pipelines that are not needed. Once done, if you see errors while deploying your ARM template, you can validate the template using:

1. In PowerShell, using Test-AzureRmResourceGroupDeployment

2. In the Azure CLI, using az group deployment validate
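To illustrate the trimming step, here is a minimal Python sketch that filters an exported ARM template down to a single pipeline while keeping all other resources (datasets, linked services, triggers). The pipeline name "MyPipeline" and the inline sample template are placeholders; in practice you would load your own exported arm_template.json instead.

```python
import json

KEEP_PIPELINE = "MyPipeline"  # placeholder: the one pipeline to keep

def trim_template(template, keep_pipeline):
    """Drop every pipeline resource except keep_pipeline; keep all
    non-pipeline resources (datasets, linked services, triggers)."""
    def unwanted(resource):
        if resource["type"] != "Microsoft.DataFactory/factories/pipelines":
            return False
        # Exported names look like
        # "[concat(parameters('factoryName'), '/MyPipeline')]"
        return keep_pipeline not in resource["name"]
    trimmed = dict(template)
    trimmed["resources"] = [r for r in template["resources"] if not unwanted(r)]
    return trimmed

# Sample template standing in for an exported arm_template.json.
sample = {
    "resources": [
        {"type": "Microsoft.DataFactory/factories/pipelines",
         "name": "[concat(parameters('factoryName'), '/MyPipeline')]"},
        {"type": "Microsoft.DataFactory/factories/pipelines",
         "name": "[concat(parameters('factoryName'), '/OtherPipeline')]"},
        {"type": "Microsoft.DataFactory/factories/datasets",
         "name": "[concat(parameters('factoryName'), '/SourceDataset')]"},
    ]
}

trimmed = trim_template(sample, KEEP_PIPELINE)
print(json.dumps([r["name"] for r in trimmed["resources"]], indent=2))
```

You can then run the trimmed template through the validation commands above before deploying it to the target factory.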

- Vaibhav Gujral


Sunday, October 13, 2019 9:41 PM

Hi,

Thanks for your reply. I'm not sure what you mean by points 1 and 2; do you have an example?

Thanks


Monday, October 14, 2019 7:06 AM

I tried copying the JSON code for the pipeline and then pasting this to a new pipeline on my new data factory but I am getting errors which I'm not sure what they mean.

Yes, you can copy and paste the JSON for the pipeline into the target data factory, but before that, create the datasets and linked services with the same names that you have in the source data factory.

Regards,
Vaibhav


Monday, October 14, 2019 9:37 AM

Hi, thanks for your reply. Is there a tool out there that tells you which Datasets / Linked Services are being used within a Pipeline, or is this a manual process I would have to check myself?


Monday, October 14, 2019 9:50 AM

is there a tool out there that tells you what Datasets / Linked Services are being used within a Pipeline? Or is this a manual process I would have to check for myself?

I don't think there is a tool for this. I have copied the JSON and cloned a pipeline before, manually creating the datasets and linked services, but that pipeline was very small.
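As a rough stand-in for such a tool, a short script can walk the pipeline's JSON and collect every DatasetReference and LinkedServiceReference it contains. The sample pipeline below is invented for illustration; load your real pipeline JSON instead.

```python
def find_references(node, found=None):
    """Recursively collect referenceName values by reference type."""
    if found is None:
        found = {"DatasetReference": set(), "LinkedServiceReference": set()}
    if isinstance(node, dict):
        # ADF references are dicts like
        # {"referenceName": "SourceDataset", "type": "DatasetReference"}
        if node.get("type") in found and "referenceName" in node:
            found[node["type"]].add(node["referenceName"])
        for value in node.values():
            find_references(value, found)
    elif isinstance(node, list):
        for item in node:
            find_references(item, found)
    return found

# Placeholder pipeline definition; in practice, json.load() your own file.
sample_pipeline = {
    "name": "CopyPipeline",
    "properties": {"activities": [{
        "name": "CopyData", "type": "Copy",
        "inputs":  [{"referenceName": "SourceDataset", "type": "DatasetReference"}],
        "outputs": [{"referenceName": "SinkDataset", "type": "DatasetReference"}],
    }]},
}

refs = find_references(sample_pipeline)
print("Datasets:", sorted(refs["DatasetReference"]))
print("Linked services:", sorted(refs["LinkedServiceReference"]))
```

Note that datasets reference linked services in their own JSON, so run the same walk over each dataset definition to find the linked services you need to recreate in the target factory.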

Another option, if the ADF is linked to Git:

You can copy the required pipeline .json, dataset .json files etc. into your local branch of the Git repo the ADF is linked to and refresh ADF. This way you should see the pipeline, datasets, and linked services in one go.
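A sketch of that file-copy step, assuming the per-type folder layout (pipeline/, dataset/, linkedService/) that ADF uses in a linked Git repo. All paths and file names below are placeholders for your own two working copies; the sample source tree is created only so the sketch runs end to end.

```python
import shutil
from pathlib import Path

SOURCE_REPO = Path("source-adf-repo")  # placeholder: source factory's repo
TARGET_REPO = Path("target-adf-repo")  # placeholder: target factory's repo

# Placeholder resource files for one pipeline and its dependencies.
TO_COPY = {
    "pipeline":      ["CopyPipeline.json"],
    "dataset":       ["SourceDataset.json", "SinkDataset.json"],
    "linkedService": ["BlobStorageLS.json"],
}

# Create a tiny sample source tree so this sketch is self-contained;
# in practice these files already exist in your source working copy.
for folder, files in TO_COPY.items():
    (SOURCE_REPO / folder).mkdir(parents=True, exist_ok=True)
    for name in files:
        (SOURCE_REPO / folder / name).write_text("{}")

# Copy each resource file into the matching folder of the target repo.
for folder, files in TO_COPY.items():
    (TARGET_REPO / folder).mkdir(parents=True, exist_ok=True)
    for name in files:
        shutil.copy2(SOURCE_REPO / folder / name, TARGET_REPO / folder / name)
        print(f"copied {folder}/{name}")
```

After copying, commit and push the branch, then refresh the ADF authoring UI so it picks up the new pipeline, datasets, and linked services together.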

Regards,
Vaibhav


Monday, October 14, 2019 5:05 PM

Use Pipeline Templates.

https://azure.microsoft.com/en-us/blog/get-started-quickly-using-templates-in-azure-data-factory/

You can take a pipeline in your ADF UI and click "Save as Template". Export the template definition.

Then, from another factory, you can go to the ADF UI and click "New pipeline from template". Click "Use local template" and point to your exported template.

This will add your pipeline definition to the new factory.


Wednesday, November 6, 2019 6:43 AM

Hi there,

Just wanted to check - were the suggestions helpful to you? If yes, please consider upvoting and/or marking them as answer. This would help other community members reading this thread.


Wednesday, November 13, 2019 6:35 AM

Hi there,

We haven't heard back from you in quite some time - was the above suggestion helpful to you? If yes, please consider upvoting and/or marking it as answer. This would help other community members reading this thread.