How to Report File Metadata for 2000+ Files Uploaded by Multiple Users to Azure Blob Storage
Hi Team, I would like to ask the following questions. Please do the needful. 1. How to report file metadata for 2000+ files uploaded by multiple users to Azure Blob Storage. 2. How to connect Hadoop Hive to Databricks. Regards, Prasad, 5017654887.
Azure Data Factory
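The listing itself can come from the Azure Storage SDK (for example, `ContainerClient.list_blobs(include=['metadata'])` in `azure-storage-blob`) or from an ADF Get Metadata activity. As a minimal offline sketch - assuming each blob carries an `uploaded_by` metadata key, which is a hypothetical naming convention - the per-user rollup might look like:

```python
from collections import defaultdict

def report_by_user(blobs):
    """Group blob records by the 'uploaded_by' metadata key and
    summarize file count and total size per user."""
    report = defaultdict(lambda: {"files": 0, "bytes": 0})
    for blob in blobs:
        user = blob.get("metadata", {}).get("uploaded_by", "unknown")
        report[user]["files"] += 1
        report[user]["bytes"] += blob["size"]
    return dict(report)

# Sample records shaped like what a metadata-including listing returns.
sample = [
    {"name": "a.csv", "size": 100, "metadata": {"uploaded_by": "alice"}},
    {"name": "b.csv", "size": 250, "metadata": {"uploaded_by": "bob"}},
    {"name": "c.csv", "size": 50,  "metadata": {"uploaded_by": "alice"}},
]
print(report_by_user(sample))
```

At 2000+ blobs the SDK pages through results transparently, so the same aggregation works unchanged; only the metadata key name needs to match whatever the upload clients actually set.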

How to iterate through nested folders and read files one after another in ADF
Hi, how can I iterate through a folder and its subfolders, folder by folder, and process one file after another, without using regex, in Azure Data Factory?
Azure Data Factory
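In ADF itself this is usually done with a Get Metadata activity (childItems) feeding a ForEach, with an Execute Pipeline activity handling the recursion into subfolders. The traversal order that setup produces can be sketched locally - this is an illustrative stand-in, not ADF code:

```python
import tempfile
from pathlib import Path

def iter_files(root):
    """Yield files depth-first, folder by folder, one file at a time,
    mirroring a Get Metadata + ForEach recursion (no regex involved)."""
    root = Path(root)
    for entry in sorted(root.iterdir()):
        if entry.is_dir():
            yield from iter_files(entry)   # recurse into the subfolder
        elif entry.is_file():
            yield entry

# Demo on a throwaway directory tree.
base = Path(tempfile.mkdtemp())
(base / "sub").mkdir()
(base / "a.txt").write_text("x")
(base / "sub" / "b.txt").write_text("y")
names = [p.name for p in iter_files(base)]
print(names)
```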
How to configure ADF pipeline run, linked service, so it uses Databricks serverless compute
Databricks has recently announced serverless compute for workflows: https://learn.microsoft.com/en-us/azure/databricks/workflows/jobs/run-serverless-jobs I would like to be able to execute Azure Data Factory (ADF) jobs using this…
Azure Databricks
Azure Data Factory
How does Azure Data Factory handle data movement?
How does Azure Data Factory handle data movement and transformation across hybrid environments, and what are the best practices for optimizing pipeline performance and cost efficiency?
Azure Data Factory
Seeking solution for intermittent "SqlFailedToConnect" errors when writing to Azure SQL Managed Instance from Azure Data Factory
I have multiple ADF pipelines that copy data from a variety of sources into our Azure SQL Managed Instance. All of these jobs, regardless of source, fail intermittently with a Sink error such…
Azure Data Factory
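The usual mitigation for intermittent sink connection errors like this is the Copy activity's built-in retry settings (retry count and retry interval on the activity). The underlying idea is plain exponential backoff; a minimal sketch, with the SQL failure simulated by a `ConnectionError`:

```python
import time

def with_retry(op, attempts=4, base_delay=0.01):
    """Retry a transient operation with exponential backoff - the same
    idea as the Copy activity's retry / retry-interval settings."""
    for attempt in range(attempts):
        try:
            return op()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, 0.04s, ...

# Simulate a sink that fails twice with SqlFailedToConnect-style errors.
calls = {"n": 0}
def flaky_write():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("SqlFailedToConnect")
    return "ok"

print(with_retry(flaky_write))
```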

The dataset uses the 'AzureFileStorage' linked service type, which is not supported in Data Flow. How can this be resolved?
Azure Data Factory
Data extraction issues since upgrading to OData 4.0 in ADF
Our data extraction process, using the OData connector in ADF, has had problems since our client upgraded their service to OData 4.0. The issues consist of not successfully extracting an endpoint. Our client reports that their network gets overloaded even…
Azure Data Factory
Can I use a KeyVault Connection String with the Azure Data Factory Mysql Linked Service Driver v2.0?
Hi all, I currently use a V1.0 MySQL linked service as part of my Azure Data Factory setup. I use a secret connection string stored in Azure Key Vault. I am being asked to upgrade the linked service driver to 2.0, but it seems this does not support the…
Azure Data Factory


ServiceNow Incident table extraction with V2 Connectors
Using the new ServiceNowV2 connectors in Azure Data Factory, we are trying to extract the incident table. When setting up a copy activity, we receive the error message below. I have tried to limit the number of records by providing a filter to small…
Azure Data Factory
Conditional Split in ADF Data Flow issue - no rows in sink
I have a Conditional Split transformation that works fine except that one sink ends up empty, although there are source rows matching its split condition. In my scenario I have - say - 4 conditions, two of which are supposed to end up in the same table sink.…
Azure Data Factory
Azure Data Factory: No 'Name' Key Appearing Under Additional Headers For REST APIs Data Copy Action
Hello, I'm trying to bring in data to SQL Server using a REST API. I've successfully completed the first stage, where I needed to call the Login endpoint to fetch the apiToken in order to request all other endpoints using the bearer authentication. I'm…
Azure Data Factory
Failure happened on 'Source' side. Copy activity Error in Fabric
Failure happened on 'Source' side. ErrorCode=UserErrorWriteFailedFileOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The file operation is failed, upload file failed at path:…
Azure Data Factory
I have added a parameter in the Dataset but it is not reflected in the Copy activity
I have added parameters in the Dataset but they are not reflected in the Copy activity while copying all SQL tables from Azure SQL to Blob Storage. I created 2 parameters (schema, table) to pass the Lookup output values.
Azure Data Factory
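For the parameters to surface in the Copy activity, the dataset JSON must both declare them under `parameters` and reference them in `typeProperties` via `@dataset()` expressions. A sketch of the shape (the linked service name `AzureSqlLS` and dataset name are placeholders):

```json
{
    "name": "SqlTableDynamic",
    "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": {
            "referenceName": "AzureSqlLS",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "schema": { "type": "string" },
            "table":  { "type": "string" }
        },
        "typeProperties": {
            "schema": { "value": "@dataset().schema", "type": "Expression" },
            "table":  { "value": "@dataset().table",  "type": "Expression" }
        }
    }
}
```

Once both pieces are present, the Copy activity's source/sink panel shows a "Dataset properties" section where the Lookup output values can be mapped in.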
How to make the Checkpoint Key / SAP to stage subscriber process a static value?
Context: I am currently building a pipeline that extracts data from an SAP CDC source, where I want to utilize the delta mechanism this connector enables. My setup is: SAP CDC connector to the source system (SAP CRM) - static subscriber name and ODP…
Azure Data Factory
Compatibility and Support for Amazon Corretto in Azure Data Factory
Hello Azure Support Team, We are exploring the possibility of replacing Oracle Java with Amazon Corretto (OpenJDK distribution) in our Azure Data Factory (ADF) environment. Before proceeding, we would like to clarify the following: Compatibility: Is…
Azure Data Factory
ADF Data Flow connection error to Cosmos DB
I have created a linked service and the test connection succeeds. But when I use this linked service in a data flow and test the connection, I get the following error: Error code: DFExecutorUserError, Details: Status code 400,…
Azure Data Factory
Issue with "Too Many Requests" Error in App Provisioning
I ran the code in the test file within Synapse and observed a "Too Many Requests" error while processing records for App provisioning. We also confirmed this issue via Postman, where the same error was encountered. Can anyone help me to…
Azure Synapse Analytics
Azure Data Factory
Azure App Service
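HTTP 429 means the target service is throttling the caller, and the response usually carries a Retry-After header saying how long to wait before the next attempt. A minimal sketch of honoring it, with the throttled service simulated by a canned response sequence:

```python
import time

def call_with_throttle_handling(send, max_attempts=5):
    """Call an HTTP operation, backing off when the service throttles.
    'send' returns (status_code, headers, body); 429 responses usually
    include a Retry-After header with the wait time in seconds."""
    for attempt in range(max_attempts):
        status, headers, body = send()
        if status != 429:
            return status, body
        wait = float(headers.get("Retry-After", 2 ** attempt))
        time.sleep(min(wait, 0.05))  # capped here only so the demo runs fast
    return status, body

# Simulate two throttled responses followed by success.
responses = iter([
    (429, {"Retry-After": "0.01"}, ""),
    (429, {"Retry-After": "0.01"}, ""),
    (200, {}, "provisioned"),
])
status, body = call_with_throttle_handling(lambda: next(responses))
print(status, body)
```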

What are the recommended steps for setting up a Service Principal in Azure for enterprise use, and how should the IT team manage its lifecycle securely?
We’re working on setting up Service Principals (App Registrations) in Azure to support automated data workflows (e.g., Azure Data Factory accessing Azure SQL, ADLS, etc.). The goal is to enable secure, non-interactive authentication while following…
Azure Data Factory
'json' parameter is not valid after updating secret key in ADF web activity
Hello, We have a pipeline with two web activities: The first retrieves a secret key from the key vault. The second uses that secret key to obtain a bearer token. Recently, we updated the expired secret key retrieved in the first web activity. The…
Azure Data Factory
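A common cause of this error - an assumption worth checking - is that the rotated secret now contains characters (quotes, backslashes) that break a request body built by string concatenation in the second web activity's dynamic content. Proper JSON serialization escapes such characters; the contrast, sketched in Python:

```python
import json

secret = 'p@ss"word\\2024'  # a rotated secret with JSON-breaking characters

# Naive string interpolation produces invalid JSON once the secret
# contains quotes or backslashes:
naive = '{"client_secret": "' + secret + '"}'
try:
    json.loads(naive)
    naive_ok = True
except json.JSONDecodeError:
    naive_ok = False

# Serializing with json.dumps escapes the secret correctly:
safe = json.dumps({"client_secret": secret})
parsed = json.loads(safe)
print(naive_ok, parsed["client_secret"] == secret)
```

In ADF the equivalent fix is to build the body with expression functions rather than pasting the secret into a hand-written JSON string.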
Recommended SHIR configuration to ingest 80TB from a DB2 snapshot to ADLS Gen2 via ADF within 3 days?
We are performing a large-scale one-time data migration project. Our source is an IBM DB2 snapshot (on-prem), and the target is Azure Data Lake Storage Gen2. The entire data volume is ~80TB, distributed across 800 tables. We are using Azure Data Factory…
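A sizing exercise like this starts from the sustained throughput the window implies, before choosing SHIR node counts, parallel copies per table, or DIUs:

```python
# Back-of-envelope throughput needed to move 80 TB in 3 days.
tb = 80
seconds = 3 * 24 * 3600                      # 259,200 s in the window
total_mb = tb * 1024 * 1024                  # 83,886,080 MB (binary units)
required_mb_per_s = total_mb / seconds
print(round(required_mb_per_s))
```

Roughly 324 MB/s sustained across all 800 tables; whatever SHIR node count and per-table parallelism is chosen has to clear that bar with headroom for the DB2 read side and for retries.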