Tagging strategy implementation
Hello, once the Azure Data Lake design is done, partitioning and tagging are key performance factors, as they ensure smooth transaction patterns. I am not quite clear on the tagging aspects: which components should be covered, and what needs…
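If part of the question is about tagging at the object level, here is a minimal sketch of setting blob index tags with the Python SDK. The connection string, container, and blob name are placeholders, and the blob is assumed to already exist:

```python
from azure.storage.blob import BlobClient

# Placeholders: supply your own connection string, container, and blob name.
blob = BlobClient.from_connection_string(
    "<connection-string>", container_name="raw", blob_name="sales/2024/01/data.parquet"
)

# Blob index tags are key/value strings that can be queried across the storage account.
blob.set_blob_tags({"env": "dev", "zone": "raw", "domain": "sales"})
print(blob.get_blob_tags())
```

Resource-level tags on the storage account itself (for cost and governance) are a separate mechanism from these per-blob index tags.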
Read AppServiceConsoleLogs and push to Blob Storage container/table
Hello Team, we have a Python Django-based web app. We generate custom logs for user actions using the code below. settings.py: LOGGING = { "version": 1, "disable_existing_loggers": False, "formatters": { …
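The LOGGING snippet above is truncated, so as a sketch only: one way to get custom records into a container is a handler that writes to an append blob. The class name, connection string, and blob names below are placeholders:

```python
import logging
from azure.storage.blob import BlobClient

class AppendBlobHandler(logging.Handler):
    """Sketch of a logging handler that appends each record to an append blob."""

    def __init__(self, conn_str, container, blob_name):
        super().__init__()
        self.blob = BlobClient.from_connection_string(conn_str, container, blob_name)
        if not self.blob.exists():
            self.blob.create_append_blob()  # create once; later runs keep appending

    def emit(self, record):
        self.blob.append_block(self.format(record) + "\n")
```

The handler could then be registered under "handlers" in the LOGGING dict using Django's "()" factory key. Note that AppServiceConsoleLogs specifically can also be shipped platform-side via App Service diagnostic settings, without touching application code.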
Loading large nested JSON files into Azure SQL DB using Azure Data Factory
Hi all, I am currently working with large nested JSON files stored inside a blob container. Each file has a different structure with varying fields and data types, and nested arrays. My goal is to load these files into an Azure SQL database in an…
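ADF's mapping data flows can flatten arrays, but when every file has a different shape, a pre-flattening step is sometimes easier to reason about. A minimal sketch outside ADF, assuming pandas and a SQLAlchemy connection (all names are placeholders):

```python
import json
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection; requires the Microsoft ODBC Driver for SQL Server.
engine = create_engine(
    "mssql+pyodbc://user:password@server.database.windows.net/db"
    "?driver=ODBC+Driver+18+for+SQL+Server"
)

with open("sample.json") as f:
    doc = json.load(f)

# json_normalize flattens nested objects into underscore-joined columns.
flat = pd.json_normalize(doc, sep="_")
flat.to_sql("staging_sample", engine, if_exists="append", index=False)
```

Landing each file into its own staging table and merging in SQL afterwards is one way to cope with the varying fields.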
Can't write to blob storage from AzureML Spark Cluster
Using Azure ML Spark compute (serverless or attached), it is not possible to write to ADLS Gen2 storage. The code below produces the error 'Caused by: org.apache.hadoop.fs.azure.AzureException: com.microsoft.azure.storage.StorageException:…
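The stack trace mentions org.apache.hadoop.fs.azure.AzureException, which is the legacy wasb(s) driver; Gen2 writes normally go through the abfss scheme instead. A minimal sketch assuming account-key auth (account, key, and container are placeholders; serverless Spark may instead require identity-based access on the storage account):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# ABFS config keys are set per storage account.
spark.conf.set(
    "fs.azure.account.key.<account>.dfs.core.windows.net", "<account-key>"
)

df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])
df.write.mode("overwrite").parquet(
    "abfss://<container>@<account>.dfs.core.windows.net/test/out"
)
```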
How to load multiple different JSON files into Azure SQL DB using Azure Data Factory
Hi all, I am currently working with some nested JSON files and attempting to load them from the blob container to Azure SQL DB using Azure Data Factory. All of the JSON files have different structures, as the fields and columns vary in each one, which is…
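Since the files vary, one hedged preparatory step is to group blobs by their top-level keys so each schema group can get its own copy mapping. Connection string and container are placeholders, and each file is assumed to hold a JSON object (or an array of them):

```python
import json
from collections import defaultdict
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string("<connection-string>", "json-landing")

groups = defaultdict(list)
for props in container.list_blobs():
    doc = json.loads(container.download_blob(props.name).readall())
    if isinstance(doc, list):        # some files may be arrays; fingerprint the first element
        doc = doc[0]
    signature = tuple(sorted(doc.keys()))  # crude schema fingerprint: top-level keys only
    groups[signature].append(props.name)

for sig, names in groups.items():
    print(sig, "->", len(names), "files")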
Does Data Share support ADLS Gen2 with a private endpoint?
Does Data Share support ADLS Gen2 deployed with a Private Endpoint?
Azure ML | Support for ADLS Gen2 Datastore with SAS Token Authentication
Hello, I’m currently working on an application where we’re connecting various data sources—such as file shares and ADLS Gen2—to Azure Machine Learning. While we can create file-share datastores with SAS token authentication, I noticed that this option…
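While the studio UI may not surface SAS auth for the Gen2 datastore type, the v2 SDK does accept SAS tokens for blob datastores; whether that fits a Gen2 account in this scenario is worth verifying. A sketch assuming azure-ai-ml, with placeholder names:

```python
from azure.identity import DefaultAzureCredential
from azure.ai.ml import MLClient
from azure.ai.ml.entities import AzureBlobDatastore, SasTokenConfiguration

ml_client = MLClient(
    DefaultAzureCredential(), "<subscription-id>", "<resource-group>", "<workspace>"
)

store = AzureBlobDatastore(
    name="blob_sas_store",
    account_name="<storage-account>",
    container_name="<container>",
    credentials=SasTokenConfiguration(sas_token="<sas-token>"),
)
ml_client.datastores.create_or_update(store)
```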
Data Quality issue in Purview
Hello Team, we have configured ADLS Gen2 as a source and scanned it. For data quality, we did the following steps: create the governance domain and publish it; create the data product and add the tables; in the Data Quality section, add the…
Getting "storage_root does not specify a URI scheme." while setting ADLS path in default metastore.
Hi Team, I am facing an error while setting the ADLS path for the existing default metastore on the account console page in Azure. The image below shows that I have already created the container and enabled hierarchical namespace as well, but I am getting the below…
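For reference, this error usually means the path was entered without a scheme; an ADLS Gen2 storage root is normally expressed as an abfss URI (container and account below are placeholders):

```
abfss://<container>@<storage-account>.dfs.core.windows.net/<optional-path>
```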
Failure happened on 'destination' side. ErrorCode=DeltaInvalidCharacterInColumnName.
This exception occurred when I used a Data Factory pipeline to copy data from SQL Server to a lakehouse, but I didn't find any problems with the raw data.
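Delta rejects column names containing characters such as spaces, commas, semicolons, braces, parentheses, tabs, newlines, and equals signs, so a source column like "Order Id" fails even when the rows themselves are fine. A sketch of sanitizing names before the load; the character set and replacement below are assumptions to adapt:

```python
import re

INVALID = r"[ ,;{}()\n\t=]"  # characters Delta column names may not contain

def sanitize(name: str) -> str:
    """Replace every disallowed character with an underscore."""
    return re.sub(INVALID, "_", name)

print(sanitize("Order Id"))      # Order_Id
print(sanitize("price (usd)"))   # price__usd_
```

In the Copy activity the same effect can be had by aliasing columns in the source query or the column mapping.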
Error code 2011
I am testing a pipeline and introduced a repeated row in one of the files that I want to upload. I was expecting the pipeline to run anyway, uploading the correct files and skipping the incorrect one. Instead, the entire pipeline did not work…
In MS Fabric, users can view data but cannot download it
In MS Fabric, I want users to be able to view data in the workspace but not be able to download it. Please provide clear steps, along with links to verify the authenticity of the solution provided.
Copy files from SharePoint Online site to Azure Data Lake Storage
Hello, we are trying to set up a flow that will copy files from a SharePoint Online site to Azure Data Lake Storage. As I understand it, there are two options: using ADF to pull the files, as mentioned in the link below…
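For the ADF option the linked doc is the reference. Outside ADF, a hedged sketch of the Graph-based route, assuming an app registration with Files.Read.All and placeholder IDs and paths:

```python
import requests
from azure.storage.blob import BlobClient

token = "<bearer-token-from-azure-ad>"  # acquired via client-credentials flow, e.g. with msal

# Download a file from the SharePoint site's default drive via Microsoft Graph.
url = (
    "https://graph.microsoft.com/v1.0/sites/<site-id>"
    "/drive/root:/Documents/report.xlsx:/content"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()

# Upload the bytes to the data lake (the blob endpoint also works for Gen2 accounts).
BlobClient.from_connection_string(
    "<connection-string>", "landing", "sharepoint/report.xlsx"
).upload_blob(resp.content, overwrite=True)
```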
504.0 GatewayTimeout & Invoking Azure function failed with HttpStatusCode - 499.
We've developed an Azure Function in Python that connects to Blob Storage, reads files, and writes into Azure Tables. It runs fine for small files (less than 100 MB). The problem is that, when…
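For files above ~100 MB the usual culprits are reading the whole blob into memory and writing table entities one at a time inside an HTTP-triggered function, which times out behind the gateway. A sketch of streaming the blob in chunks and batching table writes; parse_chunk and all names are placeholders:

```python
from azure.storage.blob import BlobClient
from azure.data.tables import TableClient

def parse_chunk(chunk: bytes):
    """Hypothetical parser: one entity per line. Real code must also handle
    records that are split across chunk boundaries."""
    for line in chunk.splitlines():
        text = line.decode(errors="replace")
        yield {"PartitionKey": "rows", "RowKey": str(abs(hash(text))), "Value": text}

blob = BlobClient.from_connection_string("<conn>", "input", "big-file.csv")
table = TableClient.from_connection_string("<conn>", "Rows")

batch = []
for chunk in blob.download_blob().chunks():   # stream instead of readall() to cap memory
    for entity in parse_chunk(chunk):
        batch.append(("upsert", entity))
        if len(batch) == 100:                 # a table transaction caps at 100 ops,
            table.submit_transaction(batch)   # all sharing one PartitionKey
            batch = []
if batch:
    table.submit_transaction(batch)
```

Architecturally, moving the work to a queue- or blob-triggered function (or Durable Functions) avoids the HTTP gateway timeout entirely.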
Alternative Methods for Capturing Data Lake Size in Less Time
Need assistance in capturing the size of the data lake per environment (e.g., Dev, SIT, Prod). Currently, a PowerShell script is used to fetch the details, generating a CSV file for each environment with the medallion layer, folder, subfolder, and size. The…
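If the PowerShell walk is slow because it recurses the folder hierarchy, a flat listing aggregates sizes in a single enumeration; a sketch with placeholder names (Azure Storage blob inventory is the other option worth checking for very large lakes):

```python
from collections import defaultdict
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string("<connection-string>", "datalake")

sizes = defaultdict(int)
for props in container.list_blobs():    # flat listing: no per-folder recursion
    top = props.name.split("/", 1)[0]   # top-level folder, e.g. bronze/silver/gold
    sizes[top] += props.size

for layer, total in sorted(sizes.items()):
    print(f"{layer}: {total / 1024**3:.2f} GiB")
```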
Transforming JSON files using data flow
Hello! I currently have about 60 JSON files inside a blob container, most of which have different fields and values. I have created a pipeline with a Get Metadata activity that points to the container, with the field list set to Child items. I have…
Data lake solutions
We are in the process of building a data lake, and going further down the line we are getting really confused about whether to go for Delta Lake, a data lakehouse, or Synapse Analytics. The subtle nuances are not making things easier, such as "A Data Lake House merges…
Why the function could not find the file
Hi there, I built an Azure Function to process JSON data from external requests, save the JSON to a local file, and upload it to the container through the storage client. It worked well locally, but once deployed to Azure, it would report that…
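A common cause is writing to a relative path: locally that lands in the project folder, but on Azure the deployed app content is read-only and only the temp directory is writable. A sketch with placeholder names:

```python
import json
import os
import tempfile
from azure.storage.blob import BlobClient

payload = {"hello": "world"}  # stand-in for the incoming request JSON

# tempfile.gettempdir() resolves to a writable location both locally and on Azure.
path = os.path.join(tempfile.gettempdir(), "payload.json")
with open(path, "w") as f:
    json.dump(payload, f)

with open(path, "rb") as f:
    BlobClient.from_connection_string(
        "<connection-string>", "uploads", "payload.json"
    ).upload_blob(f, overwrite=True)
```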
Data lake schema enforcement
Hello, in a data lake, data is ingested as schema-on-read; that is, the data is read in the format in which it comes from the source. But I read an article that says schema enforcement makes data lakes high-performance and the data readable. Please…
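The two ideas coexist: raw zones stay schema-on-read, while a table format such as Delta enforces a schema at write time in curated zones. A small demonstration, assuming a Spark session with Delta Lake configured (the path is a placeholder):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # assumes Delta Lake jars/extensions are set up

good = spark.createDataFrame([(1, "a")], ["id", "val"])
good.write.format("delta").save("/tmp/demo_table")

# Appending a frame with an extra column violates the table schema and raises
# an AnalysisException instead of silently corrupting the table.
bad = spark.createDataFrame([(2, "b", 3.0)], ["id", "val", "extra"])
bad.write.format("delta").mode("append").save("/tmp/demo_table")
```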
How to send a mail notification for a failed pipeline in Azure Synapse Analytics?
How can I send a notification email to a specific email address without using a logic app when one of my Synapse Analytics pipelines fails? I would like to include the error message in the email notification.
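One Logic-App-free pattern is to route the pipeline's failure path to an Azure Function activity and pass the error in the body (e.g., @activity('CopyData').error.message, where 'CopyData' is a placeholder activity name). A sketch of the function side, with placeholder SMTP details:

```python
import smtplib
from email.message import EmailMessage

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    body = req.get_json()  # expects {"pipeline": "...", "error": "..."} from the activity

    msg = EmailMessage()
    msg["Subject"] = f"Synapse pipeline failed: {body.get('pipeline', 'unknown')}"
    msg["From"] = "alerts@example.com"   # placeholder sender
    msg["To"] = "oncall@example.com"     # placeholder recipient
    msg.set_content(body.get("error", "No error message supplied."))

    with smtplib.SMTP("smtp.example.com", 587) as smtp:  # placeholder SMTP relay
        smtp.starttls()
        smtp.login("<user>", "<password>")
        smtp.send_message(msg)

    return func.HttpResponse("notification sent", status_code=200)
```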