How to scrape diagnostic log files stored in Azure Blob Storage in real time

Andrew Yanni 0 Reputation points
2024-11-21T12:59:14.3566667+00:00

I tried scraping Azure diagnostic logs stored in storage accounts via Grafana Loki and Promtail, and it works with old logs that are already archived. But as soon as I try to scrape diagnostic logs that are still being written, only a couple of entries are read and the rest are ignored.

After searching, I understood this could be because object stores are immutable: a new blob (JSON file) is written with every change. This means the file I am scraping is being replaced during the process rather than appended to, as would be the case on a normal file system.

Is there a reasonable way to scrape log files while they are being written to Azure storage accounts? I am aware that streaming log entries from Event Hubs works better for this scenario, but I am specifically interested in the diagnostic logs written to Azure storage accounts.

Tags: Azure Monitor, Azure Storage Accounts, Azure Blob Storage

1 answer

  1. Ashok Gandhi Kotnana 945 Reputation points Microsoft Vendor
    2024-11-22T10:02:23.16+00:00

    Hi Andrew Yanni,

    Welcome to the Microsoft Q&A forum, and thank you for posting your query here!

    If you are looking for a solution specific to a particular storage account, here’s an approach to consider.

    You can append data to a blob by creating an append blob. Append blobs are made up of blocks, like block blobs, but are optimized for append operations, which makes them ideal for scenarios such as logging data from virtual machines.
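
    For illustration, here is a minimal Python sketch (not part of the original answer) of writing log lines to an append blob with the azure-storage-blob SDK; the connection string, container name, and blob name are placeholder assumptions:

    ```python
    from azure.storage.blob import BlobServiceClient

    # Placeholder: substitute your own storage account connection string.
    CONN_STR = "<storage-account-connection-string>"

    service = BlobServiceClient.from_connection_string(CONN_STR)
    container = service.get_container_client("diagnostic-logs")  # assumed name
    blob = container.get_blob_client("app.log")                  # assumed name

    # Create the append blob once; later writes only add blocks, so a
    # tailing reader sees the same blob grow instead of being replaced.
    if not blob.exists():
        blob.create_append_blob()

    for line in ["first log entry\n", "second log entry\n"]:
        blob.append_block(line.encode("utf-8"))
    ```

    Because each append_block call grows the same blob in place, a reader that remembers the last offset it processed can tail the blob much like a file on a local disk.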

    Please also make sure the Change feed option is enabled on your storage account.
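
    Once the change feed is enabled, it can be polled to discover blobs that were created or updated since the last check, which helps a scraper notice the new blobs that diagnostic settings keep writing. Below is a hedged sketch using the separate azure-storage-blob-changefeed package; the event field names follow the documented change feed event schema, but treat them as assumptions to verify. Also note that change feed events typically surface within a few minutes, so this is near-real-time rather than instantaneous:

    ```python
    from datetime import datetime, timedelta, timezone

    from azure.storage.blob.changefeed import ChangeFeedClient

    # Placeholder: substitute your own storage account connection string.
    CONN_STR = "<storage-account-connection-string>"

    cf_client = ChangeFeedClient.from_connection_string(CONN_STR)

    # Poll for changes recorded over the last hour.
    start = datetime.now(timezone.utc) - timedelta(hours=1)
    for event in cf_client.list_changes(start_time=start):
        # Each event is a dict; "eventType" (e.g. BlobCreated) and
        # "subject" (the blob path) identify what changed.
        print(event["eventType"], event["subject"])
    ```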

    Note, however, that append blobs have some limitations:

    • Maximum size: an append blob can grow to a maximum of about 195 GiB (50,000 blocks × 4 MiB). This might be insufficient for some large-scale applications.
    • Block limit: each append blob can contain up to 50,000 blocks, with each block being up to 4 MiB in size.
    • No random write operations: append blobs are designed for append-only operations, meaning you cannot perform random read-write operations.
    • No tiering support: append blobs do not support tiering, so you cannot move them between different access tiers such as hot, cool, or archive.

    For more information, refer to the links below:
    https://learn.microsoft.com/en-us/rest/api/storageservices/understanding-block-blobs--append-blobs--and-page-blobs
    https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-change-feed?tabs=azure-portal
    https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-append



    Please do not forget to "Accept the answer" wherever the information provided helps you, as this can be beneficial to other community members.

    No comments
