How to Stream Data from Azure SQL CDC to Fabric Event Streaming and Load into Eventhouse KQL Database

Akshay Patel
2025-04-24T10:46:24.99+00:00

I want to set up a data pipeline where I can stream changes from an Azure SQL Database (CDC) and ingest that data into a Fabric Event Streaming pipeline. The goal is to load the streaming data into an Eventhouse KQL Database for real-time analytics.
The connection is already set up, and I can preview the data in the eventstream in JSON format.

What is the recommended approach to achieve this?

How should the data be structured or transformed to be ingested into the Eventhouse KQL Database?

Any guidance, documentation links, or best practices would be appreciated.

Azure SQL Database

Accepted answer
  Mallaiah Sangi, Microsoft External Staff
    2025-04-25T09:37:13.2266667+00:00

    Hi Akshay Patel,

    Please find below the configuration details and minimum requirements for the implementation.

    To complete the steps in this article, you need an Azure SQL database (in the same subscription as your events source) and an Event Hub or IoT Hub that carries your change events.
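    Since your source is Azure SQL change data capture, CDC must already be enabled on the database and on each table whose changes you want to stream. As a reference, a minimal T-SQL sketch (dbo.Orders is a placeholder table name):

    ```sql
    -- Enable CDC at the database level (requires db_owner)
    EXEC sys.sp_cdc_enable_db;

    -- Enable CDC on the source table (dbo.Orders is a placeholder)
    EXEC sys.sp_cdc_enable_table
        @source_schema = N'dbo',
        @source_name   = N'Orders',
        @role_name     = NULL; -- NULL means no gating role is required to read changes
    ```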

    Configure Stream Analytics integration

    1. Sign in to the Azure portal.
    2. Navigate to the database where you want to ingest your streaming data and select Stream analytics (preview).
    3. To start ingesting your streaming data into this database, select Create, give your streaming job a name, and then select Next: Input.
    4. Enter your events source details, and then select Next: Output.
       • Input type: Event Hub/IoT Hub
       • Input alias: Enter a name to identify your events source
       • Subscription: Same as the Azure SQL Database subscription
       • Event Hub namespace: Name of the namespace
       • Event Hub name: Name of the event hub within the selected namespace
       • Event Hub policy name (defaults to create new): Give a policy name
       • Event Hub consumer group (defaults to create new): Give a consumer group name
    5. Select the table into which you want to ingest your streaming data. Once done, select Create.
       • Username, Password: Enter your credentials for SQL Server authentication, then select Validate.
       • Table: Select Create new or Use existing. In this flow, select Create new; the table is created when you start the Stream Analytics job. If you want to use an existing table instead, see the sketch after this step.
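    If you select Use existing in the Table field, the destination table must already exist with columns matching the fields your query projects. A minimal T-SQL sketch, with placeholder names and types:

    ```sql
    -- Placeholder destination table; align the columns with the fields your query outputs
    CREATE TABLE dbo.OrderChanges
    (
        OrderId      INT           NOT NULL,
        Operation    NVARCHAR(10)  NOT NULL, -- e.g. insert/update/delete from the CDC feed
        EventTimeUtc DATETIME2     NOT NULL,
        Payload      NVARCHAR(MAX) NULL      -- raw JSON event, if you keep it
    );
    ```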
    6. A query page opens with the following details:
       • Input: the events source from which you'll ingest data.
       • Output: the table that will store the transformed data.
       • A sample SAQL query with a SELECT statement (see the sketch after this step).
       • Input preview: shows a snapshot of the latest incoming data from the input.
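    The generated query is a simple pass-through that you can adapt before starting the job. A minimal SAQL sketch, assuming an input alias of eventhub-input and an output alias of sql-output (both names are placeholders):

    ```sql
    -- Project every field of the incoming events into the output table
    SELECT
        *
    INTO
        [sql-output]      -- output alias (placeholder)
    FROM
        [eventhub-input]  -- input alias (placeholder)
    ```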
    7. After you're done authoring and testing the query, select Save query. Then select Start Stream Analytics job to start ingesting the transformed data into the SQL table. Once you finalize the remaining fields, start the job.
    8. Once you start the job, you'll see it running in the list, and you can take the following actions:
       • Start/stop the job: If the job is running, you can stop it; if it's stopped, you can start it.
       • Edit job: You can edit the query. For bigger changes (for example, adding more inputs or outputs), open the job in Stream Analytics. The Edit option is disabled while the job is running.
       • Preview output table: You can preview the table in the SQL query editor (see the verification query below).
       • Open in Stream Analytics: Open the job in Stream Analytics to view the job's monitoring and debugging details.
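    To confirm that rows are arriving, you can run a quick check in the SQL query editor; this assumes the placeholder dbo.OrderChanges table from the sketch above:

    ```sql
    -- Inspect the most recently ingested rows (dbo.OrderChanges is a placeholder)
    SELECT TOP (100) *
    FROM dbo.OrderChanges
    ORDER BY EventTimeUtc DESC;
    ```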

    Please refer to the official Microsoft documentation for more details and configuration steps:

    https://learn.microsoft.com/en-us/azure/azure-sql/database/stream-data-stream-analytics-integration?view=azuresql

    Hope this helps. Do let us know if you have any further queries.

    1 person found this answer helpful.

0 additional answers
