Power BI — streaming dataflows and Azure Event Hubs

Today I'll show you how to use streaming dataflows together with Azure Event Hubs. To start, let's create an Event Hubs namespace.

The next step is to create an event hub inside the namespace.

Then create a SAS policy (shared access policy) for the hub.
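If you prefer scripting the setup, the same three resources can be created with the Azure CLI. This is a sketch only — the resource group, names, and region below are placeholders, not part of the walkthrough:

```shell
# Create the Event Hubs namespace (names and region are placeholders)
az eventhubs namespace create \
  --resource-group rg-demo \
  --name ehns-streaming-demo \
  --location westeurope \
  --sku Basic

# Create the event hub inside the namespace
az eventhubs eventhub create \
  --resource-group rg-demo \
  --namespace-name ehns-streaming-demo \
  --name eh-powerbi

# Create a SAS policy scoped to the hub with Send and Listen rights
az eventhubs eventhub authorization-rule create \
  --resource-group rg-demo \
  --namespace-name ehns-streaming-demo \
  --eventhub-name eh-powerbi \
  --name powerbi-policy \
  --rights Send Listen

# Print the connection string you will paste into the dataflow later
az eventhubs eventhub authorization-rule keys list \
  --resource-group rg-demo \
  --namespace-name ehns-streaming-demo \
  --eventhub-name eh-powerbi \
  --name powerbi-policy \
  --query primaryConnectionString -o tsv
```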

Once our event hub is ready, we need to feed it with messages. To do this we can use the Power Automate Send event action.

It sends a message to the event hub every minute.
The event values are generated dynamically with an expression:
concat('{"date and time":"', string(utcNow()), '", "somevalue":', rand(0, 1000), '}')
As you can see, a JSON message is sent in the following format:
{
  "date and time": current date and time,
  "somevalue": random value between 0 and 999
}
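The same payload can be produced outside Power Automate as well. Below is a minimal Python sketch; the azure-eventhub package, the EVENTHUB_CONNECTION_STR and EVENTHUB_NAME environment variables, and the function names are my assumptions, not part of the original flow:

```python
import json
import os
import random
from datetime import datetime, timezone

def build_event() -> str:
    """Build the same JSON payload as the Power Automate expression:
    a UTC timestamp plus a random value from 0 to 999."""
    payload = {
        "date and time": datetime.now(timezone.utc).isoformat(),
        "somevalue": random.randrange(0, 1000),
    }
    return json.dumps(payload)

def send_event(body: str) -> None:
    """Send one event to the hub. Requires the azure-eventhub package
    and a connection string in EVENTHUB_CONNECTION_STR (both are
    assumptions for this sketch)."""
    from azure.eventhub import EventData, EventHubProducerClient
    producer = EventHubProducerClient.from_connection_string(
        os.environ["EVENTHUB_CONNECTION_STR"],
        eventhub_name=os.environ.get("EVENTHUB_NAME", "eh-powerbi"),
    )
    with producer:
        batch = producer.create_batch()
        batch.add(EventData(body))
        producer.send_batch(batch)

if __name__ == "__main__":
    print(build_event())
```

Scheduling this in a loop (or a cron job) every minute would mirror the Power Automate recurrence.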
Head over to Power BI service. Create Streaming Dataflow.

The first mandatory step is to add a data source — in this case, Event Hub. Provide the connection string to the previously created event hub. The second mandatory step is to add an output table.
There are a few transformations available. In this example I've changed the column names and filtered the values with the condition ">200".
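The filter step simply drops events at or below the threshold. In plain code, the ">200" condition is equivalent to something like the following sketch (the sample events are made up for illustration):

```python
# Hypothetical sample events in the same shape as the hub messages
events = [
    {"date and time": "2022-01-01T10:00:00Z", "somevalue": 150},
    {"date and time": "2022-01-01T10:01:00Z", "somevalue": 420},
    {"date and time": "2022-01-01T10:02:00Z", "somevalue": 999},
]

# Keep only the rows matching the ">200" filter from the dataflow
filtered = [e for e in events if e["somevalue"] > 200]
```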
You can preview the incoming data live as well.

Save the dataflow and run it.

Now you can connect to it from Power BI using the Power Platform dataflows connector.

And select the table that contains the hot data.

With Automatic Page Refresh enabled, the chart and table refresh every 5 seconds and new events are visible almost instantly.

