Power BI — streaming dataflows and Azure Event Hubs
Today, I'll show you how to use streaming dataflows along with Azure Event Hubs.
First, let's create an Azure Event Hubs namespace.
The next step is to create an Event Hub instance inside the namespace.
Then create a SAS policy for the hub; we will need its connection string later.
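If you prefer scripting the portal steps above, they can be sketched with the azure-mgmt-eventhub management SDK. This is only a sketch under assumptions: the resource group "rt-demo-rg", namespace "rt-demo-ns", hub name "telemetry", and policy name "SendPolicy" are hypothetical placeholders, and the exact parameter shapes may vary between SDK versions.

```python
# Hypothetical resource names; replace with your own.
NAMESPACE_PARAMS = {
    "location": "westeurope",
    "sku": {"name": "Standard", "tier": "Standard"},
}
HUB_PARAMS = {"partition_count": 2, "message_retention_in_days": 1}
RULE_PARAMS = {"rights": ["Send"]}  # SAS policy that can only send events

def provision(subscription_id: str) -> str:
    """Create the namespace, the hub, and a Send SAS policy;
    return the policy's connection string."""
    # pip install azure-identity azure-mgmt-eventhub
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.eventhub import EventHubManagementClient

    client = EventHubManagementClient(DefaultAzureCredential(), subscription_id)
    # Namespace creation is a long-running operation, so wait for it.
    client.namespaces.begin_create_or_update(
        "rt-demo-rg", "rt-demo-ns", NAMESPACE_PARAMS).result()
    client.event_hubs.create_or_update(
        "rt-demo-rg", "rt-demo-ns", "telemetry", HUB_PARAMS)
    client.event_hubs.create_or_update_authorization_rule(
        "rt-demo-rg", "rt-demo-ns", "telemetry", "SendPolicy", RULE_PARAMS)
    keys = client.event_hubs.list_keys(
        "rt-demo-rg", "rt-demo-ns", "telemetry", "SendPolicy")
    return keys.primary_connection_string
```

The Standard tier is chosen here because the Basic tier only offers the $Default consumer group, which can be limiting for streaming consumers.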
Once our Event Hub is ready, we need to feed it with messages. To do this we can use the Power Automate "Send event" action…
…which sends messages to the Event Hub, one message every minute.
The event values are generated dynamically through an expression:
concat('{"date and time":"', string(utcNow()), '", "somevalue":', rand(0, 1000), '}')
As you see, a JSON message is sent in the following format:
{
"date and time": current date and time,
"somevalue": a random integer from 0 to 999 (rand is inclusive only at the lower bound)
}
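For readers who prefer code over Power Automate, here is a minimal Python sketch that builds the same payload and sends it with the azure-eventhub package. The connection string and hub name are placeholders, and the timestamp is emitted in ISO 8601 form, which differs slightly from utcNow()'s formatting:

```python
import json
import random
import time
from datetime import datetime, timezone

def make_event() -> str:
    """Build the same JSON payload the Power Automate expression produces."""
    payload = {
        "date and time": datetime.now(timezone.utc).isoformat(),
        # Like rand(0, 1000) in Power Automate: an integer in 0..999.
        "somevalue": random.randrange(0, 1000),
    }
    return json.dumps(payload)

def send_forever(connection_string: str, hub_name: str) -> None:
    """Send one event per minute, mirroring the Power Automate flow."""
    from azure.eventhub import EventData, EventHubProducerClient  # pip install azure-eventhub

    producer = EventHubProducerClient.from_connection_string(
        connection_string, eventhub_name=hub_name)
    with producer:
        while True:
            batch = producer.create_batch()
            batch.add(EventData(make_event()))
            producer.send_batch(batch)
            time.sleep(60)
```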
Head over to the Power BI Service. Create a Streaming Dataflow.
The first required step is to add a data source; in this case, it is the Event Hub. Provide the connection string of the previously created Event Hub. The second required step is to add an output table.
There are a few transformations available. In this example, I renamed the columns and filtered values with the condition ">200".
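To make those two transformations concrete, here is a plain-Python equivalent of what the dataflow does to each event. The target column names "EventTime" and "Value", like the sample rows, are hypothetical, since the renamed columns are not spelled out above:

```python
def transform(rows: list[dict]) -> list[dict]:
    """Rename the incoming columns, then keep only rows whose value > 200."""
    renamed = (
        {"EventTime": r["date and time"], "Value": r["somevalue"]}
        for r in rows
    )
    return [r for r in renamed if r["Value"] > 200]

# Hypothetical sample events in the format the flow sends.
sample = [
    {"date and time": "2023-05-01T10:00:00Z", "somevalue": 150},
    {"date and time": "2023-05-01T10:01:00Z", "somevalue": 431},
]
print(transform(sample))  # only the 431 row survives the > 200 filter
```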
You can also preview the live data as it arrives.
Save the dataflow and run it.
Now, in Power BI Desktop, you can connect through the Power Platform dataflows connector and select the table that contains the hot (streaming) data.
With Automatic Page Refresh enabled, the chart and the table refresh every 5 seconds and new events become visible almost instantly.