Azure Logic Apps is a convenient way to add workflow logic, decision-making, and rules to your Azure solution.
Hundreds of connectors are available for all kinds of Microsoft and third-party resources, so you can connect almost anything and move data and insights from one place to another.
Azure Logic Apps also supports Azure Data Explorer, the queryable time-series data solution:
As you can see, you can execute both commands (for altering resources inside Azure Data Explorer) and table queries.
It even supports rendering charts!
In this post, we check out how to make use of these actions and charts so we can monitor and even control our Azure Data Explorer.
This post is the seventh part of this Azure Data Explorer blog series:
- Azure Data Explorer connector for IoT Hub with real-time ingest
- Dynamic routing of IoT Hub telemetry to Azure Data Explorer
- Using ADX table update policies projecting raw messages to target tables
- Azure Data Explorer connector for Blob storage (IoT Hub) files
- Azure Data Explorer as a data source for Azure Managed Grafana dashboards
- ADX Kusto plug-in for Azure Digital Twins history
- Azure Logic App actions for Azure Data Explorer
- Programmatically ingest data into Azure Data Explorer
- Streaming data analysis with Free Azure Data Explorer
- Test KQL table mappings inline
- Bonus challenge: Kusto detective agency
Let’s start by creating a new logic app. We make use of the Consumption plan type so we only ‘pay as we go’:
Note: Check the Logic Apps pricing if you want to use it at scale. Both connectors and actions are billed, so count the number of connectors and actions within your Logic App.
Then, we navigate to the Logic App (visual) designer and start building a Logic App triggered by an HTTP (POST) request, just so we can test the execution of the logic by hand.
As mentioned already, the Logic app offers five different Azure Data Explorer actions:
These actions are:
- Running a Control command
- Running a Control command, returning a chart
- Running a KQL query
- Running a KQL query, returning a chart
- Running a Show Control command
We will dive into these actions below.
Whatever action you choose, you first need to provide credentials to authorize the connection.
A sign-in, a service principal, or a managed identity can be used.
Here, I go for sign-in using an AAD account:
An AAD dialog is displayed where I need to provide my credentials.
Note: This ADX connection is remembered and reused by other actions.
Now, we are able to add our first Azure Data Explorer action.
Running a KQL query
I start with running a Kusto query within a Logic App:
I need to provide the cluster URL, the database name, and the query I want to execute:
The cluster URL is seen on the overview page of the ADX cluster.
The Database name and Query are not so hard to guess:
This query will give me a list of table rows from a certain table.
Note: Logic Apps offers the insertion of dynamic content, so the hardcoded table name could be replaced by a variable.
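As an illustration, a query along these lines returns a small set of rows for the action to pass on. The `Telemetry` table name and the row limit are placeholder assumptions, not taken from the original post:

```kusto
// Hypothetical table name; replace with a table from your own database.
Telemetry
| take 10
```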
I want to react to every item in the list so I add a For Each loop:
I then send an email based on every value (row) / current item:
When the design of the Logic App is completed, it results in one Logic app with two connections:
Note: The same connection can be used by multiple steps in the logic app.
I executed a post request for the REST endpoint of the HTTP trigger and the Logic app was executed successfully:
Note: all steps get a green flag.
Check out the details. We can even loop through both messages:
I actually got two emails:
The body of each message is in JSON format:
It matches what is seen in the ADX query explorer if I run the same query:
By adding a Parse JSON action, a schema can be provided. This way, the separate fields of the message become available to the actions following this ADX action.
We have proven we can query ADX.
But what about rendering a chart?
Running a KQL query and rendering a chart
There is another action available for rendering a chart:
Next to the Cluster URL, the Database name, and the query, we need to select the chart type:
Notice the query is just a regular query returning a result with rows; the rendering is defined by the selected chart type:
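A hedged example of such a query: an aggregation per time bin pairs well with a time chart type. The `Telemetry`, `temperature`, and `enqueuedTime` names are illustrative assumptions:

```kusto
// Average temperature per 5-minute bin; the action turns these rows into a chart.
Telemetry
| summarize avgTemperature = avg(temperature) by bin(enqueuedTime, 5m)
| order by enqueuedTime asc
```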
This same query executed in the Azure Data Explorer query editor shows this nice rendering:
So we can expect something similar…
The result of the action is described as HTML output and shows the output two ways: as one base64 output and in several parts (HTML body, related picture attachment, and the links set in HTML):
Again, we send an email with the outcome:
Here, I need to add this extra Compose action to expose the body HTML outputted by the Azure Data Explorer action:
The ‘composed’ HTML is put into the email body and the actual image is put in an attachment (see the attachment name and attachment byte array content).
When I save the actions and execute this Logic App, I actually receive the picture in my email box:
This is a great feature, my Outlook inbox is now turned into an IoT Dashboard 🙂
The two links in the email bring you directly to the ADX query editor where you can elaborate on the query either in a browser or using the Kusto Explorer which you can install on your laptop.
Run show control, run async control, and run control command returning a chart
What we have seen for a KQL query is also applicable to control commands for managing Azure Data Explorer.
Let’s start with the Show Control command.
Runs the show control command and returns the result as a set of rows which can be iterated over in the following connectors e.g .show table TableName policy caching.
Commands are of this form typically:
Note: The text of the request must start with a dot (.) character and the request must be sent to the management endpoint of the service.
The output can differ. In this case, this flow seems to work:
The output is several emails, all showing one table name:
Although this example is not really useful, this shows we can query the management endpoint of Azure Data Explorer too.
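The emails in this example each show a single table name, which suggests a command along these lines (a sketch; the exact command used is not shown in the post):

```kusto
// Lists all tables in the current database, one row per table.
.show tables
```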
Another example is seen here.
Next, let’s check out the run async control command action.
Runs control command in async mode and returns its ID, state and status on completion. Command can run for maximum 1 hour. The ‘async’ keyword is mandatory e.g .set-or-append async TargetTable <| SourceTable.
If you do not provide an async command, the action will fail:
"Message": "Only async control commands is supported for this action\r\nclientRequestId: XYZ"
Async commands only return the operation ID:
Although not tested, I expect the result can be retrieved by executing the request for the operation outcome using:
.show operations [OperationId]
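Put together, a minimal sketch of the async flow could look like this. `TargetTable` and `SourceTable` are hypothetical names, and the operation ID placeholder must be replaced with the ID returned by the first command:

```kusto
// Start the command asynchronously; only the OperationId is returned immediately.
.set-or-append async TargetTable <| SourceTable

// Later, poll the state (InProgress, Completed, Failed) using the returned ID.
.show operations <OperationId>
```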
Being able to retrieve the operation outcome afterward is an important part of the async solution.
Finally, there is also an action for control commands where the outcome is rendered in a chart:
Runs the control command and returns the result as a chart of your choice e.g .clear table TableName data.
Notice this call is not asynchronous.
One use case I think is great to implement this way is GDPR compliance (the right to be forgotten).
If data rows need to be purged from tables, you can now execute a Logic Apps workflow for purging that data.
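A hedged sketch of such a purge command, where the `Telemetry` table and the `deviceId` predicate are illustrative assumptions:

```kusto
// Permanently delete all rows for one device, e.g. to honor a GDPR request.
.purge table Telemetry records <| where deviceId == "device-to-forget"
```

Note that purge is a destructive, non-reversible operation, so a workflow triggering it should be guarded carefully.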
We have seen how Azure Logic apps support the use of the Azure Data Explorer connector.
This connector offers the retrieval of query output and even rendered charts.
We have also touched on the execution of control commands, both synchronous and asynchronous.
This new tooling opens up Azure Data Explorer for monitoring and management in an automated way.
This even works for Power Automate and Power Apps.