If you followed the Ignite 2017 announcements and Microsoft’s latest trends, you have probably figured out that serverless applications are getting more and more weight in Microsoft’s cloud strategy. Logic Apps is one of these awesome new iPaaS (integration Platform as a Service) services, which lets you reduce complexity, coding and also headaches :). Microsoft summarizes Logic Apps like this…
Logic Apps provide a way to simplify and implement scalable integrations and workflows in the cloud. It provides a visual designer to model and automate your process as a series of steps known as a workflow. There are many connectors across the cloud and on-premises to quickly integrate across services and protocols. A logic app begins with a trigger (like ‘When an account is added to Dynamics CRM’) and after firing can begin many combinations of actions, conversions, and condition logic.
I would like to share one example of such an integration between systems. The goal is to send data from Application Insights (AI) to Azure Log Analytics (ALA). As you may know, Microsoft moved the Log Analytics backend to Kusto, the same engine that powers Application Insights. There is a solution available for OMS which imports data from Application Insights into Azure Log Analytics. BUT there could be cases where you want to insert custom data from AI into ALA on a regular schedule, which is not possible through this connector. This is exactly what I want to show you in this blog post. It is not a real-world scenario, but it gives you a pretty good idea of how it works and how powerful it can be.
In Azure I created a Logic App which looks like this…
Let’s start with the trigger…
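In code view, the trigger from the screenshot corresponds to roughly the following definition (a sketch; the trigger name and exact nesting in your generated workflow may differ):

```json
"triggers": {
    "Recurrence": {
        "type": "Recurrence",
        "recurrence": {
            "frequency": "Hour",
            "interval": 1
        }
    }
}
```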
This is a simple trigger (scheduler) which starts the workflow every hour; you can go as low as seconds. As soon as it starts, it triggers the next action, which runs a query against my Application Insights-monitored blog http://HellShell.com. First you need to connect to Application Insights…
…this is very well described here. After this configuration, you simply need to specify the query to run…
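The query itself might look something like this in Kusto (a minimal sketch of what is shown in the screenshot, matching the description below):

```kusto
requests
| where timestamp > ago(1h)
```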
…this returns all web requests not older than 1 hour from my web application. If I run the query itself, it returns a simple table…
The last action is actually the coolest one for FOO (friends of OMS) folks. As you probably know, Log Analytics has an HTTP Data Collector API which is used for injecting data into ALA. Up to now, you could use PowerShell or any other language to make use of this API. A few weeks ago, I discovered that there is also a Logic App action, which just needs JSON input and a name for the custom log…
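To show what the Logic App action saves you from writing, here is a minimal sketch of the same HTTP Data Collector API call in Python. The workspace ID, key and log type below are placeholders, not values from my setup:

```python
# Sketch of the Log Analytics HTTP Data Collector API call that the
# "Azure Log Analytics Data Collector" Logic App action performs for you.
import base64
import hashlib
import hmac
import json
import urllib.request
from datetime import datetime, timezone

def build_signature(workspace_id, shared_key, date, content_length):
    # The API authorizes each request with an HMAC-SHA256 over this
    # canonical string, keyed with the base64-decoded workspace key.
    string_to_hash = (f"POST\n{content_length}\napplication/json\n"
                      f"x-ms-date:{date}\n/api/logs")
    decoded_key = base64.b64decode(shared_key)
    hashed = hmac.new(decoded_key, string_to_hash.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(hashed).decode()}"

def post_data(workspace_id, shared_key, body, log_type):
    # Posts a JSON payload; it shows up in ALA as the custom log
    # "<log_type>_CL" after the usual ingestion delay.
    rfc1123date = datetime.now(timezone.utc).strftime(
        "%a, %d %b %Y %H:%M:%S GMT")
    signature = build_signature(workspace_id, shared_key,
                                rfc1123date, len(body))
    uri = (f"https://{workspace_id}.ods.opinsights.azure.com"
           "/api/logs?api-version=2016-04-01")
    req = urllib.request.Request(uri, data=body.encode("utf-8"), headers={
        "Content-Type": "application/json",
        "Authorization": signature,
        "Log-Type": log_type,
        "x-ms-date": rfc1123date,
    })
    return urllib.request.urlopen(req)

# Example payload (an array of flat JSON records, the same shape the
# Logic App action expects in its Request body):
payload = json.dumps([{"name": "GET /", "duration": 42}])
# post_data("<workspace-id>", "<primary-key>", payload, "AppInsightsRequests")
```

With the Logic App action, all of the signing and header handling above is done for you; you only provide the JSON body and the custom log name.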
…of course first you need to make a connection to your target workspace, like here…
The tricky part is to get the data for this action into the correct format (JSON). For simplicity I just add the output value from the Application Insights action to the JSON Request body of the Azure Log Analytics Data Collector action. If you try to accomplish this using the Logic App designer, it is a bit tricky to do. Instead we can switch to code view…
…you should see JSON code like in this screenshot. Note here @{body('Run_Analytics_query')?['value']}: the value output from the Application Insights action is passed straight into the Send Data action. This works without converting the output, because the Run Analytics query (AI) action already emits JSON…
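The Send Data action in code view looks roughly like this (a sketch; the connection name, action name and Log-Type value are placeholders and will differ in your workflow):

```json
"Send_Data": {
    "type": "ApiConnection",
    "inputs": {
        "host": {
            "connection": {
                "name": "@parameters('$connections')['azureloganalyticsdatacollector']['connectionId']"
            }
        },
        "method": "post",
        "path": "/api/logs",
        "headers": {
            "Log-Type": "AppInsightsRequests"
        },
        "body": "@{body('Run_Analytics_query')?['value']}"
    },
    "runAfter": {
        "Run_Analytics_query": [ "Succeeded" ]
    }
}
```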
Once you start the workflow you can inspect the actual data and you will see what is passed to the next step…
…and if we want to see the result in the “classic” OMS portal, there will be some data available…
So, as you can see, it is very easy to pass data from Application Insights to Azure Log Analytics. Keep in mind that in this example we send the full set of data to ALA on each run. This is probably not what you want; instead you could modify the query, add filter options or other data modification actions to the workflow to massage your data. I just wanted to show two things: 1) how you can pass data from AI to ALA, and 2) that there is an action to send data to Azure Log Analytics.