
Azure Bot Service – MAX Automation and Monitoring Bot

[Image: "hello computer" movie scene]

I had a dream! A dream about talking and chatting with my computer and receiving a spoken or written answer to my question, much like this movie scene. What sounds like a dream could become reality for you, or maybe it already is in your company. In this blog post I will show you a solution I built for fun, to get some understanding of how things work under the hood in terms of Azure Bot Service, the Bot Framework, Azure Logic Apps, Cortana and other cool stuff Microsoft is providing. My first intention was to provide a detailed step-by-step guide on how I built the solution, but after some time I realized that technology changes so fast that concepts and ideas are more valuable than point-in-time solutions. I demoed the MAX bot at a couple of conferences and was asked to write about it, so here it is.

What is it all about? As you might know, I love monitoring and I love automation. Why? Well, monitoring touches all kinds of technologies, and I get the chance to learn about new technologies in depth. Automation is awesome because it connects technologies, can save you a lot of time and makes you feel like a programmer :). If we combine both disciplines, our job becomes even more fun. In short, my goal was to talk or chat with my computer, ask questions about monitoring, and have the bot return the information. What seems to be a very simple task can end up with some complexity.

From a high-level view we need the following components:

- a chat or voice channel such as Cortana or Skype
- the bot connector service and the bot code itself (Azure Bot Service / Bot Framework)
- Language Understanding Intelligent Service (LUIS) to interpret the questions
- Azure Logic Apps to query and return the data
- Azure Log Analytics and Azure Monitor as the monitoring backend
- Azure Service Bus for proactive messages

In this blog post I will cover some high-level concepts and some technical details which I think are necessary to understand the full picture. I will not cover any bot code, nor go into any bot design topics. As I mentioned, my goal is to share some ideas and give you a starting point for writing your own bot. To get an idea of how the bot works, we need to look at the architecture…

[Image: MAX bot architecture overview]

At first it looks like a complex bot (aka cloud-native application), but as soon as we take it apart, we will understand each piece.

On the left side there is either Cortana or a chat channel that receives the information we are asking for; this means either typing or speaking a question to the bot. Each of these channels is connected to the bot connector service, which handles the data (JSON) exchanged between the channel and the actual bot code. The following picture visualizes this part…

[Image: channels connected to the bot code via the bot connector service]
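To make this more concrete, here is a trimmed example of the kind of JSON activity the connector service relays between a channel and the bot code (the field values are illustrative, not taken from MAX):

{
  "type": "message",
  "id": "0001",
  "channelId": "skype",
  "serviceUrl": "https://smba.trafficmanager.net/apis",
  "from": { "id": "user1", "name": "user" },
  "conversation": { "id": "conv1" },
  "recipient": { "id": "max-bot" },
  "text": "What is CPU of computer prime?"
}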

The data entered via voice or text chat needs to be translated into a form we can handle programmatically. There is no way for the bot code itself to understand the meaning of a sentence like “Get me disk space of server01”. Therefore, we need to connect our bot to a service that can be trained to understand written or spoken language. In Azure, there are a couple of cognitive services available; the following picture gives you a brief impression of what is currently offered.

[Image: overview of Azure Cognitive Services]

For my MAX bot project I used the Language Understanding Intelligent Service (LUIS) to train a model and, finally, to make sense of my chat or voice data. How does it work? Let’s take an example utterance: “What is CPU of computer XYZ?”. The actual intent is to have the CPU performance of computer XYZ returned. To be able to use this intent programmatically, we need to follow a certain process and take a closer look at the sentence…

[Image: breaking the utterance down into intent and entities]

…first we need to add an intent. As soon as we have defined the intent, we train the utterances and mark entities like CPU and computer. Because there are different ways to ask for a piece of information, we need to provide as many different utterances as possible, identify each entity, and train the language understanding (LUIS) model again. The more we train LUIS, the better the results. Because there are different words for computer, like PC, server, system etc., you can use a phrase list, which treats these words as similar to computer. In addition, LUIS provides pre-built domains, intents and entities, which help cover difficult scenarios such as extracting dates and times. For example, there are many ways to format a date / time in a message, and pre-built entities help LUIS understand the date correctly.
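If you want to see what LUIS returns before wiring it into the bot, you can call the published endpoint directly. Here is a minimal sketch against the LUIS v2 REST endpoint; the app ID, key and region are placeholders you would replace with your own values:

import requests

# placeholders - replace with your own LUIS app ID, endpoint key and region
LUIS_APP_ID = "<app-id>"
LUIS_KEY = "<endpoint-key>"
LUIS_REGION = "westeurope"

def query_luis(utterance):
    # call the published LUIS v2 endpoint and return the top intent plus entities
    url = f"https://{LUIS_REGION}.api.cognitive.microsoft.com/luis/v2.0/apps/{LUIS_APP_ID}"
    resp = requests.get(url, params={"q": utterance, "subscription-key": LUIS_KEY})
    resp.raise_for_status()
    result = resp.json()
    return result["topScoringIntent"]["intent"], result["entities"]

intent, entities = query_luis("What is CPU of computer prime?")
print(intent, entities)  # e.g. GetPerformanceComputer plus the labeled entities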

In order to get a better picture, let me show you the intent “GetPerformanceComputer”, which is trained using multiple utterances. In the screenshot below, we see the identified entities, prefixed with ent, such as entDisk and entComputerName; those entities contain the actual values. GetPerformanceComputer is a bit special because it uses a composite entity (compPerformanceData) which contains multiple single entities, and these need to be extracted in the bot code. So instead of creating multiple intents like GetMemoryComputer, GetCPUComputer and GetDiskComputer, each providing memory, CPU or disk entities, I covered everything in a single generic performance intent, GetPerformanceComputer, and packed all the performance entities into the composite entity type compPerformanceData…

[Image: the GetPerformanceComputer intent with labeled utterances]

…and the compPerformanceData entity looks like this…

[Image: the compPerformanceData composite entity]

The intent name GetPerformanceComputer is then mapped in the bot code to trigger the appropriate bot dialog.
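As a rough illustration of that mapping (a sketch, not the actual MAX bot code, which I am not covering here), the bot can dispatch on the intent name returned by LUIS and unpack the children of the composite entity along these lines:

def handle_get_performance(luis_result):
    # pull the single entities (entComputerName, entDisk, ...) out of the
    # compPerformanceData composite entity returned by LUIS
    entities = {}
    for composite in luis_result.get("compositeEntities", []):
        if composite["parentType"] == "compPerformanceData":
            for child in composite["children"]:
                entities[child["type"]] = child["value"]
    return entities

# map LUIS intent names to the dialog that should handle them
DIALOGS = {
    "GetPerformanceComputer": handle_get_performance,
    # further intents and their handlers go here
}

def dispatch(luis_result):
    intent = luis_result["topScoringIntent"]["intent"]
    handler = DIALOGS.get(intent)
    return handler(luis_result) if handler else None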

So far I have explained how LUIS works and how it is trained; I hope you now have an idea of what LUIS can do for you. The next step is to get the actual CPU performance data from Azure Log Analytics.

Let’s stick with our example and assume we want to know the CPU performance (% Processor Time counter) of a computer named “prime”. In this case the entities “CPU” and “prime” have been identified: “CPU” tells us that the user is asking for CPU performance, and the computer name tells us which computer to query. Within the bot code, an Azure Log Analytics query is constructed and passed to a generic Logic App. The Logic App has an HTTP request (webhook) trigger, and the Azure Log Analytics query in the webhook body is passed to the next action, which queries the workspace for the data…

[Image: Logic App with HTTP request trigger and Log Analytics query action]

…the response action returns a well-formatted sentence, as in this example…

{
  "value": [
    {
      "output": "prime has currently CPU % Processor Time of 37.8 percent. Last data received on 13-11-2018 09:38:46"
    }
  ]
}

…this response is processed by the bot and turned into speech or text chat within the bot code. I call this first example an “active” way of interacting with the bot, meaning we explicitly ask for some information.
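Pieced together, the “active” path can be sketched like this. The Logic App URL, the name of the webhook body property and the exact query are assumptions on my side; the shape of the response matches the example above:

import requests

# placeholder - the URL generated by the Logic App HTTP request trigger
LOGIC_APP_URL = "https://prod-00.westeurope.logic.azure.com/workflows/..."

def get_cpu_performance(computer):
    # build the Log Analytics query from the entities LUIS extracted
    query = (
        f'Perf | where Computer == "{computer}" '
        '| where ObjectName == "Processor" and CounterName == "% Processor Time" '
        '| summarize arg_max(TimeGenerated, CounterValue)'
    )
    # hand the query to the generic Logic App via its webhook trigger
    resp = requests.post(LOGIC_APP_URL, json={"query": query})
    resp.raise_for_status()
    # the Logic App response action returns {"value": [{"output": "..."}]}
    return resp.json()["value"][0]["output"]

print(get_cpu_performance("prime"))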

In addition to actively asking the bot questions, I built another, “proactive” path into the bot solution. This means the bot sends a proactive message to my chat window as soon as something unexpected happens. I used Azure Monitor to watch for a certain event, and if the threshold is breached, I get notified in the chat window. How did I do it? Well, I created an alert rule in Azure Monitor called “Malware Detected”…

[Image: the "Malware Detected" alert rule in Azure Monitor]

…this alert rule queries the Log Analytics workspace for a specific event ID, 1116, from the Microsoft-Windows-Windows Defender event log. The following KQL query looks a bit scary, but it was necessary to extract the data from the Windows event description in Azure Log Analytics: first we extract the event data (XML) out of the event description, then convert it to a string and parse the resulting JSON to get the needed values.

First, this is the raw Windows event data…

[Image: raw Windows event data in Azure Log Analytics]

…and if I run the query used by the Azure Monitor alert rule…

[Image: the alert rule query and its results]

…the output is nicely formatted.
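I am not reproducing the exact query here, but the pattern described above looks roughly like this in KQL; treat the extend lines as a sketch, since the path into the Defender event XML depends on how the event is structured:

Event
| where EventLog == "Microsoft-Windows-Windows Defender/Operational"
| where EventID == 1116
// the event description carries the interesting fields as XML
| extend eventXml = parse_xml(EventData)
// convert the node to a string and parse it as JSON to reach the values
| extend eventJson = parse_json(tostring(eventXml))
| project TimeGenerated, Computer, eventJson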

The alert rule checks every 5 minutes whether such an event exists and, if one is found, triggers an action group…

[Image: the action group configuration]

…the action group triggers a Logic App that has an HTTP request (webhook) trigger and an action that sends the body content of the webhook as an Azure Service Bus message…

[Image: Logic App sending the webhook body to an Azure Service Bus queue]
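The Logic App uses the built-in Service Bus connector for this step, but to make the data flow explicit, here is roughly what that action does, expressed with the azure-servicebus Python SDK (connection string and queue name are placeholders):

from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONN_STR = "<service-bus-connection-string>"
QUEUE = "max-alerts"  # hypothetical queue name

# the equivalent of the Logic App action: drop the webhook body onto the queue
with ServiceBusClient.from_connection_string(CONN_STR) as client:
    with client.get_queue_sender(queue_name=QUEUE) as sender:
        sender.send_messages(ServiceBusMessage('{"alert": "Malware Detected"}'))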

The body content of the webhook looks like this…

[Image: the webhook body content]

…when the Azure Monitor alert rule fires, a message is written to the Azure Service Bus queue. Every time the bot starts up, it subscribes to Azure Service Bus and listens to the queue for new messages. Subscribing to the Service Bus is handled within the bot code, where the message is formatted and sent to the chat client. While the bot is not running, the queue fills up, and the next time we start the bot it polls the messages in the queue and notifies us.
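The subscribing side in the bot can be sketched the same way: on startup it drains whatever accumulated in the queue and forwards each message to the chat client. The helper that posts the proactive message is hypothetical; the actual formatting happens in the bot code:

from azure.servicebus import ServiceBusClient

CONN_STR = "<service-bus-connection-string>"
QUEUE = "max-alerts"  # hypothetical queue name

with ServiceBusClient.from_connection_string(CONN_STR) as client:
    with client.get_queue_receiver(queue_name=QUEUE) as receiver:
        # poll for messages that queued up while the bot was offline
        for msg in receiver.receive_messages(max_message_count=10, max_wait_time=5):
            send_proactive_message(str(msg))  # hypothetical helper posting to the chat window
            receiver.complete_message(msg)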

Finally, once the bot is ready, we simply publish it to the Azure Bot Service and connect it to all necessary channels like Cortana, Skype and many others…

[Image: the bot connected to channels in Azure Bot Service]

That’s it, we have created our first bot :). I recorded two videos so you can see it in action. The first recording shows the chat experience, and in the second recording I talk to the bot.

Chat


[Video: chat demo]


Speech


[Video: speech demo]


Note:

In this post I touched on only a couple of topics and the ways I solved certain problems. I didn’t mention anything about designing a bot or what we should consider when building one (interfaces, chat flow, code etc.). There are plenty of good sources on the internet, some of which I am providing here.

It is really fun playing with these new technologies, and I like the direction Microsoft is going. There are easy ways to build AI and cognitive services into your own applications to improve and enhance the user experience. One special thingy I added to my bot was telling Chuck Norris jokes: as you can see in the overview architecture, the bot also queries the https://api.chucknorris.io API to get a joke if you ask MAX for one.
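Fetching a joke from that API is a one-liner, which makes it a nice smoke test for a new dialog:

import requests

# https://api.chucknorris.io returns a random joke as JSON in the "value" field
joke = requests.get("https://api.chucknorris.io/jokes/random").json()["value"]
print(joke)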
