If you’ve been following along with our series, you now have a beautiful Power BI dashboard for your incidents, fed with real data pushed directly into Power BI from SCSM using PowerShell and the Power BI API.
That said, we are still missing one very important capability: updating the data regularly without the SCSM admin having to log into the system in the middle of the night to refresh it for the CIO currently in the UK. This is where Azure Automation comes in.
If you are just joining us and would like to view the other posts in the series, they can be found here:
Azure Automation vs Orchestrator
Before we run too far down this rabbit hole, I’d like to address a question I hear often from customers when we present this solution to them. Many of them already have Orchestrator in their environment and have concerns about supporting two separate systems, or simply question why Azure Automation would be better. Let me address both of these and a bit more now.
One of the biggest advantages of information technology as a whole, and of automation especially, is the ability to increase efficiency while reducing the overall effort required to maintain and improve operations. Cloud services have become so popular because of the enormous efficiency gain of a properly managed and maintained cloud platform. Most IT shops simply don’t have the budget to hire the broad range of technical expertise required to run a truly top-tier data center. The cloud provides all the benefits of a top-tier data center to companies of all sizes at a price they can afford, and they pay for only what they use.
Azure Automation really capitalizes on the idea of the cloud and provides easy access to automation infrastructure at an incredibly low cost. Azure also provides top-tier uptime and performance at a level many small organizations struggle to match, and then only at much greater cost.
That should sufficiently answer the second question, but how about the first one?
The short answer is that you don’t have to maintain two separate infrastructures, but if you wanted to for legacy reasons, the overhead of Azure Automation is so low that it would add little burden to your current team while dramatically increasing their capabilities.
With functionality like built-in and shared scheduling, and support for both visual and code-based PowerShell Workflow as well as standard PowerShell, you really can’t go wrong. Most recently we’ve been introduced to webhooks, which open up a whole new channel of connectivity, allowing data to flow from just about any system with access to the web into your automation stream. We’ve already demonstrated how to interface popular applications like Slack and Cireson’s Portal for Service Manager with Azure Automation (both of these will be appearing in coming blogs).
Hopefully this long-winded explanation has answered some of your questions as to why, and if not, please feel free to ask more questions below! For now, let’s move on to the how.
First things first: in order to use Azure Automation you need an Azure account with some form of subscription. In my particular case I’m using my MSDN subscription, which comes with all sorts of goodies. In many cases even the free tier of services will work at this time for some simple automation.
Once you’ve established your Azure account and you have access to the Azure Portal (https://portal.azure.com/), you can create an Azure Automation account for free. Look for this menu option to set up your Azure Automation account…
Select it and fill out the form to create your automation account…
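As an aside, if you prefer scripting to clicking through the portal, the same account can be created with the Azure PowerShell module. A minimal sketch, assuming you have the module installed; the resource group name, account name, and region are placeholder values for your own environment:

```powershell
# Sign in to your subscription interactively
Login-AzureRmAccount

# Create the Automation account (all names below are example values)
New-AzureRmAutomationAccount `
    -ResourceGroupName "MyResourceGroup" `
    -Name "MyAutomationAccount" `
    -Location "East US"
```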
Once you’ve created your account, you should see it pinned to your dashboard (assuming you didn’t uncheck that option). Select the tile and you should be taken to the Azure Automation blade, which will look like this…
This blade is the starting point for nearly all the work we will do involving Azure Automation. For today I’m only going to focus on the functionality we will be using and only as we use it.
Now at this point we can already create and run runbooks, but our runbooks require access to our on-prem data. In order to give them that access, we need to set up a Hybrid Worker in our environment (an existing Orchestrator runbook server is often great for this) as well as establish an Operational Insights account.
You can establish this from portal.azure.com using the following link…
Which will take you to the older portal where you can establish your OMS workspace…
Which will then create the Operational Insights / Operations Management Suite workspace for you, which will look like this…
From here you can click the Automation tile and configure it to connect to the Azure Automation account we set up above. I’m not going to walk through this as it’s quite straightforward. The next step is not quite as straightforward, though.
Now that we have our OMS workspace attached to our Azure Automation account, we need to make the connection to our on-premises server in order to set up the launch point for our runbooks. This will be the first server in our Hybrid Worker Group.
In order to do this we need to install the Microsoft Monitoring Agent on the server we want to act as the Hybrid Worker, so we go to the Settings tile…
Then click on connected sources…
From here, we can download the appropriate Windows agent, copy it onto the server we want to use as the Hybrid Worker, and install it. During the install it will ask you for the workspace ID and the primary key, which allow it to connect to your new OMS instance. NOTE: You must install the MMA client from this page, as the standard MMA installation does not provide the same capabilities.
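If you have more than one worker to stand up, the agent also supports a silent install where the workspace ID and key are passed on the command line. A sketch, assuming the 64-bit installer and placeholder values for the ID and key (copy both from the Connected Sources page):

```powershell
# Run from an elevated prompt on the target server.
# <workspace id> and <workspace key> are placeholders for your own values.
.\MMASetup-AMD64.exe /C:"setup.exe /qn ADD_OPINSIGHTS_WORKSPACE=1 OPINSIGHTS_WORKSPACE_ID=<workspace id> OPINSIGHTS_WORKSPACE_KEY=<workspace key> AcceptEndUserLicenseAgreement=1"
```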
Looking at the steps above, we still have one more documented step: we have to log into the new Hybrid Worker / monitored server and enable the machine as a Hybrid Worker. Well-documented instructions on how to do this can be found here:
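For reference, the registration itself boils down to a few lines run on the worker. A sketch, assuming the agent is installed; the version folder varies by agent release, and the group name, endpoint URL, and token are placeholders (the endpoint and primary access key come from your Automation account’s Keys blade):

```powershell
# Run on the server that will become the Hybrid Worker.
# The HybridRegistration module ships with the MMA agent install.
cd "C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomation\<version>\HybridRegistration"
Import-Module .\HybridRegistration.psd1

# Register this machine into a (new or existing) Hybrid Worker Group
Add-HybridRunbookWorker `
    -GroupName "MyHybridGroup" `
    -EndPoint "<automation account endpoint URL>" `
    -Token "<automation account primary access key>"
```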
Now we are almost there, but there is one additional step that is not well documented. We need to install the Azure PowerShell module onto the new server you have set up as your Hybrid Worker. There are instructions for how to do this here:
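On a server with PowerShellGet available, the gallery route is the quickest way to do this; otherwise use the Web Platform Installer from the instructions above. A sketch (the exact module name may differ by release):

```powershell
# Install the Azure PowerShell module from the PowerShell Gallery
Install-Module AzureRM

# Confirm the module landed
Get-Module -ListAvailable Azure*
```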
Once you’ve completed this process (a restart may be required), we are now ready to go back into the Azure Portal, open our Azure Automation account, and view our new Hybrid Worker Group…
With all of this in place, we can now run our script on our on-premises Hybrid Worker. With that in mind, you should be able to browse to your runbook, click Run, and select the Hybrid Worker…
With the options above selected, you can now click OK and it will queue the job to be picked up by the Hybrid Worker; within a few minutes we should see it picked up and run. Assuming we have no issues here, we are ready to schedule our job.
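The same job can be kicked off from the Azure PowerShell module, which is handy for testing. A sketch with placeholder names; the `-RunOn` parameter is what targets the Hybrid Worker Group instead of Azure:

```powershell
# Queue the runbook to run on the on-premises Hybrid Worker Group
Start-AzureRmAutomationRunbook `
    -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" `
    -Name "Update-PowerBIData" `
    -RunOn "MyHybridGroup"
```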
When we click on the Schedules tile in our runbook, we work through configuring a new schedule by following the image below and selecting our Hybrid Worker group…
First we set the properties for the schedule, setting it to run on the Hybrid Worker group, and then we configure the actual timing for the runbook…
Once this is complete, our runbook is scheduled, and that is really all it takes. We can now easily create additional runbooks and schedule them to run on our Hybrid Worker group as we see fit, so let’s dig into the creation of a runbook a bit more and talk about a few gotchas.
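The scheduling steps above can also be scripted. A sketch, again with placeholder names; it creates a daily schedule and then attaches the runbook to it, targeting the Hybrid Worker Group:

```powershell
# Create a schedule that fires daily, starting tomorrow at 3 AM
New-AzureRmAutomationSchedule `
    -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" `
    -Name "NightlyRefresh" `
    -StartTime (Get-Date "03:00").AddDays(1) `
    -DayInterval 1

# Link the runbook to the schedule, running on the Hybrid Worker Group
Register-AzureRmAutomationScheduledRunbook `
    -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" `
    -RunbookName "Update-PowerBIData" `
    -ScheduleName "NightlyRefresh" `
    -RunOn "MyHybridGroup"
```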
To create a new runbook we start from our Azure Automation account blade and click the runbooks tile just like we did when we were going to schedule a runbook. This time rather than selecting a runbook we are going to create a new one…
When you create a new runbook you have the option of selecting one of three types: PowerShell Workflow, PowerShell, or graphical. Each of these will be touched on in another post, but to keep it short, plain PowerShell is what we are using here. It allows us to run our native PowerShell scripts the same as we would on the server itself.
With that in mind, we already built and tested our PowerShell script in the first post of this series, so we are simply going to import that script into Azure using the import function. Once that is complete we will have an Azure version of our runbook that we can schedule and run.
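The import can likewise be done from the Azure PowerShell module. A sketch, assuming the script from the first post is saved locally; the path and names are placeholders, and `-Published` skips the separate publish step:

```powershell
# Import the existing .ps1 as a plain PowerShell runbook and publish it
Import-AzureRmAutomationRunbook `
    -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" `
    -Path "C:\Scripts\Update-PowerBIData.ps1" `
    -Type PowerShell `
    -Published
```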
In wrapping up our series, let’s look at the steps we’ve taken:
- Designed and tested a PowerShell script within our environment to collect and manipulate the data we needed.
- Uploaded the data to Power BI using the API.
- Created a report in Power BI and promoted panels to a dashboard.
- Established an Azure Automation account and configured a Hybrid Worker Group.
While these are the very high-level steps of this process, there are some things you should keep in mind throughout that can be major ‘gotchas’:
- When running a PowerShell script on a Hybrid Worker, you must install the Azure PowerShell module for it to run.
- When running a PowerShell script on a Hybrid Worker, you must make sure that the Hybrid Worker has the appropriate modules installed, or that within your script you use PSRemoting to connect to a server in your environment that does have the modules installed.
- If running a script natively in Azure, you will need to import any required modules into Azure directly.
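To illustrate the PSRemoting gotcha: rather than installing SCSM modules on the Hybrid Worker itself, the runbook can remote into a server that already has them. A hypothetical sketch; the management server name is a placeholder, and it assumes the SMLets module is installed there:

```powershell
# Run the SCSM query on the management server, which has SMLets installed,
# instead of installing the module on the Hybrid Worker
$incidents = Invoke-Command -ComputerName "SCSM-MGMT01" -ScriptBlock {
    Import-Module SMLets
    # Pull all incident work items from Service Manager
    Get-SCSMObject -Class (Get-SCSMClass -Name System.WorkItem.Incident$)
}

# $incidents is now available locally for shaping and upload to Power BI
$incidents.Count
```

Note that this requires PSRemoting to be enabled on the target server and that the account running the runbook has rights both to remote in and to query Service Manager.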
I hope this series helps you get started with Power BI and Azure Automation, and gives you some great ideas on what you can accomplish in terms of reporting on or automating your Service Manager environment using these new tools!