Optimizing Azure Inventory with PowerShell Azure Functions and Power BI

In a previous post I showed how to build an Azure inventory system using an Azure Automation account, an Azure Logic App, and SharePoint.

Drawbacks of that approach:

  1. The solution is cumbersome: it uses Azure PowerShell to query for the necessary information and takes up to 30 minutes to execute
  2. Neither Azure Logic Apps nor Azure Automation integrates well with source control, which makes the solution difficult to move around or manage with modern CI/CD tooling
  3. Using SharePoint-hosted Excel causes concurrency headaches: each item is inserted into Excel Online one by one, which causes occasional timeouts that must be handled with retry logic in the Logic App
  4. The resulting Excel file is a flat, non-interactive dataset, which is sub-optimal
  5. The report is always stale, since it reflects the state at the last successful run; if you run it weekly, you might be looking at week-old data

The alternative solution, which fixes the drawbacks above, is based on a completely different set of technologies. The Azure Logic App is replaced with an Azure Function, Excel is replaced with Power BI, and the Azure PowerShell calls are replaced with Azure Resource Graph.

Flow of this setup is as follows:

  1. An Azure PowerShell function is created with a system-assigned identity to query Azure Resource Graph
  2. The resulting JSON is ingested as a web source into a Power BI report

The code for the function and the Power BI file is available here. This is just a proof-of-concept setup, so you'd need to modify the query script as well as the BI dashboard to fit your needs. The current incarnation just outputs disk information.
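To illustrate the shape of such a function, here is a minimal sketch of a PowerShell HTTP-triggered function (run.ps1) that queries Azure Resource Graph for disk information and returns JSON. The query, binding names, and use of the Az.ResourceGraph module are assumptions based on a standard PowerShell function app, not the exact code from the repo.

```powershell
# Minimal sketch: HTTP-triggered PowerShell Azure Function that queries
# Azure Resource Graph and returns the result as JSON.
# Assumes Az.ResourceGraph is loaded and the function app's
# system-assigned identity has Reader access on the subscription.
using namespace System.Net

param($Request, $TriggerMetadata)

# Illustrative query; replace with whatever resource data you need.
$query = @"
Resources
| where type =~ 'microsoft.compute/disks'
| project name, resourceGroup, location, sizeGb = properties.diskSizeGB
"@

$result = Search-AzGraph -Query $query

# Return the rows as a JSON payload for Power BI to ingest.
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = ($result | ConvertTo-Json -Depth 5)
})
```

Because the function runs under the app's managed identity, no credentials need to be stored in code or app settings.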

Steps

Create a Function App. It shall be a PowerShell function on Windows

Create a system-assigned identity for the application and assign it read permissions on the subscription
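This can also be done from PowerShell; the sketch below enables the identity and grants it Reader at subscription scope. The resource group and app names are placeholders.

```powershell
# Hypothetical sketch: enable the system-assigned identity on the
# function app and grant it Reader on the current subscription.
# 'inventory-rg' and 'inventory-func' are placeholder names.
$app = Set-AzWebApp -ResourceGroupName 'inventory-rg' `
    -Name 'inventory-func' -AssignIdentity $true

New-AzRoleAssignment -ObjectId $app.Identity.PrincipalId `
    -RoleDefinitionName 'Reader' `
    -Scope "/subscriptions/$((Get-AzContext).Subscription.Id)"
```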

If you are using Azure Blueprints you can create this assignment in your management group

Deploy the function app from VS Code

Test the application; it shall return information from all subscriptions the identity has access to

Get the function URL, which will be called later by Power BI, by clicking “Get Function URL”
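You can sanity-check the URL from PowerShell before wiring it into Power BI; the URL and projected field names below are placeholders that depend on your function and query.

```powershell
# Hypothetical check: call the function URL (copied from the portal)
# and confirm it returns the inventory JSON that Power BI will ingest.
# The URL below is a placeholder; keep the function key out of source control.
$url = 'https://inventory-func.azurewebsites.net/api/Get-Inventory?code=<function-key>'

# Field names depend on the projection in your Resource Graph query.
Invoke-RestMethod -Uri $url | Format-Table name, resourceGroup, sizeGb
```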

Open the PBIX file from the same repo and click File / Options and settings / Data source settings

Click “Change source” and point it to the URL you copied in the step above

You shall see a report similar to the one below. It’s interactive, so you can click on a VM to see which disks it uses, or click on a location to filter to only the VMs from that specific location, etc.

Building a better Azure inventory system

The Azure portal provides a semi-usable way to inventory your VMs for various folks at your company, but it has significant limitations, namely:

  • Users will still need to be at least in a ReadOnly role on your subscription
  • The available information is not well structured (for example, you can not see the OS disk size for each machine in the list)
  • There is no historical information about the state of the environment, since what you see is a snapshot at the time of inventory

The solution below outputs the required information into an Excel file hosted in Office 365. The working pieces of the solution are:

  1. An Azure Logic App which runs on a recurring schedule
  2. An Azure Automation job which pulls the necessary information out of Azure and outputs JSON for the Logic App to consume
  3. An Office 365 workspace which holds the resulting Excel file written as part of the Logic App job

Automation Job

The Automation job consists of 2 runbooks: Get-AzureInventory, a graphical PowerShell runbook, which calls the child runbook Get-VMs.

The Get-AzureInventory runbook shown below is used to pull the Automation account credentials and log in to the target subscription. Download this file (https://gist.github.com/artisticcheese/846f509618cb7bd8754bf705732cba64) and import it into the Automation account. This runbook calls a child runbook named Get-VMs (https://gist.github.com/artisticcheese/89421f49819cecc153e3b33a9c388d6e); download and import it into the Automation account as well.

The output of this job is a JSON file which is consumed downstream by the Azure Logic App and sent to Excel Online.

Example of output is below.

[
    {
        "VMName":  "WinVM",
        "Location":  "canadacentral",
        "ResourceGroup":  "TEST",
        "OSType":  "Windows",
        "PowerState":  "VM running",
        "BootDiagnostics":  null,
        "OSDiskSizeGB":  127,
        "NumberOfDataDisks":  2,
        "Offer":  "WindowsServer",
        "Publisher":  "MicrosoftWindowsServer",
        "VMSize":  "Standard_DS1_v2",
        "DataDisksSize":  "10|10",
        "VnetName":  "Test-vnet",
        "Subnet":  "default",
        "privateIPs":  "10.0.0.4",
        "publicIPs":  "13.71.162.184",
        "EnvironmentTag":  "Production",
        "VMCores":  1,
        "VMmemory":  3.5
    },
    {
        "VMName":  "LinuxVM",
        "Location":  "westus",
        "ResourceGroup":  "TEST",
        "OSType":  "Linux",
        "PowerState":  "VM running",
        "BootDiagnostics":  null,
        "OSDiskSizeGB":  30,
        "NumberOfDataDisks":  0,
        "Offer":  "UbuntuServer",
        "Publisher":  "Canonical",
        "VMSize":  "Standard_B1ms",
        "DataDisksSize":  "",
        "VnetName":  "Testvnet646",
        "Subnet":  "default",
        "privateIPs":  "10.0.1.4",
        "publicIPs":  "23.101.204.1",
        "EnvironmentTag":  "Development",
        "VMCores":  1,
        "VMmemory":  2
    }
]

You can modify the Get-VMs script to your liking. For example, the current script looks for 4 specific tags and outputs them into the JSON, which might be different in your environment. The same goes for the custom output of data disk sizes, etc.
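As a sketch of how one record in the output above might be assembled, the fragment below builds a single inventory object from Az cmdlets. The property names mirror the sample JSON, but the 'Environment' tag name and the resource group/VM names are illustrative assumptions, not the actual Get-VMs code.

```powershell
# Hypothetical sketch of assembling one inventory record like those in
# the sample JSON above. Resource group, VM name, and the 'Environment'
# tag are placeholders; adapt them to your own environment.
$vm     = Get-AzVM -ResourceGroupName 'TEST' -Name 'WinVM'
$status = Get-AzVM -ResourceGroupName 'TEST' -Name 'WinVM' -Status
$size   = Get-AzVMSize -Location $vm.Location |
          Where-Object Name -eq $vm.HardwareProfile.VmSize

[pscustomobject]@{
    VMName            = $vm.Name
    Location          = $vm.Location
    ResourceGroup     = $vm.ResourceGroupName
    OSType            = [string]$vm.StorageProfile.OsDisk.OsType
    PowerState        = ($status.Statuses |
                         Where-Object Code -like 'PowerState/*').DisplayStatus
    OSDiskSizeGB      = $vm.StorageProfile.OsDisk.DiskSizeGB
    NumberOfDataDisks = $vm.StorageProfile.DataDisks.Count
    DataDisksSize     = ($vm.StorageProfile.DataDisks.DiskSizeGB -join '|')
    VMSize            = $vm.HardwareProfile.VmSize
    EnvironmentTag    = $vm.Tags['Environment']
    VMCores           = $size.NumberOfCores
    VMmemory          = $size.MemoryInMB / 1024
} | ConvertTo-Json
```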

Azure Logic App

The Azure Logic App automates the entire process of pulling information out of Azure via the Automation runbook job, massaging it, and outputting it into Excel Online. The steps are shown below.

App consists of following major steps

  1. Recurrent execution
  2. Instantiating the Automation job and getting its results as JSON
  3. Copying the Excel template file and populating it from the JSON obtained in step 2
  4. Copying the file into a History folder for historical record

Create the following folder structure in SharePoint Online. You can get the Excel file here (https://github.com/artisticcheese/artisticcheesecontainer/blob/master/docs/inventory.xlsx)

Add a recurrence step with whatever recurrence you desire

Add the Azure Automation job you created in the previous step

Add a Get Job Output step and pass it the JobID from the previous step

Add a Parse JSON step, which converts the output of the Automation job into a Logic App artifact. Paste the following JSON into the schema (https://gist.github.com/artisticcheese/a150555a8192a8193ff8c71c9e77543e)
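The full schema is in the gist linked above; for illustration only, a heavily trimmed fragment covering two of the sample fields would have this shape (the real schema lists every field from the Automation job output):

```json
{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "VMName":       { "type": "string" },
            "OSDiskSizeGB": { "type": "integer" }
        }
    }
}
```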

Add a Copy File SharePoint task to copy the template file into the root folder

Add a foreach block and map the Excel fields to the results of the Parse JSON step

Add a step to copy the file to the History folder for historical record

If everything was done correctly, then running the Azure Logic App manually will populate the Excel file with information from Azure

Proper Azure policy to verify Azure Hybrid Benefit is enabled

Azure Policy allows Azure admins to enforce or verify how Azure resources are deployed in an environment. It relies on an Azure Policy definition file written in JSON, which stipulates what conditions a resource shall adhere to in order to pass or fail the policy, and what effect it will have on the resource (deny or audit).

This is a very good way to prevent certain things before they happen (like deploying resources in unapproved locations), which is not possible to accomplish with plain RBAC controls.

For this specific case I needed to ensure that all VMs in the subscription have Azure Hybrid Benefit enabled, which saves up to 40% of Windows licensing costs if the company already has an EA agreement with Microsoft, so it does not pay double licensing costs for the OS.

Searching Microsoft samples actually yielded a result which worked fine for some time, until I was alerted that some machines were still passing the test despite the fact that they were not using Hybrid Benefit.

Looking at the policy, it becomes apparent what the issue is. The definition of the original policy is below, and it applies only to VMs created from Azure gallery Windows images. So if you created a VM through ASR or from a custom image, this policy will not apply. Another issue is that the policy does not apply to Windows client machines, if you happen to have those in your environment, since their license type is named differently.

"if": {
    "allOf": [
        {
            "field": "type",
            "in": [
                "Microsoft.Compute/virtualMachines",
                "Microsoft.Compute/VirtualMachineScaleSets"
            ]
        },
        {
            "field": "Microsoft.Compute/imagePublisher",
            "equals": "MicrosoftWindowsServer"
        },
        {
            "field": "Microsoft.Compute/imageOffer",
            "equals": "WindowsServer"
        },
        {
            "field": "Microsoft.Compute/imageSKU",
            "in": [
                "2008-R2-SP1",
                "2008-R2-SP1-smalldisk",
                "2012-Datacenter",
                "2012-Datacenter-smalldisk",
                "2012-R2-Datacenter",
                "2012-R2-Datacenter-smalldisk",
                "2016-Datacenter",
                "2016-Datacenter-Server-Core",
                "2016-Datacenter-Server-Core-smalldisk",
                "2016-Datacenter-smalldisk",
                "2016-Datacenter-with-Containers",
                "2016-Datacenter-with-RDSH"
            ]
        },
        {
            "field": "Microsoft.Compute/licenseType",
            "notEquals": "Windows_Server"
        }
    ]
},

To find out what aliases are available for a specific resource, you can execute the following PowerShell statement: (Get-AzPolicyAlias -NamespaceMatch 'Microsoft.Compute').Aliases. It lists all aliases available in the Microsoft.Compute resource provider. To identify Windows-only boxes, we can narrow down the search to the osType property.

PS Azure:\> (Get-AzPolicyAlias -NamespaceMatch 'Microsoft.Compute').Aliases  | where Name -match ostype | select Name

Name
----
Microsoft.Compute/virtualMachines/storageProfile.osDisk.osType
Microsoft.Compute/virtualMachineScaleSets/virtualMachineProfile.storageProfile.osDisk.osType
Microsoft.Compute/virtualMachineScaleSets/virtualMachines/storageProfile.osDisk.osType
Microsoft.Compute/galleries/images/osType
Microsoft.Compute/disks/osType
Microsoft.Compute/snapshots/osType
Microsoft.Compute/images/storageProfile.osDisk.osType

From the list you can clearly see that Microsoft.Compute/virtualMachines/storageProfile.osDisk.osType is the proper alias to use for matching. To verify aliases on existing VMs, you can use Azure Resource Graph in the preview portal by executing the following query

where type=~'Microsoft.Compute/virtualMachines'
| where name =~ 'pr7-material'
| project aliases

So the resulting Azure Policy definition rule shall look like below, which will identify all Windows VMs in your subscription regardless of how they were created, and ensure Hybrid Benefit is enabled on them.

    "policyRule": {
        "if": {
            "allOf": [
                {
                    "field": "type",
                    "equals": "Microsoft.Compute/virtualMachines"
                },
                {
                    "field": "Microsoft.Compute/virtualMachines/storageProfile.osDisk.osType",
                    "equals": "Windows"
                },
                {
                    "allOf": [
                        {
                            "field": "Microsoft.Compute/licenseType",
                            "notEquals": "Windows_Server"
                        },
                        {
                            "field": "Microsoft.Compute/licenseType",
                            "notEquals": "Windows_Client"
                        }
                    ]
                }
            ]
        },
        "then": {
            "effect": "audit"
        }
    },
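Once the definition is saved to a file, it can be created and assigned with the Az PowerShell cmdlets; the sketch below assigns it at subscription scope. The definition name, display name, and file path are placeholders.

```powershell
# Hypothetical sketch: create the custom policy definition from a local
# JSON file and assign it at subscription scope. Names and the file
# path are placeholders for your own environment.
$definition = New-AzPolicyDefinition -Name 'audit-hybrid-benefit' `
    -DisplayName 'Audit Azure Hybrid Benefit on Windows VMs' `
    -Policy '.\hybrid-benefit-policy.json'

New-AzPolicyAssignment -Name 'audit-hybrid-benefit' `
    -PolicyDefinition $definition `
    -Scope "/subscriptions/$((Get-AzContext).Subscription.Id)"
```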

The result below shows a Windows box that was not created from the gallery failing the audit as a result