Managing Azure Blob Containers and Content with PowerShell

I do a lot of work in Azure writing and testing ARM templates.  Oftentimes I deal with a lot of parameters that need to access resources that already exist in Azure.  Things such as Azure Automation credentials, Key Vault objects, etc.  To streamline my testing process, I’ll often create an Azure Automation runbook to run the deployment template, pulling in the necessary objects as they’re needed.

Of course, this requires putting the template in a place that’s secure and that Azure Automation can easily get to.  That means uploading my templates to a storage location, and then creating a secure method of access.  This week, I’ll show you how to do the former – with the latter coming next week.  Then later on, I’ll walk you through how to create a runbook to access these resources and do your own test deployments!

First, let’s log in to our AzureRM instance in PowerShell and select our target subscription.  Once we’re logged in, we’ll grab our target resource group and storage account to work with:

$Subscription = 'LastWordInNerd'
Add-AzureRmAccount
$SubscrObject = Get-AzureRmSubscription -SubscriptionName $Subscription
Set-AzureRmContext -SubscriptionObject $SubscrObject

$ResourceGroupName = 'nrdcfgstore'
$StorageAccountName = 'nrdcfgstoreacct'

$StorAcct = Get-AzureRmStorageAccount -ResourceGroupName $ResourceGroupName -Name $StorageAccountName

Now that we have our storage account object, we’re going to retrieve the storage account key for use with the classic Azure storage commands:

$StorKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $StorAcct.ResourceGroupName -Name $StorAcct.StorageAccountName).where({$PSItem.KeyName -eq 'key1'})

I know it’s not the most intuitive thing to think of, but if you take a look, there are currently no AzureRM cmdlets for working with blob containers or their contents.  What we can do, however, is take the storage key we just retrieved and pass it to the appropriate classic Azure commands to get a storage context.  Here’s how:

Let’s go ahead and log in to our classic Azure (ASM) instance and select the same target subscription.  Once you’re logged in, you can use the New-AzureStorageContext cmdlet and pass in the storage key we just retrieved from AzureRM.  This allows us to use the AzureRM storage account in the ASM context.

Add-AzureAccount

$AzureSubscription = ((Get-AzureSubscription).where({$PSItem.SubscriptionName -eq $SubscrObject.Name}))
Select-AzureSubscription -SubscriptionName $AzureSubscription.SubscriptionName -Current

$StorContext = New-AzureStorageContext -StorageAccountName $StorAcct.StorageAccountName -StorageAccountKey $StorKey.Value

Now that we have a usable storage context, let’s create our blob container by using the New-AzureStorageContainer cmdlet with the -Context parameter to get at our storage account:

$ContainerName = 'json'
Try{
    $Container = Get-AzureStorageContainer -Name $ContainerName -Context $StorContext -ErrorAction Stop
}
Catch [System.Exception]{
    Write-Output ("The requested container doesn't exist. Creating container " + $ContainerName)
    $Container = New-AzureStorageContainer -Name $ContainerName -Context $StorContext -Permission Off
}

I decided to write this as a Try/Catch statement so that if the container doesn’t exist, the script will go ahead and create one for me.  It works great for engagements where I’m working with a new customer and haven’t yet configured the storage account the way I need it.  Also, if you notice, I’ve set the container’s Public Access level to Private by setting the -Permission parameter to Off.  Once again, a little counter-intuitive.
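
If you want to double-check that access setting from PowerShell rather than the portal, a quick query against the same storage context should do it.  This is just a minimal sketch using the classic Get-AzureStorageContainerAcl cmdlet and the variables from above:

# Confirm that public access on the container is actually Off (Private)
$Acl = Get-AzureStorageContainerAcl -Name $ContainerName -Context $StorContext
Write-Output ("Public access for " + $ContainerName + " is set to: " + $Acl.PublicAccess)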

Now, if our script created the container, we can take a look at the storage account in the portal and see that it’s available:

But we’ve also captured the object on creation, which you can see here:
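
If you’d rather inspect that object right in the console, something like this will show the interesting bits (assuming the standard properties on the container object returned by the Get-/New-AzureStorageContainer cmdlets):

# Take a quick look at the container object we captured
$Container | Format-List Name, PublicAccess, LastModified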

So now that we have our container, all we have to do is select our target files and upload them:

$FilesToUpload = Get-ChildItem -Path .\ -Filter *.json
ForEach ($File in $FilesToUpload){
    Set-AzureStorageBlobContent -Context $StorContext -Container $Container.Name -File $File.FullName -Force -Verbose
}

And we get the following return:
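
If you’d like to confirm the upload from the same session instead of bouncing out to the portal, listing the container’s contents works too.  A small sketch, again using the variables from above:

# List the blobs now sitting in the container to confirm the upload
Get-AzureStorageBlob -Container $Container.Name -Context $StorContext | Select-Object Name, Length, LastModified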

Now that we’ve uploaded our JSON template to a blob container, we can use it in automation.  But first, we’ll need to be able to generate Shared Access Signature (SAS) tokens on the fly so our automation can securely access the file.  And that’s what we’ll be talking about next week!
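
As a tiny preview (we’ll dig into the details next week), a SAS token for a single blob can be generated with the classic New-AzureStorageBlobSASToken cmdlet.  The blob name and expiry below are just placeholders:

# Hypothetical example: a read-only SAS token for one template, valid for two hours
$SasToken = New-AzureStorageBlobSASToken -Container $Container.Name -Blob 'azuredeploy.json' -Permission r -ExpiryTime (Get-Date).AddHours(2) -Context $StorContext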

You can find the script for this discussion on my GitHub.