Managing Azure Blob Containers and Content with PowerShell

I do a lot of work in Azure writing and testing ARM templates.  Oftentimes I deal with a lot of parameters that need to reference resources already existing in Azure – things such as Azure Automation credentials, Key Vault objects, etc.  To streamline my testing process, I'll often create an Azure runbook to run the deployment template, pulling in the necessary objects as they're needed.

Of course, this requires putting the template somewhere secure that Azure Automation can still easily reach.  This means uploading my templates to a location and then creating a secure method of access.  This week, I'll show you how to do the former – with the latter coming next week.  Then later on, I'll walk you through creating a runbook to access these resources and run your own test deployments!

First, let's log in to our AzureRM instance in PowerShell and select our target subscription.  Once we're done, we'll grab the target resource group and storage account we want to play with:

$Subscription = 'LastWordInNerd'
Add-AzureRmAccount
$SubscrObject = Get-AzureRmSubscription -SubscriptionName $Subscription
Set-AzureRmContext -SubscriptionObject $SubscrObject

$ResourceGroupName = 'nrdcfgstore'
$StorageAccountName = 'nrdcfgstoreacct'

$StorAcct = Get-AzureRmStorageAccount -ResourceGroupName $ResourceGroupName -Name $StorageAccountName
Now that we have our storage account object, we're going to retrieve the storage account key for use with the classic Azure storage commands.

$StorKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $StorAcct.ResourceGroupName -Name $StorAcct.StorageAccountName).where({$PSItem.KeyName -eq 'key1'})

I know it's not the most intuitive thing to think of, but if you take a look, there are currently no AzureRM cmdlets for accessing blob stores.  What we can do, however, is take the storage key we've retrieved and pass it to the appropriate classic Azure commands to get a storage context.  Here's how:

Let's go ahead and log in to our Azure classic instance and select the same target subscription.  Once you're logged in, you can use the New-AzureStorageContext cmdlet and pass in the storage key we just retrieved from AzureRM.  This allows us to use the AzureRM storage account in the ASM context.

Add-AzureAccount

$AzureSubscription = ((Get-AzureSubscription).where({$PSItem.SubscriptionName -eq $SubscrObject.Name}))
Select-AzureSubscription -SubscriptionName $AzureSubscription.SubscriptionName -Current

$StorContext = New-AzureStorageContext -StorageAccountName $StorAcct.StorageAccountName -StorageAccountKey $StorKey.Value

Now that we have a usable storage context, let's create our blob container by using the New-AzureStorageContainer cmdlet with the -Context parameter to get at our storage account:

$ContainerName = 'json'
Try{
    $Container = Get-AzureStorageContainer -Name $ContainerName -Context $StorContext -ErrorAction Stop
}
Catch [System.Exception]{
    Write-Output ("The requested container doesn't exist. Creating container " + $ContainerName)
    $Container = New-AzureStorageContainer -Name $ContainerName -Context $StorContext -Permission Off
}

I decided to write this as a Try/Catch statement so that if the container doesn't exist, it will go ahead and create one for me.  It works great for engagements where I'm working with a new customer and forget to configure the storage account the way I need it.  Also, if you notice, I've set the public access to Private by setting the Permission parameter to Off.  Once again, a little counter-intuitive.
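
If you'd rather verify that access level from PowerShell instead of the portal, the container object exposes it.  Here's a quick check – a sketch assuming the classic Azure.Storage module's object shape, which can vary a bit between versions:

# Confirm the container's public access level is Off (i.e., private)
(Get-AzureStorageContainer -Name $ContainerName -Context $StorContext).Permission.PublicAccess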

Now, if our script created the container, we can look at the storage account in the portal and see that it's available.

But we’ve also captured the object on creation, which you can see here:
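
If you're following along in a console, you can inspect that object directly.  Name and LastModified are the properties I'd expect from the classic module's default view, so treat this as a sketch:

# Inspect the container object we captured at creation
$Container | Select-Object Name, LastModified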

So now that we have our container, all we have to do is select our target and upload the file:

$FilesToUpload = Get-ChildItem -Path .\ -Filter *.json
ForEach ($File in $FilesToUpload){
    Set-AzureStorageBlobContent -Context $StorContext -Container $Container.Name -File $File.FullName -Force -Verbose
}

And we get the following return:
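
If you want to double-check the upload beyond the verbose output, you can also list the blobs now sitting in the container:

# List the blobs in the container to confirm the upload
Get-AzureStorageBlob -Container $Container.Name -Context $StorContext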

Now that we've uploaded our JSON template to a blob store, we can use it in automation.  But first, we'll need to be able to generate Shared Access Signature (SAS) tokens on the fly so our automation can securely access the file.  Which is what we'll be talking about next week!
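
As a sneak peek at where we're headed, here's a minimal sketch using the classic New-AzureStorageBlobSASToken cmdlet – the blob name azuredeploy.json is just a stand-in for one of your uploaded templates:

# Generate a read-only SAS token for a single blob, valid for one hour
New-AzureStorageBlobSASToken -Container $Container.Name -Blob 'azuredeploy.json' -Permission r -ExpiryTime ((Get-Date).AddHours(1)) -Context $StorContext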

You can find the script for this discussion on my GitHub.

PowerShell – Strings Are Objects Too!

I find that as I become more comfortable with my new skills as a PowerShell junkie, I enjoy answering questions more and more.  Oftentimes, I find questions on various forums that challenge my knowledge and skill level to become even more adept at PowerShell.  Recently, I came across a task that not only challenged my skills, but also reinforced the need to break away from certain habits and concepts that I’ve been using over the years with my scripts.  This week’s lesson: Strings are not fixed data!

The challenge was an interesting one:  You have a CSV file with a bunch of names.  Some include First, Middle, and Last – some don’t.  Retrieve the first letter of the first name and the entire last name to generate an email address.

Easy enough if your CSV consists of three columns.  First we’ll use Import-CSV and see what we’ve got here.

Import-CSV 'C:\scripts\email.csv'


Ah.  Looks like the file has no headers.  Easy enough of a fix:

Import-Csv 'C:\scripts\email.csv' -Header First,Middle,Last

Now that looks a lot better!  So we've got our data.  Let's go ahead and break this thing down, starting with the first name.

# $Name here is a single row from the imported CSV (e.g., inside ForEach ($Name in $Names))
$FName = $Name.First

This will return the name in the First column only.  Now we'll use the substring method to extract the first character from the string.  The two arguments to substring mark your starting point (0, the very beginning of the string) and how many characters you want (just the first one).

$First = $FName.substring(0,1)

So we'll just run our script real quick here and…

There we go!  Part one accomplished!  The last name and email parts are pretty easy.  Just add in these lines:

$Last = $Name.Last
$Email = $First + $Last + '@company.com'

And you get…

And there you have it!
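
To see it end to end, here's the whole three-column version assembled into one runnable piece – a minimal sketch, since the loop structure is only implied by the fragments above, and @company.com is just the example domain:

$Names = Import-Csv 'C:\scripts\email.csv' -Header First,Middle,Last

ForEach ($Name in $Names){
    $First = $Name.First.substring(0,1)
    $Last = $Name.Last
    $Email = $First + $Last + '@company.com'
    Write-Output $Email
}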

But what if the full names are in a single column only?  Well, that'll take a slightly different approach.  Fortunately, it doesn't involve completely scrapping the script.  For the first initial, you can still use the substring method to grab it, but you need an easy way to identify the last name.  So let's split the incoming data on the spaces:

$Last = $FName.Split(" ")

And we’ll get this return:

Now I started thinking, "Well that's great!  But how do I tell PowerShell which one is which!?"  It took a little time, but I remembered that with PowerShell, we're moving objects through the pipe, not fixed data.   And like any other object, I can filter them!

$Last = $FName.Split(" ") | Select-Object -Last 1

And voila!

Now you might decide, "You know, I think I'd like to have the middle name in there.  How are you going to do that, huh!?"  Well, I'll show you. Just:

$Middle = $FName.Split(" ") | Select-Object -First 2 | Select-Object -Last 1


Oh, but look.  Some of the strings have a middle name instead of just an initial.  We can easily remedy this by feeding our filtered object into a substring like we did with the first name:

$Middle = ($FName.Split(" ") | Select-Object -First 2 | Select-Object -Last 1).substring(0,1)

And now we get just the middle initial.

So then we put this all together and update our Email string…

Success!  Now we can start issuing emails from that list that HR sent us without having to do any manual formatting.  And we also reinforce the lesson that everything, including strings, is not fixed data, but objects that can be moved, and manipulated, through the pipe.
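
And here's the single-column version assembled the same way – a minimal sketch, where the FullName header, the way the middle initial folds into the address, and the domain are my own choices for illustration:

$Names = Import-Csv 'C:\scripts\email.csv' -Header FullName

ForEach ($Name in $Names){
    $FName = $Name.FullName
    $First = $FName.substring(0,1)
    # Second token of the full name; assumes every row actually has a middle name
    $Middle = ($FName.Split(" ") | Select-Object -First 2 | Select-Object -Last 1).substring(0,1)
    $Last = $FName.Split(" ") | Select-Object -Last 1
    $Email = $First + $Middle + $Last + '@company.com'
    Write-Output $Email
}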

Have a happy holiday!