Holy Cow! I’m An Honorary Scripting Guy!

[Image: Honorary-Scripting-Guy_large]

So this happened today.

When Ed Wilson told me that I was going to become an Honorary Scripting Guy, I was absolutely floored.  For me, this is not just a career high, but a personal one as well.  One that I’ve dreamed of for almost a decade.

When I started my career in IT, and realized that living the hell that was the help desk wasn’t for me, I decided that I wanted to be something better.  I picked up two books from Barnes and Noble that would effectively change my career.  The first was a book on Systems Management Server 2003 – the predecessor to System Center Configuration Manager.  The other was Microsoft VBScript: Step by Step by none other than The Scripting Guy – Ed Wilson.

These two books set me on a career of configuration management, automation, and compliance.  Patch management was already a passion of mine (if you worked on a help desk in the Sasser/Slammer era, you’d understand), and it fit in perfectly with the other three, driving me towards a career of Patch Management, Security Compliance, and Automation.  As times and technology changed, it was only logical that I did too.  PowerShell and cloud technologies like Azure would become my guiding star; my new passion.

I credit Don Jones’ and Jason Helmick’s passion for the community with driving me to start down the path that I’ve been on for the last year and a half.  But it was Ed Wilson and Wally Mead who gave me the direction to get to that point.

2015 was a roller-coaster for me.  Receiving the Microsoft MVP in PowerShell was a fantastic and humbling achievement.  Then, at PowerShell Summit, I got to meet Ed Wilson for the first time.  He encouraged me to write some articles for ‘Hey, Scripting Guy!’.  Within a month, I had been given two of the highest honors that one could achieve in my line of work.  It was amazing!

Shortly after, a major personal event effectively derailed my involvement in the community as I struggled to refocus and right the ship.  I had to relocate, take on a new position, and start rebuilding.  Over the last couple of months, PowerShell and Azure have been not only my career focus, but my therapy as well.  I’ve always been big on puzzles and PowerShell and Azure have plenty of them to solve!

Teresa Wilson (@ScriptingWife and my adopted MVP mom!), and Sean Kearney (@energizedtech) have been instrumental in my revival with their kind words and guidance.  I felt renewed.  My writer’s block was lifted.  I was back in the game.  I could again share what I learned with the community.  Thanks to you both for your help.

Ed Wilson gave me the chance to give back in a big way; and when he referred to my series on DSC as ‘WAY COOL’ on The Scripting Guys Facebook page, I was again blown away.  Thank you, Ed, for again giving me an opportunity to give to the community.

I’m very fortunate to get to work with my career heroes on a regular basis.  Receiving an honor from one of them…well, words cannot describe how I feel.  I find myself again honored and humbled.

2016, look out!

PowerShell – Using Try Catch To Help With Decision Making In A Script

Recently, while working on my scripts for rolling out server deployments in Azure, I came across an interesting issue with a cmdlet throwing a terminating error when I wasn’t expecting one.

I was attempting to use the Get-AzureService cmdlet to verify if the cloud service that I specified already existed or not.  It was necessary to check its existence in case VMs had already been deployed to the service and we were adding machines to the pool.  If it didn’t exist, I would add script logic to create the cloud service before deploying the VMs.  So when I execute:

Get-AzureService -ServiceName 'LWINPRT'

I get the following terminating error:

[Screenshot: TryCatch1]

Now, I expected the service to not be there, because I haven’t created it, but I didn’t expect the cmdlet to terminate in a way that would stop the rest of the script from running.  Typically, when using a command to look for something, it doesn’t throw an error if it can’t find it.  For example, when I look to see if a VM exists in the service:

Get-AzureVM -ServiceName 'LWINPRT' -Name 'LWINPRT01'

I get the following return:

[Screenshot: TryCatch2]

While the error wasn’t expected, it’s certainly not a show-stopper.  We just have to rethink our approach.  So instead of a ForEach statement looking for a null value, why don’t we instead look at leveraging Try-Catch?

The Try-Catch-Finally blocks are what allow you to catch .NET exception errors in PowerShell, and they provide you with a means to alert the user and take a corrective action if needed.  You can read about them here, or there’s an exceptional article by Ashley McGlone on using them.  So we’ll go ahead and set this up to test.

    Try {
        Get-AzureService -ServiceName 'LWINPRT' -ErrorAction Stop 
        }#EndTry

    Catch [System.Exception]
        {
        Write-Host "An error occurred"
        }#EndCatch

And we execute…

[Screenshot: TryCatch3]

And we get a return!  But I don’t want an error in this case.  What I want is to create the cloud service if it doesn’t exist.  So let’s do this instead:

    Try {
        Get-AzureService -ServiceName 'LWINPRT' -ErrorAction Stop 
        }#EndTry

    Catch [System.Exception]
        {
        New-AzureService 'LWINPRT' -Location "West US"
        }#EndCatch

And we execute this…

[Screenshot: TryCatch4]

And now we get the service created.  And we can now see it in our Azure console:

[Screenshot: TryCatch5]

Now we can use this Try block to check whether a cloud service exists, knowing that if it can’t find the cloud service it will throw a terminating error.  And when it does, we can use the Catch block to create the missing service.  Decision made.
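The pattern isn’t Azure-specific, either; any cmdlet that honors -ErrorAction Stop can drive the same decision.  Here’s a self-contained sketch using Get-Item against a made-up path that (presumably) doesn’t exist on your machine:

```powershell
Try {
    # -ErrorAction Stop promotes the 'not found' error to a terminating one,
    # so control jumps into the Catch block.
    Get-Item -Path 'C:\DefinitelyNotARealFolder\nothing.txt' -ErrorAction Stop
    $FolderState = 'exists'
    }#EndTry

Catch [System.Exception]
    {
    # This is where you'd take your corrective action - e.g. New-Item.
    $FolderState = 'missing'
    }#EndCatch

$FolderState   # missing
```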

PowerShell – Automating Server Builds In Azure – Pt. 3 – Finish And Function

Over the last couple of weeks, we’ve taken our simple Azure VM creation script and expanded its versatility to support standardization in an automated fashion.  Now  we’re going to add some finishing touches to make this a function that includes some scalability and added functionality before we turn our eyes towards the DSC portion of our role-based deployments.

Of course, because of some of the functionality that we’ll be adding in the script, we’re going to be jettisoning that easy stuff that was New-AzureQuickVM in favor of New-AzureVM.  New-AzureVM offers us a lot more flexibility to build our VMs, including the ability to statically assign an IP address during the configuration.  So to wrap up this portion of our Azure exploration, we’ll be:

  • Adding logic to verify that your Azure account token is valid.
  • Checking the predefined subnets’ address pools for available addresses and assigning them to the machine
  • Adding logic to deploy multiple VMs for a given role simultaneously.
  • Adding in our comment-based help and building our script into a function.

First step, let’s add in our comment-based help.  Aside from it being a community best practice, it helps whoever you intend to use this script understand what it is you’ve created and how it works.  So in it goes.

[Screenshot: AzurePt3-1]
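In text form, a comment-based help block for this function would look something like the following (the descriptions here are illustrative, not the exact wording from the screenshot):

```powershell
Function New-AzureRoleDeployment {
<#
.SYNOPSIS
    Deploys one or more role-based VMs into an Azure cloud service.

.DESCRIPTION
    Builds VMs with a standardized naming convention, assigns them to a
    preconfigured subnet, and optionally places them in an availability set.

.PARAMETER Purpose
    The role of the server(s) to deploy.

.EXAMPLE
    New-AzureRoleDeployment -Purpose PRT -Quantity 3 -Availability -Verbose
#>
    # ...function body goes here...
}
```

Once the help block is in place, Get-Help New-AzureRoleDeployment returns it just like any built-in cmdlet’s help.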

We’ll go ahead and call this function New-AzureRoleDeployment.  Along with adding the block to set this as a function, we’re going to go ahead and leverage the Begin, Process, and End blocks as well.  The bulk of our previously existing script will reside in the Process block.  In the Begin block, I’m going to add some code to verify that there is an Azure Account configured for the PowerShell instance, and to execute the Add-AzureAccount cmdlet if no Azure account is signed in.  I’m using Get-AzureService to verify that the account’s authentication token is current, because Get-AzureAccount doesn’t readily give up that information.  Get-AzureService will throw an exception if it’s not current.

***NOTE*** – I was previously using Get-AzureSubscription, but found that this didn’t provide a consistent result.  I’ve updated the script to reflect the use of Get-AzureService instead.

    BEGIN {
        Write-Verbose "Verifying Azure account is logged in."
        Try{
            Get-AzureService -ErrorAction Stop
            }#EndTry

        Catch [System.Exception]{
            Add-AzureAccount
            }#EndCatch

    }#EndBEGIN

We’ll also add in a quick Write-Verbose message in the End block to state that the function finished.  We could omit the End block altogether, or use it to clean up our login with the Remove-AzureAccount cmdlet, but depending on how you’ve set up your Azure account on the system, you could wind up creating more work for yourself after running this function.  I’d recommend doing some reading up on how the Remove-AzureAccount cmdlet works before deciding if it’s something you want to add.

    END {

        Write-Verbose "New-Deployment tasks completed."

        }#EndEND
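Pulling the scaffolding together, the overall shape of the finished function is roughly this (the Process body, which holds the deployment logic from the earlier posts, is elided here):

```powershell
Function New-AzureRoleDeployment {
    [cmdletbinding()]
    Param (

        [Parameter(Mandatory=$True)]
        [ValidateSet('IIS','PSWA','PRT','DC')]
        [string]$Purpose,

        [Parameter(Mandatory=$True)]
        [int]$Quantity,

        [switch]$Availability

    )#End Param

    BEGIN {
        # Verify the Azure account token here (Try/Catch around Get-AzureService).
    }#EndBEGIN

    PROCESS {
        # Naming, subnet lookup, and VM creation logic lives here.
    }#EndPROCESS

    END {
        Write-Verbose "New-Deployment tasks completed."
    }#EndEND
}
```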

Now let’s do some modifications to the script to allow us to add a number of systems instead of a single system at a time.  This is going to require us to work with one of my favorite PowerShell features – math!  First, let’s update our parameter block with a Quantity parameter to input.

    Param (

        [Parameter(Mandatory=$True)]
        [ValidateSet('IIS','PSWA','PRT','DC')]
        [string]$Purpose,

        [Parameter(Mandatory=$True)]
        [int]$Quantity,

        [switch]$Availability

    )#End Param

Now, we’ll find our original code for creating the numbering portion of our server names.

$CountInstance = (Get-AzureVM -ServiceName $ConvServerName).where({$PSItem.InstanceName -like "*$ConvServerName*"}) | Measure-Object
$ServerNumber = ($CountInstance.Count + 1)
$NewServer = ($ConvServerName + ("{0:00}" -f $ServerNumber))
Write-Verbose "Server name $NewServer generated.  Executing VM creation."

We’re going to modify this code by changing the ServerNumber variable to FirstServer.  To make this easier, I use the Replace function in ISE (CTRL + H) to change all of the references to ServerNumber at once.  Next, we need to figure out the last server in the series.  Logically, you would think that this would just be the Quantity variable plus the FirstServer.  However, this doesn’t work exactly as expected.  For example, if we run:

$CountInstance = (Get-AzureVM -ServiceName 'LWINPRT').where({$PSItem.InstanceName -like "*LWINPRT*"}) | Measure-Object

We get a count of 0, because the cloud service doesn’t exist yet.  Now, so that our server number series doesn’t start at 0 (or at the highest number already allocated), we have to do this:

$FirstServer = ($CountInstance.Count + 1)

And if we execute our two lines of code, then the FirstServer variable will equal 1.  Now, we’ll go ahead and create a Quantity variable with the value of 3 and add the FirstServer and Quantity together.

$Quantity = 3
$LastServer = $FirstServer + ($Quantity)

If we check the LastServer variable now, we get a value of 4.  The problem comes up when we build the range:

$Range = $FirstServer..$LastServer

We get the following array of values in the Range variable.

[Screenshot: AzurePt3-4 – the Range variable contains 1, 2, 3, 4]
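The off-by-one can be reproduced with nothing but the math itself:

```powershell
$FirstServer = 1          # first free number in the series
$Quantity = 3             # how many servers we asked for

$LastServer = $FirstServer + $Quantity   # 4
$Range = $FirstServer..$LastServer       # 1, 2, 3, 4
$Range.Count                             # 4 server numbers for a requested quantity of 3
```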

So while we’ve requested 3 machines, our logic will tell PowerShell to build 4.  We rectify it by subtracting 1 from the Quantity, like so:

            $CountInstance = (Get-AzureVM -ServiceName $ConvServerName).where({$PSItem.InstanceName -like "*$ConvServerName*"}) | Measure-Object
            $FirstServer = ($CountInstance.Count + 1)
            $LastServer = $FirstServer + ($Quantity - 1)
            $Range = $FirstServer..$LastServer

[Screenshot: AzurePt3-5 – the Range variable now contains 1, 2, 3]

And now we have the appropriate range.  Next, we’re going to add a new switch block under our existing one to help set us up for assigning a static address in the subnet that the new systems will be assigned to.  So first let’s create the block with the output variable VNet:

            Switch ($Purpose){
            'IIS' {$VNet = '10.0.0.32'};
            'PSWA' {$VNet = '10.0.0.16'};
            'PRT' {$VNet = '10.0.0.48'};
            'DC' {$VNet = '10.0.0.64'}            

            }#Switch

Notice that I’m using the same Purpose parameter.  No sense in requiring our user to enter information needlessly when we can pull it from a single source.

Because of how we need to craft our command to build a VM with the New-AzureVM cmdlet (you’ll see in a minute), we can no longer use a single argument list as before.  So instead we’re going to take what we had before…

                #Standard arguments to build the VM  
                $AzureArgs = @{

                    'ServiceName' = $ConvServerName
                    'Name' = $NewServer
                    'InstanceSize' = 'Basic_A1'
                    'SubnetNames' = $_Purpose
                    'VNetName' = 'LWINerd'
                    'ImageName' = $BaseImage.ImageName
                    'AdminUserName' = 'LWINAdmin'
                    'Password' = 'b0b$yerUncl3'
                }#EndAzureArgs

…and we’re going to update it like so:

           #Standard arguments to build the VM  
           $InstanceSize = 'Basic_A1'
           $VNetName = 'LWINerd'
           $ImageName = $BaseImage.ImageName
           $AdminUserName = 'LWINAdmin'
           $Password = 'b0b$yerUncl3'

Now we’re going to use our VNet switch to test the subnet, check the available addresses, and get the first one available to assign.  Also, I’m adding in some Write-Verbose statements so I can verify that the variables that I need to have created are actually being generated by my script.

      $AvailableIP = Test-AzureStaticVNetIP -VNetName $VNetName -IPAddress $VNet
      $IPAddress = $AvailableIP.AvailableAddresses | Select-Object -First 1

      Write-Verbose "Subnet is $VNet"
      Write-Verbose "Image used will be $ImageName"
      Write-Verbose "IPAddress will be $IPAddress"
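Nothing about the Select-Object -First 1 trick is Azure-specific; against any collection it simply hands back the first element.  With a mocked-up address list (these addresses are just placeholders for what Test-AzureStaticVNetIP would return):

```powershell
# Hypothetical stand-in for $AvailableIP.AvailableAddresses
$AvailableAddresses = '10.0.0.36','10.0.0.37','10.0.0.38'
$IPAddress = $AvailableAddresses | Select-Object -First 1
$IPAddress   # 10.0.0.36
```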

As before, we’re going to use the presence of the Availability parameter to determine our path here.  The biggest change will be with our actual creation command.  Instead of a quick one-liner, we’ll instead be moving through the pipe, creating a new VM configuration object, adding the necessary information, assigning the static IP, and finally kicking off the build.

    If($Availability.IsPresent){

        Write-Verbose "Availability set requested.  Building VM with availability set configured."

        Try{
            Write-Verbose "Verifying if server name $NewServer exists in service $ConvServerName"
            $AzureService = Get-AzureVM -ServiceName $ConvServerName -Name $NewServer
            If (($AzureService.InstanceName) -ne $NewServer){

                New-AzureVMConfig -Name $NewServer -InstanceSize $InstanceSize -ImageName $ImageName -AvailabilitySetName $ConvServerName | 
                Add-AzureProvisioningConfig -Windows -AdminUsername $AdminUserName -Password $Password | 
                Set-AzureSubnet -SubnetNames $_Purpose | 
                Set-AzureStaticVNetIP -IPAddress $IPAddress | 
                New-AzureVM -ServiceName $ConvServerName -VNetName $VNetName
            }#EndIf

            Else {Write-Output "$NewServer already exists in the Azure service $ConvServerName"
            }#EndElse

        }#EndTry

        Catch [System.Exception]{$ErrorMsg = $Error | Select-Object -First 1
            Write-Verbose "VM Creation failed.  The error was $ErrorMsg"
        }#EndCatch

    }#EndIf

The process is repeated for the Else statement in the event that the Availability parameter is not selected.

Else{
                    
        Write-Verbose "No availability set requested.  Building VM."
                    
    Try{
                        
        Write-Verbose "Verifying if server name $NewServer exists in service $ConvServerName"
                        
        $AzureService = Get-AzureVM -ServiceName $ConvServerName -Name $NewServer
        If (($AzureService.InstanceName) -ne $NewServer){
            New-AzureVMConfig -Name $NewServer -InstanceSize $InstanceSize -ImageName $ImageName | 
            Add-AzureProvisioningConfig -Windows -AdminUsername $AdminUserName -Password $Password | 
            Set-AzureSubnet -SubnetNames $_Purpose | 
            Set-AzureStaticVNetIP -IPAddress $IPAddress | 
            New-AzureVM -ServiceName $ConvServerName -VNetName $VNetName
        }#EndIf

        Else {Write-Output "$NewServer already exists in the Azure service $ConvServerName"
        }#EndElse

    }#EndTry

    Catch [System.Exception]{$ErrorMsg = $Error | Select-Object -First 1
                                Write-Verbose "VM Creation failed.  The error was $ErrorMsg"
    }#EndCatch

}#EndElse

Now we’ll go ahead and execute our new code to create three new VMs destined to be print servers.

New-AzureRoleDeployment -Purpose PRT -Quantity 3 -Availability -Verbose

[Screenshot: AzurePt3-6]

Success!  Now we can deploy any number of servers to our designated subnets, configure them with statically assigned IP addresses, and assign them to an availability set, all from a simple one-liner!  Now I’m off to do some more reading and research on Desired State Configuration so we can continue our automated deployment track!

You can download the full script at the TechNet Script Center for review.

Using Azure to Keep Moving Your Career Forward

As admins and engineers, it’s often left to us to gain the knowledge and experience needed to further our careers.  Even if we’re lucky enough to work for a company that will pay for some training, it’s often directly related to the position that you’re currently working in.  For large environments with siloed IT groups, this means that you’ll likely get trained on one or two products that you already have experience working in.

Sure, getting that training is cool, and hopefully you’ll learn some things that you didn’t know before (and hopefully make you more efficient), but what if you want to expand your horizons to move to a different position or just gain a broader understanding of how everything works?

[Image: WhyAzure]

A lot of the talk these days is about cloud, and if you’re in a Microsoft environment, that means Azure.  Furthermore, PowerShell has moved beyond being a simple systems management tool to one for handling configuration management and the deployment of applications, among other things.  Desired State Configuration, released with PowerShell v4.0 just 16 months ago, has already celebrated its 10th resource kit wave.  PowerShell 5.0, slated to launch with Windows 10, will provide application deployments using OneGet.  Even though I’ve been pretty hot and heavy on the PowerShell track for a while now, I’m still feeling pretty far behind.

PowerShell is a management platform that has absolutely taken off in the last couple of iterations, and there’s no indication from Redmond that it’s going to slow down anytime soon.  New PowerShell cmdlets are made available in wave updates to Windows, and other applications are following suit by releasing new or enhanced product-specific cmdlets in cumulative update releases.  So for those that haven’t started learning PowerShell, you might want to consider taking your IT education into your own hands.

But let’s step away from the PowerShell discussion for the moment and talk about those other applications and operating systems themselves.  Companies are increasingly relying on us to be knowledgeable about many new apps and server platforms the moment they hit RTM.  Yet getting VMs spun up in a lab or non-production environment, and scheduling time during work hours to use them, is pretty close to impossible.  So how do you begin to overcome those challenges?

For a long time, I used a home PC as a lab environment, leveraging Microsoft Virtual PC and, later, Hyper-V.  But I find that as I do more presentations, I need more flexibility than carrying around a massive PC with me, and my Surface just doesn’t have enough power to support five or six VM instances.  Even if you take presentations out of the equation, there’s still the question of managing legal server licenses and software, or tearing down and rebuilding an environment every 90 days if you’re using evals.  So I decided to try out Microsoft’s Azure service to see what it could offer me from a learning perspective, as well as from a presentation standpoint.

The Pros

Well first off, you’re going to be directly learning a technology that you’re eventually going to have to deal with anyway.  Whether it’s in Microsoft’s cloud or your own internal one, my gut tells me that Azure is going to be the management platform for Windows Server for many moons to come.  On top of that, you’ll have access to the latest versions of Windows Server, and many applications depending on what subscription level you’re running, all without having to manage as many licenses as you were previously dealing with.

Want to try something new?  Spin up a new VM in minutes.  If it blows up, you can delete the machine and start all over again without having to build it from scratch.  Getting underway is super fast and easy.

You’re also dealing with products that those exam prep books are talking about!  You can build up your environment along with the study guide and get underway to your next certification in hopefully minimal time.

[Image: WhyAzure2]

Finally, you can access it pretty much anywhere you have an internet connection.  So if you’ve got a presentation to head out to, or you’re on the road and want to test out a theory or new configuration, you can do so through RDP or PowerShell Remoting.

The Cons

It costs money.  Not a lot mind you; especially if you’re careful.  Microsoft basically charges you for what you’re using, so if you shut down your VMs when they’re not in use, it won’t cost you as much.  Though a couple of times, I have managed to leave a large number of VMs on overnight, and that wound up costing me about $6 for the overnight mistake.  But if you’re careful, you can keep the bill under $50 a month USD.

It’s internet-based.  So if you’re unable to access the internet from your location (or they have a slow connection), you can’t get to your environment.  From a presentation perspective, this is becoming less and less of an issue, but still something you’ll want to check in on when presenting at a new locale.

In the End

It’s cool to say that you’ve built out your own infrastructure from scratch and blah blah blah…  Actually, who are we kidding?  Nobody thinks it’s cool or fun; even other people that do it themselves.  It’s a lot of work to maintain, and a total pain in the ass to lug around!  The cost of an hour or two of your monthly salary can save you tons of headaches and give you a foundation of new technology that everyone’s talking about.  It might not be free, but it’s been my experience that if you’re not willing to make a financial commitment to your own career to get further ahead, then maybe it’s time to consider investing in a new direction.

Holy Cow! I’m A PowerShell MVP!

Yesterday I received an email from Microsoft informing me that I had been selected as a Most Valuable Professional for PowerShell. While I can’t prove it, I’m also pretty sure that someone in the office was peeling a truckload of onions at the very instant that I was reading that email.

Talk about blown away!

I’m not going to talk about the biography of my career to this date and how it led me to this point, because in all honesty, it really didn’t. What did motivate me was the fortunate opportunity that I had to hang out with three gentlemen who have been very vocal about PowerShell – Don Jones, Jeff Hicks, and Jason Helmick – last year at TechMentor in Redmond, Washington.

[Image: MVP]

I learned so much during TechMentor that my PowerShell skills had easily improved a thousand-fold. I went from creating basic scripts to improve my daily workload, to creating functions and modules almost literally overnight! But it wasn’t the courses that I took that drove me to write more about my PowerShell journeys; it was what they said between classes, and during the end-of-day get-togethers. It was where Don, Jeff, and Jason would preach to us the importance of not only improving our own skills, but to evangelize to those who didn’t realize the impact that PowerShell is making on Microsoft’s products, how PowerShell can make their current workload so much more manageable, and to mentor those who wanted to learn more as we ourselves were being mentored.

I took it to heart, and that’s why I do what I do.

So I want to personally thank those who have mentored me as they mentor so many others – Don, Jeff, and Jason. Thank you for motivating me to learn all I can about PowerShell and help lead others to the water. I have a long road ahead of me, but I will do my best.

Thank you to those who nominated me. I will do everything I can to earn the confidence that you have in me.

Finally, thank you to those who selected me to be a PowerShell MVP. I am humbled, and honored, by this award.

PowerShell – Automating Server Builds In Azure – Pt. 2 – Rolling Servers To Their Silos

During this scripting session, I’ll be working on a system that is running PowerShell 5.0 (February 2015 release).

So now that I’ve put together a basic script for building out a server in Azure, I want to do even more by making the script and my environment more versatile.  So we’re going to go ahead and add some parameterization that will build a naming standard for the hostname and cloud service, as well as give our script the ability to deploy the server into a preconfigured VLAN.

Ultimately, my goal will be to deploy a system and apply a DSC configuration using a single script that will configure the system using standardized settings based on the role that I’ve selected initially.  But let’s concentrate on the basics first.

You might have noticed, using the previous script, that there are some things that are configured (or not) with your Azure server.  For example, the VM is built as a Standard A1 instance (1 core, 1.75 GB RAM).  It also deposits the machine in your root network instead of any subnets you may have configured.  We’re going to remedy that today.

Before I start with updating my script, I’m creating some virtual networks in my Azure environment to assign systems to.  At the time of this writing, handling this in my PowerShell script requires a little more work than what I’d like to do, so I’m creating my virtual networks through the Azure UI.  These newly created subnets will use a naming convention that will be recognizable to my script with minimal work.

[Screenshot: Azure2-6]

Now on to the scripting!  First, I’m going to create a parameter for the server role.  This will be the core variable that determines a number of settings for us.  Second, I’m going to create a switch for creating an availability group (more on this later).  I’m also adding the CmdletBinding attribute for use later.

[cmdletbinding()]
Param (

    [Parameter(Mandatory=$True)]
    [ValidateSet('IIS','PSWA','PRT')]
    [string]$Purpose,

    [switch]$Availability

)#End Param

Now I’m going to create the switch to work with my Purpose parameter.  The purpose of this switch will be to determine the future role of the server, its naming convention, what subnet to add the system to, as well as the DSC configuration to apply to it at a later time.  So basically, everything.

Switch ($Purpose){
    'IIS' {$_Purpose = 'IIS'};
    'PSWA' {$_Purpose = 'PSWA'};
    'PRT' {$_Purpose = 'PRT'}
    }#Switch

Now we’ll also add our standardized naming prefix, set up our server naming, and the default Azure location we want to use.

$RootName = "LWIN"
$ConvServerName = ($RootName + $_Purpose)
$Location = "West US"

So when our script executes and we specify PSWA for the Purpose parameter, we’ll get this:

[Screenshot: Azure2-1]
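The naming logic can also be checked in isolation at the console, without touching Azure at all:

```powershell
$Purpose = 'PSWA'

# Map the requested role to our internal purpose variable.
Switch ($Purpose){
    'IIS' {$_Purpose = 'IIS'};
    'PSWA' {$_Purpose = 'PSWA'};
    'PRT' {$_Purpose = 'PRT'}
    }#Switch

# Build the standardized service name from the prefix and the role.
$RootName = "LWIN"
$ConvServerName = ($RootName + $_Purpose)
$ConvServerName   # LWINPSWA
```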

And I’m also going to add some Write-Verbose data here for troubleshooting purposes if anything goes south later on.

Write-Verbose "Environment is $_Purpose"
Write-Verbose "Root name is $RootName"
Write-Verbose "Service will be $ConvServerName"
Write-Verbose "Datacenter location will be $Location"
If($Availability.IsPresent){Write-Verbose "Server will be assigned to $ConvServerName availability group."}

Since I’m building a fairly basic environment for now, I’m going to silo things by server role.  But before we deploy the machine, we’re going to check to see if a cloud service exists, and if not, create it using our $ConvServerName label.  For some reason, if you attempt to retrieve a service name that doesn’t exist, Azure throws a terminating error.  So we’re going to handle this with a Try/Catch statement, and leverage that to create the cloud service if it runs into this error.

Try 
    {Write-Verbose "Checking to see if cloud service $ConvServerName exists."
    Get-AzureService -ServiceName $ConvServerName -ErrorAction Stop 
    }#EndTry

Catch [System.Exception]
    {Write-Verbose "Cloud service $ConvServerName does not exist.  Creating new cloud service."
    New-AzureService $ConvServerName -Location $Location
    }#EndCatch

Now that we’ve created the cloud service, we’ll go ahead and create the host name that we’ll be using.  Since I’ll be rolling out servers in numerical order, I’m going to add some logic in to count the number of existing servers in the cloud service (if any) and create the next instance based on count.

$CountInstance = (Get-AzureVM -ServiceName $ConvServerName).where({$PSItem.InstanceName -like "*$ConvServerName*"}) | Measure-Object
$ServerNumber = ($CountInstance.Count + 1)
$NewServer = ($ConvServerName + ("{0:00}" -f $ServerNumber))
Write-Verbose "Server name $NewServer generated.  Executing VM creation."
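The format operator does the zero-padding in that name generation; here it is pulled out on its own:

```powershell
$ConvServerName = 'LWINPRT'
$ServerNumber = 1

# "{0:00}" zero-pads the counter to two digits: 1 -> 01, 12 -> 12.
$NewServer = ($ConvServerName + ("{0:00}" -f $ServerNumber))
$NewServer   # LWINPRT01
```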

So let’s add in our arguments table from last week and our boot image location.  We’re making some modifications over last week’s script to accommodate some of the automation we’re performing.  We’re specifying the Basic_A1 instance size, assigning the machine to a pre-configured subnet, and using our $ConvServerName variable to determine the service to put the machine into.

$BaseImage = (Get-AzureVMImage).where({$PSItem.Label -like "*Windows Server 2012 R2 Datacenter*" -and $PSItem.PublishedDate -eq "2/11/2015 8:00:00 AM" })

$AzureArgs = @{

    'ServiceName' = $ConvServerName
    'Name' = $NewServer
    'InstanceSize' = 'Basic_A1'
    'SubnetNames' = $_Purpose
    'VNetName' = 'LWIN.Azure'
    'ImageName' = $BaseImage.ImageName
    'AdminUserName' = 'LWINAdmin'
    'Password' = 'b0b$yerUncl3'
}

Now for our VM creation, we’re going to add some logic to verify whether or not the machine already exists in the service (just in case!), and add a little error handling in case things get ugly.  We’ll also wrap this in an If statement for handling the build with and without the Availability parameter selected.

If ($Availability.IsPresent) {
    Write-Verbose "Availability set requested.  Building VM with availability set configured."
    Try {
        Write-Verbose "Verifying if server name $NewServer exists in service $ConvServerName"
        $AzureService = Get-AzureVM -ServiceName $ConvServerName -Name $NewServer
        If ($AzureService.InstanceName -ne $NewServer) {
            New-AzureQuickVM -Windows @AzureArgs -AvailabilitySetName $ConvServerName
        }#EndIf
        Else {
            Write-Output "$NewServer already exists in the Azure service $ConvServerName"
        }#EndElse
    }#EndTry
    Catch [System.Exception] {
        $ErrorMsg = $Error | Select-Object -First 1
        Write-Verbose "VM creation failed.  The error was $ErrorMsg"
    }#EndCatch
}#EndIf
Else {
    Write-Verbose "No availability set requested.  Building VM."
    Try {
        Write-Verbose "Verifying if server name $NewServer exists in service $ConvServerName"
        $AzureService = Get-AzureVM -ServiceName $ConvServerName -Name $NewServer
        If ($AzureService.InstanceName -ne $NewServer) {
            New-AzureQuickVM -Windows @AzureArgs
        }#EndIf
        Else {
            Write-Output "$NewServer already exists in the Azure service $ConvServerName"
        }#EndElse
    }#EndTry
    Catch [System.Exception] {
        $ErrorMsg = $Error | Select-Object -First 1
        Write-Verbose "VM creation failed.  The error was $ErrorMsg"
    }#EndCatch
}#EndElse
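As a side note, the duplicated Try/Catch could be collapsed: since we’re already splatting, the availability set name can be added to the hashtable only when it’s requested.  A minimal sketch of that variation (my own refactor, not part of the original script):

```powershell
# Branch only on the parameter, not the whole build logic: add the
# availability set entry to the existing splatting table when requested,
# then make a single New-AzureQuickVM call that covers both cases.
If ($Availability.IsPresent) {
    Write-Verbose "Availability set requested."
    $AzureArgs['AvailabilitySetName'] = $ConvServerName
}
Try {
    Write-Verbose "Verifying if server name $NewServer exists in service $ConvServerName"
    $AzureService = Get-AzureVM -ServiceName $ConvServerName -Name $NewServer
    If ($AzureService.InstanceName -ne $NewServer) {
        New-AzureQuickVM -Windows @AzureArgs
    }
    Else {
        Write-Output "$NewServer already exists in the Azure service $ConvServerName"
    }
}
Catch [System.Exception] {
    Write-Verbose "VM creation failed.  The error was $($Error[0])"
}
```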

So now we’ll go ahead and save our script and execute…

.\ServerDepl.ps1 -Purpose IIS -Availability -Verbose

[Screenshot: Pt2PicA]

And success!  So let’s check and verify that we have our service:

[Screenshot: Pt2PicB]

And that we have our VM.

[Screenshot: Pt2PicC]

And now we can check our VM config and verify that we have an availability group and the correct network.

[Screenshot: Pt2PicD]

Tada!

Next week I’ll be putting some finishing touches on this script to make it a bit more versatile.  And hopefully the week after that, I’ll be able to show off a little of what I’ve learned about DSC before heading out to the PowerShell Summit this April.  Stay tuned!

PowerShell Debate: Write-Verbose As Opposed To Writing Comments

Recently, while working out some problems in a script I’m developing in my Azure environment, a thought occurred to me: is commenting my scripts a waste of time?


While I was building my script, I was using a lot of Write-Verbose messages so that I could monitor the commands as they executed by using the -Verbose parameter.  This was especially helpful in my If/Else and Try/Catch statements, so I could watch which path my script was heading down and make sure it was doing what it was supposed to do.  I also added some extra bits here and there to make sure that my variables were generating as I expected them to.  Very quickly I found that my Write-Verbose statements were duplicating the work I was putting into commenting my code.  This is what led me to the thought that perhaps comments aren’t really the way to go.

If you’ve coded a robust function, there should almost never be a time when an admin has to dig into your scripts.  This means that the only time someone is ever going to look at it is when something goes wrong.  Instead of tearing through your code to figure out at what point things fall apart, you can far more quickly and easily execute the script with -Verbose and see the last command that executed before everything hit the fan.  After all, if it’s a script that has worked well for a long time, it’s more likely that something changed in your environment than that the script broke for no good reason.
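To put that in concrete terms, here’s a minimal sketch (the function and its naming rule are made up for illustration): the [CmdletBinding()] attribute is what enables the common -Verbose switch, so the messages cost nothing on a normal run.

```powershell
Function Test-ServerName {
    # [CmdletBinding()] turns on the common parameters, including -Verbose.
    [CmdletBinding()]
    Param([string]$Name)

    Write-Verbose "Checking whether '$Name' matches the naming standard"
    If ($Name -match '^SRV-\d{3}$') {
        Write-Verbose "'$Name' passed validation"
        $true
    }
    Else {
        Write-Verbose "'$Name' failed validation"
        $false
    }
}

# Silent in normal use; narrates every step when run with -Verbose:
Test-ServerName -Name 'SRV-042' -Verbose
```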

I still use comments, but mostly for my own sanity, such as tracking which curly bracket closes which statement, and of course when adding my comment-based help!  But for the most part, I’m seeing less reason to use comments over Write-Verbose, because of the usability the latter supplies.  I’d actually like to hear what other people think about this.  Let me know in the comments below!

PowerShell – Automating Server Builds In Azure – Pt. 1 – Basic Builds

During this scripting session, I am working on a system that is running PowerShell 5.0 (February 2015 release).

I started with a very simple goal towards learning Azure and Desired State Configuration: to be able to rapidly deploy a series of machines required for giving demonstrations at user group meetings and internal company discussions.  In order to get my Azure environment to this point, I figured that I would need to learn the following:

  • Build a single deployment using a one-line command.
  • Build a series of constants to provide the one-line command to ensure consistency in my deployments.
  • Build a series of basic parameterized inputs to provide the command the necessary information to deploy the server into a service group for a given purpose (IIS server, print server, etc.)
  • Expand the scope of the script to build a specified number of servers into a service group, or add a number of servers to an existing service group.
  • Build a second command to tear down an environment cleanly when it is no longer required.

Once complete, I will stand up a DSC environment to explore the possibility of leveraging my provisioning scripts to deploy my standardized configurations to a number of different deployments based on the designated purpose.

Before I begin, it should be noted that I had an issue with the New-AzureQuickVM cmdlet returning an error that the CurrentStorageAccountName could not be found.  A quick Googling took me to Stephen Owen’s blog post on how to resolve this error.  You’ll want to read that post and update your Azure subscription info if necessary; you likely will.
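If memory serves, the fix amounts to pointing your subscription at a default storage account; here’s a sketch using placeholder names (check Get-AzureSubscription and Get-AzureStorageAccount for your own values):

```powershell
# 'Free Trial' and 'mystorageaccount' are placeholders; substitute the
# names returned by Get-AzureSubscription and Get-AzureStorageAccount.
Set-AzureSubscription -SubscriptionName 'Free Trial' `
    -CurrentStorageAccountName 'mystorageaccount'
```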

The ‘One-Liner’

Building a one-line command is pretty easy, provided that you have some necessary things configured before deploying.  For my purposes, I’ll be using the New-AzureQuickVM cmdlet.  At a bare minimum for a Windows deployment, you’ll need the following bits of information:

  • Whether you want to deploy a Windows server (-Windows)
  • The location of your preferred Azure datacenter (-Location)
  • The image name you wish to deploy (-ImageName)
  • The name of the server you wish to deploy (-Name)
  • The Azure Cloud Service name to deploy the server into (-ServiceName)
  • The user name of the administrator account (-AdminUserName)
  • The password of the administrator account (-Password)

But before we can assemble the command, we need to gather some information, such as the values for location and image name.  Getting the location is fairly straightforward: using the Get-AzureLocation cmdlet, you can get a listing of the datacenters that are available globally.

Get-AzureLocation

[Screenshot: Azure2]

For our purposes, we’ll use the West US datacenter location.  Now to look up the image name, we’ll use Get-AzureVMImage, since we’re not using any custom images.

Get-AzureVMImage

Now you’ll find that when you run this, it’s going to pull all of the images available through Azure: 470 at the time of this writing, to be exact!  So we’re going to try to pare this down a bit.

[Screenshot: Azure3]

(Get-AzureVMImage).where({$PSItem.Label -like "*Windows Server 2012 R2*"}) | Measure-Object

[Screenshot: Azure4]

Well, almost.  But if we whittle it down a bit further…

[Screenshot: Azure5]

Ah!  Better.  Now that we’ve narrowed it down to the base server images, we’ll find that, just like the image gallery in Azure’s New Virtual Machine wizard, there are three differently dated versions to choose from.  For the purposes of our implementation, we’ll just grab the latest image.

[Screenshot: Azure6]

What we want to grab from here is the long string of characters that makes up the image name, which fulfills the image requirement for our command.  So let’s go ahead and snag that for our $BaseImage variable and add our $Location variable as well.

$Location = 'West US'
$BaseImage = (Get-AzureVMImage).where({$PSItem.Label -like "*Windows Server 2012 R2 Datacenter*" -and $PSItem.PublishedDate -eq "2/11/2015 8:00:00 AM" })
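Pinning the published date keeps the build repeatable, but if you’d rather always take the newest image, a sort works too; a hedged sketch:

```powershell
# Alternative: sort the matching images by publish date and take the newest,
# instead of pinning a specific PublishedDate.
$BaseImage = (Get-AzureVMImage).where({
        $PSItem.Label -like "*Windows Server 2012 R2 Datacenter*"
    }) |
    Sort-Object -Property PublishedDate -Descending |
    Select-Object -First 1
```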

So now we can build our command.  But I’d rather not have my command run off the screen, so let’s do some splatting!

$Location = 'West US'
$BaseImage = (Get-AzureVMImage).where({$PSItem.Label -like "*Windows Server 2012 R2 Datacenter*" -and $PSItem.PublishedDate -eq "2/11/2015 8:00:00 AM" })
$AzureArgs = @{
    Windows = $True
    ServiceName = 'LWINerd'
    Name = 'TestSrv'
    ImageName = $BaseImage.ImageName
    Location = $Location
    AdminUserName = 'LWINAdmin'
    Password = 'b0b$y3rUncle'
}
New-AzureQuickVM @AzureArgs

And now we see that we have a new service in Azure…

[Screenshot: Azure7]

And a new VM is spinning up!

[Screenshot: Azure8]
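One tweak worth considering (my own aside, not part of the walkthrough): rather than leaving the admin password in the script in plain text, you can prompt for it at run time.  Get-Credential’s GetNetworkCredential() hands back the plain string that New-AzureQuickVM’s -Password parameter expects.

```powershell
# Prompt for the admin credential instead of hard-coding it, then feed the
# values into the existing $AzureArgs splatting table.
$cred = Get-Credential -UserName 'LWINAdmin' -Message 'VM administrator account'
$AzureArgs['AdminUserName'] = $cred.UserName
$AzureArgs['Password'] = $cred.GetNetworkCredential().Password
New-AzureQuickVM @AzureArgs
```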

And now we have our base script for building VMs in Azure.  Next post, we’ll be looking at creating availability sets through PowerShell, as well as assigning our VMs to specific virtual network address spaces, bringing in some parameterization and more!

***EDIT*** – For some reason I was previously under the impression that you have to create an availability set if you wanted to have multiple machines co-existing in the same cloud service.  This is not the case, but I’ll be exploring the creation of availability sets in my code next week nonetheless.

Exploring Azure and Desired State Configuration Through PowerShell

I’ve decided to test out the possibility of using an Azure environment for carrying the tools necessary for some of my presentations.  The goal is to build out an Azure server instance that stores a number of DSC configurations that I can use to spin up different environment scenarios on the fly.  It also gives me a good excuse to finally get heads-down and learn Azure and DSC.

The other side of the coin is to see if Azure itself can actually be utilized as an affordable alternative to building out a virtual lab at home.  So while my posts may have less code for the next few weeks, I’m hoping that all of the work will pay off in some creative coding that will include some examples of spinning up resources in Azure and then applying DSC templates to them for your digestion.


For the moment, I’m using a free trial of Azure, which is available by going here and signing up.  At the time of writing this, I was granted a 30-day trial with $200 in credits to use as I saw fit.

There’s a lot of good information out there regarding configuring your first Azure environment.  Here are some of the blogs and guides that I’m using, including some light reading for DSC.

Of course, as I’m new to Azure and DSC, I’ll be happy to have people point out any gaps or improvements I can make, so please feel free to comment!

PowerShell – Use SCCM Automated Reports to Create Software Update Rollups

A while back, I showed off how you could use PowerShell to create Software Update packages and deployments for SCCM.  I was asked about my process for grabbing outstanding updates and adding them into the package.  I’ve since automated this method, and thought I’d share an update with the class.

SCCM gives you the ability to create subscriptions that automatically generate reports and either email them or deposit them on a file share.  For rollup creation, I use ‘Compliance 3 – Update group (per update)’ under ‘Software Updates – A Compliance’.  The process is simple: create a software update group of all of the updates you track, right-click the report, click Create Subscription, and fill out the necessary fields.  Make sure you select delivery by Windows File Share, and choose to overwrite the existing file with a newer version.

Once you’ve completed the process and SCCM has generated its first report, you can use it to mine for the data you want.  First, we’ll need to import the CSV file.

Import-Csv "\\Server01\Patching\Critical and Security Updates All Workstations.csv"

[Screenshot: CMReports1]

As you will very quickly find, you’re not going to immediately get the data you want.  This is because the canned report adds six rows of header data that are completely unusable for our purposes.  Not to worry, though!  It can easily be remedied by using Get-Content and re-exporting the data you want to a new CSV.

Get-Content "\\server01\Patching\Critical and Security Updates All Workstations.csv" | Select-Object -Skip 6 | Out-File "\\server01\Patching\OutstandingWksUpds.csv"
Import-Csv -Path "\\server01\Patching\OutstandingWksUpds.csv"

[Screenshot: CMReports2]

This gives us a little something more to work with.  But we’re dealing with a few hundred lines of information, so let’s thin this out a bit.  For our purposes, I’m going to look just for updates to Windows components and Office for my workstations.  So we’ll filter out the following as a base:

  • Updates that are missing on 0 machines
  • Updates that have Server in the description
  • Updates that are applicable to applications we don’t wish to include (Lync, SQL, OneDrive, etc.)

So now we’ll have a filter statement that looks like this:

(Import-Csv -Path "\\Server01\Patching\OutstandingWksUpds.csv").where({
    $PSItem.Details_Table0_Missing -ne 0 -and
    $PSItem.Details_Table0_Vendor0 -notlike "*Server*" -and
    $PSItem.Details_Table0_Vendor0 -notlike "*Lync*" -and
    $PSItem.Details_Table0_Vendor0 -notlike "*Skydrive*" -and
    $PSItem.Details_Table0_Vendor0 -notlike "*Onedrive*" -and
    $PSItem.Details_Table0_Vendor0 -notlike "*Sharepoint*"
}) | Select-Object Details_Table0_Vendor0,Details_Table0_Missing

There’s a lot in there, but once you’ve filtered out the things you don’t want, you’ll wind up with a nicely ordered list:

[Screenshot: CMReports3]
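If that filter keeps growing, one readability tweak (my variation, not from the original post) is to keep the exclusion keywords in an array, so adding a product is a one-word change:

```powershell
# Same result as the long one-liner, with the exclusions held in an array.
$Exclude = 'Server','Lync','Skydrive','Onedrive','Sharepoint'
Import-Csv -Path "\\Server01\Patching\OutstandingWksUpds.csv" |
    Where-Object {
        $vendor = $_.Details_Table0_Vendor0
        $_.Details_Table0_Missing -ne 0 -and
        -not ($Exclude | Where-Object { $vendor -like "*$_*" })
    } |
    Select-Object Details_Table0_Vendor0, Details_Table0_Missing
```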

Now that we have the updates, we can incorporate them into a Software Update Group using some of my previous code.

Important note: if you attempt to run the first line of the script while your location is set to your SCCM PSDrive, Out-File will throw the error “Cannot open file because the current provider (AdminUI.PS.Provider\CMSite) cannot open a file.”  So make sure you’re not in your CM PSDrive until you’re ready to execute your CM cmdlets.

Get-Content "\\Server01\Patching\Critical and Security Updates All Workstations.csv" |
    Select-Object -Skip 6 |
    Out-File "\\Server01\Patching\OutstandingWksUpds.csv"

$update = (Import-Csv -Path "\\Server01\Patching\OutstandingWksUpds.csv").where({
    $PSItem.Details_Table0_Missing -ne 0 -and
    $PSItem.Details_Table0_Vendor0 -notlike "*Server*" -and
    $PSItem.Details_Table0_Vendor0 -notlike "*Lync*" -and
    $PSItem.Details_Table0_Vendor0 -notlike "*Skydrive*" -and
    $PSItem.Details_Table0_Vendor0 -notlike "*Onedrive*" -and
    $PSItem.Details_Table0_Vendor0 -notlike "*Sharepoint*"
}) | Select-Object Details_Table0_Vendor0,Details_Table0_Missing

$PowerShellPath = "\\Server01\C$\Program Files\Microsoft Configuration Manager\AdminConsole\bin\ConfigurationManager.psd1"
$CMSiteCode = "A00"
Import-Module $PowerShellPath
Set-Location ("$CMSiteCode" + ":")

$UpdateGroupArgs = @{
    'Name' = 'Workstations Update Rollup'
    'Description' = 'Created with PowerShell!'
    'UpdateID' = '16802522','16801092','16803970'
}
New-CMSoftwareUpdateGroup @UpdateGroupArgs
ForEach ($upd in $update) {
    Add-CMSoftwareUpdateToGroup -SoftwareUpdateName $upd.Details_Table0_Vendor0 -SoftwareUpdateGroupName $UpdateGroupArgs.Name
}

Execute the script, and then check your Software Update Groups and you’ll find you have a basic rollup package to test and deploy to your systems.

Using the same methods, you can streamline your process for creating new Software Update groups for your monthly deployments as well.  Nothing beats a set of eyeballs to review the list and make sure you’ve got what you want to push, but importing a list of outstanding updates and adding them straight to an update group sure as heck beats selecting them individually through the console.