Saturday, August 18, 2012

PowerShell V3 - $PSDefaultParameterValues

One of the new features of PowerShell V3 that I find very helpful is the ability to automatically pass default parameter values to cmdlets or advanced functions.

Here are a few snippets of code that show how to manipulate them.  Simply include your default parameters in your profile and script on!

# Initial creation (perhaps place in profile) - the cmdlet and values below are illustrative
$PSDefaultParameterValues = @{"Send-MailMessage:SmtpServer"="mail.contoso.com"}
# Add another one
$PSDefaultParameterValues += @{"Send-MailMessage:From"="admin@contso.com"}
# Oops, need to modify the email address
$PSDefaultParameterValues["Send-MailMessage:From"] = "admin@contoso.com"
# Add another one (straight out of Get-Help) that utilizes a scriptblock
$PSDefaultParameterValues += @{"Format-Table:AutoSize"={if ($host.Name -eq "ConsoleHost"){$true}}}
# Remove one
$PSDefaultParameterValues.Remove("Send-MailMessage:From")
# Remove all of them
$PSDefaultParameterValues.Clear()
# Temporarily disable all parameters
$PSDefaultParameterValues.Add("Disabled", $true)
# Remove the temporary disable
$PSDefaultParameterValues.Remove("Disabled")
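The keys can also use wildcards on the cmdlet-name side. For example (my own addition, not one of the documented snippets above), to default -Verbose on for every cmdlet that supports it while debugging a script:

```powershell
# Wildcard key: applies to any cmdlet with a -Verbose parameter
$PSDefaultParameterValues["*:Verbose"] = $true
# ...and turn it back off when you are done
$PSDefaultParameterValues.Remove("*:Verbose")
```

A blunt instrument, but handy for a noisy troubleshooting session.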

Tuesday, August 14, 2012

Windows PowerShell for Developers by Douglas Finke; O’Reilly Media

It is not often that a technical book makes you rethink how you think. Doug Finke (a Microsoft Most Valuable Professional) has achieved just that with this concise PowerShell reference.

He starts our journey by describing PowerShell as a glue language that is as programmable as Perl, Python and Ruby and takes its cues from Unix shells. The next few steps describe getting started and include a brief tour. He then shifts into high gear as we learn about template engines, adding PowerShell to our GUI apps, and creating graphical applications within PowerShell.

The chapter on “Writing Little Languages in PowerShell” was a welcome paradigm shift for me. Having virtually no experience with Domain Specific Languages (DSLs), it was a fun ride as Doug demonstrated how to create a better XML and how to build a DSL using Graphviz. The lessons in this chapter alone were worth the price of the book.

He completes our tour with coverage of integration with COM (Component Object Model – specifically Microsoft Excel) and some of the highlights of PowerShell V3 (Workflows, JSON).

This was an enjoyable, invigorating read; in fact, I went through it multiple times. I appreciate the developer-centric perspective that Doug displayed throughout the text.  Whether you are a seasoned developer or a weekend hacker, if you have any interest in PowerShell, I encourage you to pick up “Windows PowerShell for Developers”.

Tuesday, August 7, 2012

PowerShell, Diskpart and Exchange (Oh my!)

I was recently given the opportunity to work a bit on an Exchange 2007 to 2010 migration.  I was asked whether, given a CSV file containing servers and folders, the following could be scripted:
  • Step 1 - Create a mount point to hold the drives (M:)
  • Step 2 - Create a series of folders on the above drive
  • Step 3 - Create a series of volumes to be used to hold the database and log files
After a bit of research, it seemed feasible using a combination of PowerShell and DiskPart.
It took me a few tries to get the hang of the DiskPart commands that needed to be strung together and piped; my experience with disks has primarily been through the GUI, so I was able to add a bit more to my command-line toolbelt.

So once the DiskPart commands were figured out, all that remained was to run them on the (new) Exchange servers.  Enter Invoke-Command - you should be hearing cheering, whistling and much applause, as this is darn near the most useful cmdlet in PowerShell V2.

Following is a script that could be used to create hundreds of volumes.

############## Step 1 ##############
# Create the mount point for the drives
# We create a script block consisting of the commands we need to pipe to DiskPart
$cmds = "`"Select Disk 2`"",
        "`"create partition primary`"",
        "`"assign letter=m`"",
        "`"format fs=ntfs unit=64k quick label='Databases'`""
$string = [string]::Join(",",$cmds)
$sb = $ExecutionContext.InvokeCommand.NewScriptBlock("$string | DiskPart")
# Iterate over the 6 Exchange servers, using Invoke-Command to run our
# script block on each server
1..6 | foreach {
    Invoke-Command -ComputerName "ex10mbox-vp0$_" -ScriptBlock $sb
}
############## Step 2 ##############
# Using a supplied CSV file, create our directories
$folders = Import-Csv -Path C:\temp\BrentFolders.csv
$folders | foreach {
    $path = "\\$($_.server)\M$\$($_.folder)"
    if(-not(Test-Path -Path $path)) {
        New-Item -Path $path -ItemType Directory
    }
}
############## Step 3 ##############
# Create the 10 volumes that will be used to hold the individual database and log files.
# Disks 3 through 12 are used, then we wrap back around to 3.
$disk = 3
foreach ($folder in $folders) {
    if($disk -eq 13){$disk = 3}
    $cmds = "`"select disk $disk`"",
            "`"online disk`"",
            "`"attributes disk clear readonly`"",
            "`"convert mbr`"",
            "`"create partition primary`"",
            "`"assign mount=M:\$($folder.Folder)`"",
            "`"format fs=ntfs unit=64k quick label=$($folder.Folder)`""
    $string = [string]::Join(",",$cmds)
    $sb = $ExecutionContext.InvokeCommand.NewScriptBlock("$string | DiskPart")
    Invoke-Command -ComputerName $folder.Server -ScriptBlock $sb
    $disk++
}

This could also be utilized for SQL Server rollouts, etc.


Wednesday, August 1, 2012

"Introducing Regular Expressions" By Michael Fitzgerald; O'Reilly Media

Michael Fitzgerald achieves his goal of introducing the reader to Regular Expressions.  He clearly states his intent and his expected audience in the introduction.  In addition to taking an inductive approach to teaching the basics of Regular Expressions, he introduces a plethora of (free) tools along the way.

The chapters build upon each other starting with the basics:
  • Simple Pattern Matching
  • Boundaries
  • Alternation, Groups and Backreferences
  • Character Classes
  • Matching Unicode and Other Characters
  • Quantifiers
  • Lookarounds
The lessons learned in the chapters listed above are put to use in Chapter 9, "Marking up a Document with HTML".
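To give a flavor of a couple of the topics above, here is a quick PowerShell sketch of a backreference and a lookahead (my own examples, not the book's):

```powershell
# Backreference: \1 matches whatever group 1 captured (here, a doubled word)
"the the ship" -match '\b(\w+) \1\b'                     # True
$Matches[1]                                              # the

# Lookahead: match "Marinere" only when it is followed by a comma
"Ancyent Marinere, I fear thee" -match 'Marinere(?=,)'   # True
```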

In addition to learning the basics of Regular Expressions, the reader gets (re)introduced to Samuel Taylor Coleridge's "The Rime of the Ancyent Marinere".  A nice change from the standard technical examples.

This concise book is worth the investment if you are new to Regular Expressions.

Monday, June 25, 2012

PowerShell and Bass Guitar

I recently decided to try my hand at a musical instrument.  The bass guitar seemed like a good fit for me, so I picked one up and started with a few lessons.  I quickly realized that in order to have any success with it, I needed to memorize the fretboard.  Given my affinity for PowerShell, I decided to blend the two.  Following is a script that I threw in my profile to prompt me at start-up for a few notes.

function Get-BassNote {
    param([int]$Count = 5)
    # Standard tuning, string 1 = low E string
    $BassNotes = @{
        "Fret0"  = @{ "String1" = "E" ; "String2" = "A" ; "String3" = "D" ; "String4" = "G"  }
        "Fret1"  = @{ "String1" = "F" ; "String2" = "A#"; "String3" = "D#"; "String4" = "G#" }
        "Fret2"  = @{ "String1" = "F#"; "String2" = "B" ; "String3" = "E" ; "String4" = "A"  }
        "Fret3"  = @{ "String1" = "G" ; "String2" = "C" ; "String3" = "F" ; "String4" = "A#" }
        "Fret4"  = @{ "String1" = "G#"; "String2" = "C#"; "String3" = "F#"; "String4" = "B"  }
        "Fret5"  = @{ "String1" = "A" ; "String2" = "D" ; "String3" = "G" ; "String4" = "C"  }
        "Fret6"  = @{ "String1" = "A#"; "String2" = "D#"; "String3" = "G#"; "String4" = "C#" }
        "Fret7"  = @{ "String1" = "B" ; "String2" = "E" ; "String3" = "A" ; "String4" = "D"  }
        "Fret8"  = @{ "String1" = "C" ; "String2" = "F" ; "String3" = "A#"; "String4" = "D#" }
        "Fret9"  = @{ "String1" = "C#"; "String2" = "F#"; "String3" = "B" ; "String4" = "E"  }
        "Fret10" = @{ "String1" = "D" ; "String2" = "G" ; "String3" = "C" ; "String4" = "F"  }
        "Fret11" = @{ "String1" = "D#"; "String2" = "G#"; "String3" = "C#"; "String4" = "F#" }
        "Fret12" = @{ "String1" = "E" ; "String2" = "A" ; "String3" = "D" ; "String4" = "G"  }
    }
    for ($i = 1; $i -le $Count; $i++) {
        # -Maximum is exclusive, so these cover frets 0-12 and strings 1-4
        $fretNumber   = Get-Random -Minimum 0 -Maximum 13
        $stringNumber = Get-Random -Minimum 1 -Maximum 5
        $answer = $BassNotes."Fret$fretNumber"."String$stringNumber"
        $prompt = "What is the note for string $stringNumber, fret $fretNumber"
        do { $response = Read-Host $prompt }
        until ($response -eq $answer)
    }
}

Rock on!

Monday, April 16, 2012

Speed Reading with PowerShell

Many of you have had to read in large text files for processing in PowerShell.  The Get-Content cmdlet is perfect for this.  However, it can be very sloooow with large files.  There are multiple ways to speed this up.  For example, we could dive into .NET using the [System.IO.File]::ReadAllLines() method.  For simplicity, let's stick with the Get-Content cmdlet.  Following is an example that demonstrates a couple of different techniques; the one to focus on is the use of the "-ReadCount" parameter.

# define some random nouns, verbs and adverbs            
$noun = "Ed","Hal","Jeff","Doug","Don","Kirk","Dmitry"            
$verb = "ran","walked","drank","ate","threw","scripted","snored"       
$adverb = "quickly","randomly","erratically","slowly","slovenly","loudly"            
# create an array with 10,000 random sentences             
$content = 1..10000 | foreach {
    "{0} {1} {2}." -f ($noun|Get-Random),($verb|Get-Random),($adverb|Get-Random)
}
# save our array to a text file
$path = "c:\temp\RandomSentences.txt"              
$content | Out-File -FilePath $path            
# read in the file and measure the time taken
(measure-command -Expression { Get-Content -Path $path }).TotalMilliseconds
(measure-command { Get-Content $path -ReadCount 10 }).TotalMilliseconds
(measure-command { Get-Content $path -ReadCount 100}).TotalMilliseconds
(measure-command { Get-Content $path -ReadCount 1000}).TotalMilliseconds

The results....


The Get-Content cmdlet does more behind the scenes than just present the data.  There are properties being populated as it reads in the file.  By default, this happens for each line as it is read.  For large files, this overhead can be reduced by setting the -ReadCount parameter.  With it set, those behind-the-scenes properties are populated once per collection of lines, with the collection size equal to the value you give -ReadCount.
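One side effect worth noting: with -ReadCount set, each object coming down the pipeline is an array of lines rather than a single line, so per-line processing needs an inner loop.  A minimal sketch (the temp-file name is just for this demo):

```powershell
# Build a 250-line demo file
$path = Join-Path $env:TEMP 'ReadCountDemo.txt'
1..250 | ForEach-Object { "line $_" } | Out-File -FilePath $path

# Each pipeline object is now a batch (array) of up to 100 lines
$batches = Get-Content -Path $path -ReadCount 100
$batches.Count        # 3 batches: 100 + 100 + 50 lines
$batches[0].Count     # 100

# Per-line work requires unrolling each batch
Get-Content -Path $path -ReadCount 100 |
    ForEach-Object { $_ | Where-Object { $_ -match '5$' } }   # lines ending in 5
```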
Hope this helps!

Friday, March 2, 2012

Limit your use of the pipe

There have been many posts about proper utilization of the powerful pipe. Filtering to the left (letting the source cmdlet do the filtering rather than piping everything to Where-Object) is always a good idea. Following is another example demonstrating that judicious use of the pipe is a best practice.
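As a quick illustration of filtering left (the directory and file names here are made up for the demo), compare letting the provider filter against filtering in the pipeline:

```powershell
# Demo directory with a mix of extensions
$dir = Join-Path $env:TEMP 'FilterLeftDemo'
New-Item -Path $dir -ItemType Directory -Force | Out-Null
'a.log','b.log','c.txt' | ForEach-Object {
    New-Item -Path (Join-Path $dir $_) -ItemType File -Force | Out-Null
}

# Filter left: the FileSystem provider does the filtering
$fast = Get-ChildItem -Path $dir -Filter *.log

# Filter right: every item crosses the pipe, then most get discarded
$slow = Get-ChildItem -Path $dir | Where-Object { $_.Extension -eq '.log' }

$fast.Count   # 2
$slow.Count   # 2 - same result, more work
```

On three files the difference is invisible; on a directory tree with thousands of items it is not.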
We are going to define 3 scriptblocks that simply count to 100,000 and measure the time it takes them to run.
$limit = 100000         
$test1 = { foreach ($num in 1..$limit ) {$num} }
$test2 = { for($x=1; $x -le $limit; $x++) {$x} }
$test3 = { 1..$limit | foreach{$_} }

"ForEach: {0} seconds" -f (Measure-Command $test1).TotalSeconds
"For: {0} seconds" -f (Measure-Command $test2).TotalSeconds
"Pipe: {0} seconds" -f (Measure-Command $test3).TotalSeconds
On my machine these were the results:
ForEach: 0.0348825 seconds
For: 0.2490948 seconds
Pipe: 6.7627934 seconds