Thursday, 30 September 2010

Five Things I Wish My iPhone Did

Wake Me Up When I Get To Bristol

For falling asleep on the train – or on the last bus home – this would be a killer feature; an alarm that goes off when you get within 1km of your destination, waking you up so you don’t miss your stop. For bonus points, it could calculate your average speed and last known distance from your destination, so if you've lost GPS signal, it’ll wake you up early just to be on the safe side.

Geographical Task Reminders

Every night I get home and realize that I’ve left my headphones in my desk drawer at work – again. It’d be cool to set a reminder so that next time I’m at work, it’ll remind me to put my headphones in my bag so I don’t forget them.

Charging Reminders Based On Wi-Fi Signals

I sometimes forget to charge my phone at work, and since it won’t last a full day & night without a top-up, this means it dies sometime during the evening, leaving me out & about with no music, no movies, no Twitter, no e-mail – oh, yeah, and no phone.

If my phone’s connected to my office wi-fi network, then I’m at work – so if I haven’t plugged it in to charge after 10-15 minutes, it should beep at me “hey, I know you’re at work – plug me in!”

Don’t Ask for an Unlock Code When On A Known Wi-Fi Network

Just like above – if my phone’s on my office or my home wi-fi network, then it’s hopefully not been stolen, and it’s probably safe to use it without entering the unlock code first.

Make Phone Calls and Send Text Messages Reliably

I know… wishful thinking. The iPhone is a lovely internet gadget, music player, movie player, sat-nav and all-round geek toy. But for actually communicating with other people, I still carry a battered old Nokia. It makes calls, it sends texts, and the battery lasts well over a week.

Friday, 24 September 2010

Fun with Powershell, SQL Backup and Automated Restores

Our database backup system is based on taking a full backup of every database every night, and transaction log backups every 20 minutes. Backups are shipped offsite to a backup server sitting in a data-centre across town, and what I was trying to do was to automate the process of restoring the latest full backup and all subsequent transaction logs on the remote server.

So: ingredients. I have:

  • Red Gate SQL Backup (and the SqlBackupC.exe command-line client)
  • Windows Powershell
  • A list of databases in a text file (“databases.txt”)
  • A folder full of database backups, and a folder full of transaction log backups.

The backup filenames look something like this (illustrative names – the important parts are the FULL_/LOG_ prefix, the database name, and the date/time stamps, which the regular expressions below rely on):

FULL_(local)_MyDatabase_20100929_010000.sqb
LOG_(local)_MyDatabase_20100929_012000.sqb

What I’m trying to do is, for each backup listed in databases.txt, I want to find the most up-to-date full backup, restore it, and then restore, in chronological order, every transaction log that’s been created since that full backup was taken.

So. Powershell… time to see what all the fuss is about.

Let’s start with a couple of things that threw me until I got my head around them. Powershell is a shell. Don’t think of it like C# or Java – think of it like a batch file on steroids. With objects. If you want to run a program from a Powershell script, you just put its name in the script. For example – this is a valid Powershell script that’ll open itself in Notepad:

C:\Windows\System32\Notepad.exe myscript.ps1

Copy that line into notepad, save it as myscript.ps1, fire up Powershell.exe, navigate to the folder where you saved it, and type .\myscript.ps1 – bingo.

Secondly, Powershell is designed so that if something has a certain meaning in DOS, it’ll mean the same thing in Powershell. <, | and > will redirect and pipe output – just like in DOS – which means Powershell can’t use > as the greater-than operator – so they’ve used -gt instead. (Mmm. Perly.) Backslashes are path separators – just like in DOS – so Powershell uses the backtick (`) as an escape character.
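A few throwaway lines at the prompt make the operators and the backtick concrete – these are just illustrative one-liners, nothing more:

```powershell
# Comparison operators are spelled out rather than symbolic:
10 -gt 5     # True  (greater-than)
10 -ge 10    # True  (greater-or-equal)
"a" -eq "A"  # True  (-eq is case-insensitive for strings by default)

# The backtick escapes special characters inside double-quoted strings:
Write-Output "Tab:`tDone"
```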

Thirdly, Powershell supports Perl-style string interpolation.

$foo = 10
echo "I have $foo fingers"

will output “I have 10 fingers”. To avoid this behaviour, use single-quotes instead:

$foo = 10
echo 'I have $foo fingers'

will print "I have $foo fingers". Be careful, though – underscores are valid in variable names, so

echo "LOG_$server_$catalog_$timestamp"

is actually going to try and find variables called $server_, $catalog_ and $timestamp. To work around this behaviour, enclose the variable name in braces:

echo "LOG_${server}_${catalog}_${timestamp}"

Finally – Powershell supports aliases, which means most commands have more than one name. For example, there’s a built-in command Get-ChildItem – and dir, gci, and ls are all shortcuts for this command, so when you open a Powershell prompt and want to list the contents of the current directory, it doesn’t care whether you type dir, ls, gci, or Get-ChildItem – they all do exactly the same thing.
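You can ask Powershell itself what an alias points at, which is handy when you’re reading someone else’s script – for example:

```powershell
Get-Alias dir     # dir -> Get-ChildItem
Get-Alias ls      # ls  -> Get-ChildItem

# Or go the other way and list every alias for a given command:
Get-Alias -Definition Get-ChildItem
```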

OK. Time for scripty fun. Here’s the script – you’ll need to supply your own databases.txt and credentials, and probably tweak the backup folder paths to match your own setup.

# Read the contents of databases.txt into a collection called $databases
$databases= Get-Content "databases.txt"

# Path to the SQL Backup command-line client.
$exe = "C:\Program Files (x86)\Red Gate\SQL Backup 6\SqlBackupC.exe"

# Credentials for connection to SQL Server. 
$username = 'sa'
$password = '[email protected]'
# Database backup encryption password (you *do* encrypt your backups, don’t you… ?)
$dbpw = 'T0P$3CR3T'
foreach($database in $databases) {

  # Get a collection of ALL backup files in the backup folder.
  $allSqbFiles = Get-ChildItem D:\backup\data\*.sqb 
  # Create a regular expression that'll match filenames
  # against the current database name
  $regex = "FULL_\(local\)_${database}_\d+_\d+\.sqb"
  # Filter the list of backup files to find those for the current DB
  $backups = $allSqbFiles | Where-Object { $_.Name -match $regex }
  # Sort the backups by LastWriteTime...
  $backups = $backups | Sort-Object LastWriteTime

  # and extract the most recent one
  $latestBackup = $backups | Select-Object -last 1

  # Capture the LastWriteTime of the most recent backup  
  $t = $latestBackup.LastWriteTime

  # Extract the full name of the database backup file  
  $backup = $latestBackup.FullName
  # Construct the SQL statement to pass to SqlBackupC.exe - note the 
  # doubled double-quotes used to include a double-quote in the result.
  $sql = """RESTORE DATABASE [$database] FROM DISK = '$backup' WITH PASSWORD = '$dbpw', NORECOVERY, REPLACE"""
  # This next bit is what actually does the database restore, and 
  # it's a bit fiddly.
  # & (the ampersand) is known as the "call" operator, and 
  # basically says "hey, run this command" Note that you CAN'T just
  # give it a big fat string containing the whole command, arguments
  # and everything. That's not how it works.
  # Second, the arguments beginning with a hyphen need to be
  # backtick-escaped so Powershell knows they're not operators.
  # Finally, note how we've included the $exe in quotes,
  # because the path to the SqlBackupC.exe has spaces in it. 

  & "$exe" `-U $username `-P $password `-SQL $sql
  # Use another regex to grab all the LOG files for the current database.
  $regex = "LOG_\(local\)_${database}_\d+_\d+\.sqb";
  $logFiles = Get-ChildItem D:\backup\logs\*.sqb
  $logFiles = $logFiles | Where-Object { $_.Name -match $regex }
  # Then we filter this list to return only those that are more
  # recent than the full backup (which we captured earlier)
  $logFilesToRestore = $logFiles | Where-Object {$_.CreationTime -ge $t } | Sort-Object CreationTime
  # Wrap the collection in @() so that a single matching file still
  # gives a count of 1, rather than returning the file's size in bytes.
  $logFileCount = @($logFilesToRestore).Length
  # If there are no log files, skip to the next database.
  if ($logFileCount -le 0) { continue }
  # Now, the LAST transaction log needs a slightly different SQL 
  # command, because we want the last restore to leave the database
  # in a usable state, so we need to split the logs into the final
  # one, and all the others. 
  $splitAtIndex = $logFileCount - 2
  # Powershell lets you slice arrays by saying $MyArray[x..y].
  # (If there's only one log file, there are no "earlier" logs to
  # restore, and a [0..-1] slice would misbehave, so guard against it.)
  if ($logFileCount -eq 1) {
    $logFilesExceptFinalOne = @()
  } else {
    $logFilesExceptFinalOne = @($logFilesToRestore)[0..$splitAtIndex]
  }
  $finalLogFile = $logFilesToRestore | Select-Object -last 1
  foreach($log in $logFilesExceptFinalOne) {
    $logFileName = $log.FullName
    # Construct the SQL statement to restore the transaction log
    # leaving the database ready to accept further log restores:
    $sql = """RESTORE LOG [${database}] FROM DISK = '${logFileName}' WITH PASSWORD = '${dbpw}', NORECOVERY"""
    # and run SqlBackupC with that SQL statement:
    & "$exe" `-U $username `-P $password `-SQL $sql
  }
  $logFileName = $finalLogFile.FullName
  # Construct the SQL statement to restore the last transaction
  # log and leave everything ready for use:
  $sql = """RESTORE LOG [${database}] FROM DISK = '$logFileName' WITH PASSWORD = '${dbpw}', RECOVERY"""
  # and run it!
  & "$exe" `-U $username `-P $password `-SQL $sql
}

Having never really used Powershell before, this whole thing took about three hours to put together – and I spent the first two hours cursing Powershell for being so damned awkward, and then it just clicked and suddenly everything made a lot more sense. Hopefully the tips above will save you a bit of frustration if you’re taking your first steps with it… and I’ve a feeling I’m going to be doing a lot more Powershelling before too long. For stuff like this, it’s so much more powerful than batch files, and so much more maintainable than writing little command-line utilities in C#.

Monday, 20 September 2010

Want a Free Red Gate Tool to Support Your Open-Source .NET Project?

Earlier in the year, I helped Red Gate software out with some customer focus group stuff, and in return they’ve offered me a free license to any single Red Gate software tool.

Since I’m already a happily-licensed user of Red Gate’s SQL Toolbelt and most of their .NET tools, I thought I’d pass this generous gesture on to the .NET community, so here’s the deal.

If you maintain an open-source .NET project, and you’d like a free license for any single Red Gate tool, drop me a short e-mail, tell me about your project, tell me which tool you’d like, and why it would be useful.

There is a single license up for grabs, for any of the tools listed on Red Gate’s products page.

I’m not looking for ten-page essays – if your project is well-known, just tell me which tool you’d like and how it will help. E-mail your entry to [email protected] – please put “Red Gate Giveaway” in the subject line so they don’t end up in the spam folder, and remember to include your full name & e-mail so I can let you know if you’ve won. I’ll accept entries until this Friday, October 1st, at 10:00am (GMT), after which I’ll pick a winner and send their details across to Red Gate.

Wheaton’s Law applies; my decision is final, no correspondence will be entered into, etc, etc. Good luck!

Wednesday, 15 September 2010

.NET String Formatting Cheat Sheet

This morning I saw this tweet from @jhollingworth:

which linked to this excellent guide to .NET string formats by Steve Tibbett

Sometimes, something is just so incredibly useful that even bookmarking it isn’t convenient enough – so I’ve hacked Steve’s guide and examples into a one-page A4 cheat sheet and stuck it on the wall next to my desk.

It’s up on my site as a PDF or as a Word document – download it, print it out, stick it on the wall, and you’ll never find yourself Googling {0:n} vs {0:c} again.

All credit to Steve Tibbett for this – all I did was make it fit on one printed page. Happy formatting.

Monday, 13 September 2010

Automating Secure FTP Backups to a Windows 2008 Server

I’ve been looking into solutions for shipping database backups and log-files to a remote server as part of our backup strategy. Requirements are pretty simple:

  • It’s got to be completely automated
  • A daily run has to take less than 6 hours so it’ll finish overnight whilst our connection is quiet
  • Transfers need to be encrypted to stop people sniffing our packets
  • I don’t want to spend a lot of money
  • If I can get away without lots of complicated (read: fragile) command-line switches, then that would be a bonus

Now, there are four common protocols for doing secure file transfers - SFTP, FTPS, FTP+SSH, and SCP. The specifics aren’t important – what matters is that they’re incompatible, so your client and server have to be using the same protocol, otherwise nothing will work. There’s also various dedicated tools such as rsync, robocopy, xcopy, SyncBack, DeltaCopy… the list goes on. Setting up rsync on Windows is a bit fiddly – not to mention the dozens of subtle and nefarious command-line switches available to rsync.exe – and whilst I love a bit of cygwin hacking as much as anyone else, I’d really rather use something a little more intuitive here. I did try DeltaCopy, a GUI wrapper around rsync, but I quickly ran into security problems with SSH keys and username/password combinations. Since I wanted to avoid open folder shares on the remote system, tools like xcopy and robocopy were out.

In the past, I’ve had great results with Vandyke Software’s VShell, a commercial Windows SSH server, but whilst putting this lot together, I discovered that in Windows Server 2008, you can set up an FTP server to encrypt connections via SSL (it’ll even use the same certificate as your website). That’s good, by the way – I think things are always cheaper & easier when Windows does them out of the box.

So – assuming you’ve already got a Windows 2008 Server running IIS, and you’ve already got a certificate installed, then log onto the server, fire up IIS, create a new FTP site, switch on SSL, bingo.


Just to sanity-check that everything’s working, fire up WinSCP, create a new session using FTP with SSL Explicit Encryption, and you should be able to connect to your new secure server.

OK – so far, so good. We can connect, we can see a directory listing. Now what I want to do is to automate the client side of things, and for this bit, I’m going to use a superb bit of software called Super Flexible File Synchronizer. This is something I’ve only discovered recently, but so far, I’m delighted with it. It’s got native support for FTP, SFTP, WebDAV and HTTP, as well as Google Docs, Amazon S3 and Azure (!), plus a built-in scheduler, mail notification facility… as I said, it’s looking very promising.

Right, time for some actual computer science – complete with experimental data and graphs and everything. I want to know how fast this thing is going to go, and I want to know how much the encryption’s going to slow things down, so I’ve put together 72Mb of assorted database backups, TIFF files and documents, and transferred them up to the remote server using various settings. Results here are the average of five runs for each combination.

There’s a slightly odd setting in Super Flexible File Synchronizer – when you’re setting up an Internet target, if you pick FTP, it’ll give you a choice of libraries – cryptically called 1, 2 and 3.

I’m guessing that under the hood, there’s three different off-the-shelf FTP libraries in there, and the option lets you pick another one if the default doesn’t play nicely with your particular server. (Yeah, like anyone’s ever going to need that…)

So, numbers. I ran five different settings – network share (i.e. copying straight to \\myserver\backups using Windows file sharing), “open” FTP (no SSL), and then each of the three FTP libraries with the SSL option switched on.


[Results table: Total Time, Time per File and Transfer Speed for each of the five configurations – network share, open FTP, FTP+SSL (Library 1), FTP+SSL (Library 2) and FTP+SSL (Library 3).]

A couple of things to note:

  1. FTP Library 2 just didn’t work – it wouldn’t even connect to the remote server to retrieve a directory listing.
  2. FTP Library 1 clearly had some issues – a typical run took well over eight minutes, and the subsequent logs were littered with timeouts and stuff like this:

     COPY L->R d:\FileDemo\F00090820-0103.tif  (889.7kB)
     FTP Exception (5): EclSocketError Timeout error occured @ 009283D0
     COPY L->R d:\FileDemo\F00090820-0104.tif  (851.6kB)
     FTP Exception (5): EclSocketError Timeout error occured @ 009283D0

Fortunately – FTP Library #3 worked perfectly, and gave transfer speeds that were actually faster than the raw FTP connection. I’m thinking that’s probably down to variations in test conditions (time of day, traffic, and so on) but even allowing for a little experimental error, it’s definitely fast enough to work with.

We’re looking at transferring about 20Gb worth of database backups and photographs a day. Based on the stats above, we can do 75Mb in 45 seconds, which equates rather neatly to 100Mb / min, which means our 20Gb backups are going to take just over three hours. $60 for a copy of SuperSynchroFlexofile O’Maticalyzer (and I’ll throw in a case of beer if you can come up with a snappier name for the next edition), SSL all the way, and a nice GUI to set it all up.
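As a sanity check on that arithmetic – using the figures quoted above – a quick session at the Powershell prompt agrees:

```powershell
$mbPerMin = 75 / (45 / 60)           # 75Mb in 45 seconds = 100Mb per minute
$minutes  = (20 * 1024) / $mbPerMin  # 20Gb is roughly 20480Mb -> 204.8 minutes
$minutes / 60                        # a shade under 3.5 hours
```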

Isn’t it nice when everything just… works?