Boxstarter and Chocolatey tips

Monday, 16 July 2018

Two big things happened earlier this year on the Chocolatey front. First off, Boxstarter (the tool created by Matt Wrock that allows you to script up full Windows installations including handling reboots) is now being managed by Chocolatey. Boxstarter.org still exists, but the source repository is now under the Chocolatey org on GitHub.

The second is that Microsoft are contributing Boxstarter scripts in a new GitHub repo – https://github.com/Microsoft/windows-dev-box-setup-scripts

If you’re looking to use Boxstarter to automate the software installation of your Windows machines, there are a few tricks and traps worth knowing about.

Avoid MAXPATH errors

It’s worth understanding that Boxstarter embeds its own copy of Chocolatey and uses that rather than choco.exe. Due to some compatibility issues, Boxstarter currently needs to embed an older version of Chocolatey. That particular version has one known bug where the temp directory Chocolatey uses to download binaries goes one directory deeper with each install. Not a problem in isolation, but when you’re installing a lot of packages all at once, you soon hit the old Windows MAXPATH limit. A workaround is described in the bug report – essentially, use the --cache-location argument to override where downloads are saved. The trick is that you need to use this on every choco call in your Boxstarter script – even for things like choco pin. Forget one of those and you may still run into the MAXPATH problem.

To make it easier, I add the following lines to the top of my Boxstarter scripts:

New-Item -Path "$env:userprofile\AppData\Local\ChocoCache" -ItemType directory -Force | Out-Null
$common = "--cacheLocation=`"$env:userprofile\AppData\Local\ChocoCache`""

And then I can just append $common to each choco statement. eg.

cinst nodejs $common
cinst visualstudiocode $common 
choco pin add -n=visualstudiocode $common

Avoid unexpected reboots

Detecting and handling reboots is one of the great things about Boxstarter. You can read more in the docs, but one thing to keep in mind is that it isn’t perfect. If a reboot is initiated without Boxstarter being aware of it, then it can’t do its thing to restart and continue.

One command I’ve found that can cause this is Enable-WindowsOptionalFeature. If the feature you’re turning on needs a restart, then Boxstarter won’t resume afterwards. The workaround here is to leverage Chocolatey’s support for the windowsfeatures source. So instead of this

Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-All

Do this

choco install Microsoft-Hyper-V-All -source windowsfeatures $common

Logging

If you have a more intricate Boxstarter script, you may run into some problems that you need to diagnose. Don’t look in the usual Chocolatey.log, as you won’t see anything there. Boxstarter logs all output to its own log, which by default ends up in $env:LocalAppData\Boxstarter\Boxstarter.log. This becomes even more useful when you consider that Boxstarter may automatically restart your machine multiple times, so having a persistent record of what happened is invaluable. The other things you might want to make use of are the Boxstarter-specific commands Write-BoxstarterMessage (which writes to the log file as well as the console output) and Log-BoxstarterMessage (which just writes to the log file).
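
As a quick illustration (the message text here is just an example):

# Write-BoxstarterMessage shows in the console and is written to Boxstarter.log
Write-BoxstarterMessage "Installing developer tools"
# Log-BoxstarterMessage only goes to Boxstarter.log
Log-BoxstarterMessage "Extra diagnostic detail for later"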

Find out more about these and other logging commands by running help about_boxstarter_logging.

My scripts

I keep a few of my Boxstarter scripts at https://gist.github.com/flcdrg/87802af4c92527eb8a30. Feel free to have a look and borrow them if they look useful.

Find out more

If you’re really getting into Chocolatey and Boxstarter, you might also be interested in Chocolatey Fest, a conference focusing on Windows automation being held in San Francisco on October 8th.

Not all SSDs are the same

Thursday, 5 July 2018

(Or why I should stick with software, rather than hardware!)

I'd ordered some larger SSDs from MATS Systems this week to upgrade a couple of laptops that were running out of room. I'd scanned through the list and saw Samsung EVO 500GB. Yep, "add to cart" x 2.

Job done (or so I thought).

They arrived promptly yesterday, and near the end of the day I disassembled the first laptop to extract the existing smaller-capacity SSD so I could put it in the disk duplicator. I then ripped open the box of the newly purchased Samsung SSD and to my horror, it didn't look anything like the old one!

In fact it looked a lot like this:

Samsung EVO 860 SSD
"But David", you say, "that's an M.2 SSD!"

Well yes, yes it is, and that's exactly what it turns out I ordered - not realising that "M.2" doesn't just mean "fast" or "better" but it's an indication of the actual form factor.

I now understood that what I should have ordered was the 2.5" model - not the M.2 one.

So what was I going to do? First step, post to Twitter and see if I get any responses - and I did get some helpful advice from friends:


Twitter conversation

Twitter conversation

Twitter conversation

Unfortunately I'd ripped open the box so it wasn't in a great state to return. Instead I sourced one of these Simplecom converter enclosures to see if I could use it in the 2.5" laptop slot after all.

As Adam had mentioned on Twitter, one important thing was to identify what kind of key the SSD I had was using. You can tell that by looking at the edge connector. Here's the one I had:

Showing edge connector of M.2 SSD

This is apparently a "B+M" key connector (as it has the two slots). The specs for the Simplecom enclosure say it's suitable for either "B" or "B+M" so I was good there.

Unpacking the enclosure, there's a tiny screw one one side to undo, then you can pry open the cover.

Enclosure, with side screw and screwdriver

With the cover off, there are four more screws to extract before you can access the mounting board.

Unscrewing mounting board from drive enclosure

Now it's just a simple matter of sliding in the SSD and using the supplied screw to keep it in.

SSD mounted on mounting board in enclosure

Then reassemble the enclosure and it's ready to test.

I tried it out in a spare laptop - pulling out the existing SSD and using the duplicator to image that onto the new SSD (and taking extra care to make sure I had them in the correct slots in the duplicator. It would be a disaster getting that wrong!)

Then pop the new SSD back in the laptop and see if it boots up... Yay, it did!

The great news is MATS were able to arrange to swap the other SSD (the one I hadn't opened yet) for a proper EVO 860 2.5" model. And I learned that if I had been more careful opening the box on the first one, it probably could have been swapped with just a small restocking fee too.

So after feeling like I'd really messed up, things ended up not too bad after all :-)

2018-2019 Microsoft Most Valuable Professional (MVP) award

Monday, 2 July 2018

I first received Microsoft's MVP award in October 2015. My most recent renewal just occurred on July 1st (aka the early hours of July 2nd here in Adelaide), which was a really nice way to start the week. This is my 4th consecutive year of being an MVP.

Microsoft MVP Logo


To quote the confirmation email, it was given "in recognition of your exceptional technical community leadership. We appreciate your outstanding contributions in the following technical communities during the past year: Visual Studio and Development Technologies"

For me, that's leading the Adelaide .NET User Group, occasional blogging here, speaking at user groups (and the odd conference) and open source contributions. I like to think that the things I do that have been recognised are things that I would be trying to do in any case.

It isn't something I take for granted. A number of MVPs I know didn't make the cut this year - and it's always a bit of a mystery why some continue and some don't.

I'm also aware that should my own (or Microsoft's) priorities change in the future, then it may no longer be for me. But for now, I really appreciate receiving the award and hope I can make the most of the opportunities it gives me.

Migrating Redmine issues to VSTS work items with the REST API

Friday, 22 June 2018

Redmine is an open-source project management/issue tracking system. I wanted to copy issues out of Redmine and import them into a Visual Studio Team Services project.

Extracting issues can be done by using the "CSV" link at the bottom of the Issues list for a project in Redmine. This CSV file doesn't contain absolutely everything for each issue (eg. attachments and custom data from any plugins). Another alternative would be to query the database directly, but that wasn't necessary for my scenario.

To migrate the data to VSTS you can use a simple PowerShell script, making use of the VSTS REST API.

You'll need to create a Personal Access Token. Be aware that all items will be created under the account linked to this token - as far as I'm aware, there's no way to set the "CreatedBy" field to point to another user.

Notice in the script how we handle different fields for different work item types (eg. Product Backlog Items use the 'Description' field, whereas Bugs use 'Repro Steps'), and how we deal with optional fields (eg. not all Redmine issues had the 'Assignee' field set).
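
The script itself boils down to looping over the CSV rows and POSTing each one to the work items endpoint. If you want to write something similar, here's a rough sketch of the approach (not my exact script - the account name, project name, API version and CSV column names below are placeholder assumptions you'd need to adjust for your own export):

# Rough sketch only - adjust account, project and CSV column names for your environment
$pat = "<personal-access-token>"
$headers = @{
    Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
}

$issues = Import-Csv "redmine-issues.csv"

foreach ($issue in $issues) {
    # Map Redmine trackers to work item types (assumed mapping)
    $type = if ($issue.Tracker -eq "Bug") { "Bug" } else { "Product Backlog Item" }

    # Bugs describe the problem in 'Repro Steps'; Product Backlog Items use 'Description'
    $descriptionField = if ($type -eq "Bug") { "/fields/Microsoft.VSTS.TCM.ReproSteps" } else { "/fields/System.Description" }

    $operations = @(
        @{ op = "add"; path = "/fields/System.Title"; value = $issue.Subject }
        @{ op = "add"; path = $descriptionField; value = $issue.Description }
    )

    # Not every Redmine issue has an assignee (and the value needs to resolve to a VSTS user)
    if ($issue.Assignee) {
        $operations += @{ op = "add"; path = "/fields/System.AssignedTo"; value = $issue.Assignee }
    }

    $encodedType = [uri]::EscapeDataString($type)
    $uri = "https://myaccount.visualstudio.com/MyProject/_apis/wit/workitems/`$${encodedType}?api-version=4.1"

    Invoke-RestMethod -Uri $uri -Method Post -Headers $headers `
        -ContentType "application/json-patch+json" `
        -Body (ConvertTo-Json $operations -Depth 3)
}

The body is a JSON Patch document (hence the application/json-patch+json content type), with one "add" operation per field.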

The full set of fields (and which work item types they apply to) is documented here. If you have more fields in Redmine that can be mapped to ones in VSTS then go ahead and add them.

Get programming in F#

Monday, 11 June 2018

I’m really interested in learning more about functional programming. It isn’t something I knew much about, but the benefits of reducing mutability (and shared state) promoted by functional languages and functional style are enticing.

To that end, I recently bought a copy of Isaac Abraham’s new book “Get programming in F#. A guide for .NET Developers”.

I have no background in functional languages at all, so I was looking for a “gentle” introduction to the F# language, without getting hung up on a lot of the functional terminology that seems to make learning this stuff a bit impenetrable for the newcomer. This book delivers.

The structure of the book is in 10 “units”, which in turn are broken down into separate “lessons” (each lesson is a separate chapter).

Here are my notes from each unit:

Unit 1 – F# and Visual Studio

Unit 2 – Hello F#

Unit 3 – Types and functions

Unit 4 – Collections in F#

Unit 5 – The pit of success with the F# type system

Unit 6 – Living on the .NET platform

Unit 7 – Working with data

Unit 8 – Web programming

Unit 9 – Unit testing

Unit 10 – Where next?