Two big things happened earlier this year on the Chocolatey front. First off, Boxstarter (the tool created by Matt Wrock that allows you to script up full Windows installations including handling reboots) is now being managed by Chocolatey. Boxstarter.org still exists, but the source repository is now under the Chocolatey org on GitHub.
The second is that Microsoft are contributing Boxstarter scripts in a new GitHub repo – https://github.com/Microsoft/windows-dev-box-setup-scripts
If you’re looking to use Boxstarter to automate the software installation of your Windows machines, there are a few tricks and traps worth knowing about.
Avoid MAXPATH errors
It’s worth understanding that Boxstarter embeds its own copy of Chocolatey and uses that rather than choco.exe. Due to some compatibility issues, Boxstarter currently needs to embed an older version of Chocolatey. That particular version has one known bug where the temp directory Chocolatey uses to download binaries goes one directory deeper with each install. Not a problem in isolation, but when you’re installing a lot of packages all at once, you soon hit the old Windows MAXPATH limit.
A workaround is described in the bug report – essentially using the --cache-location argument to override where downloads are saved. The trick here is that you need to use this on all choco calls in your Boxstarter script – even for things like choco pin. Forget those and you may still experience the MAXPATH problem.
To make it easier, I add the following lines to the top of my Boxstarter scripts:
New-Item -Path "$env:userprofile\AppData\Local\ChocoCache" -ItemType directory -Force | Out-Null
$common = "--cacheLocation=`"$env:userprofile\AppData\Local\ChocoCache`""
And then I can just append $common to each choco statement, e.g.
cinst nodejs $common
cinst visualstudiocode $common
choco pin add -n=visualstudiocode $common
Avoid unexpected reboots
Detecting and handling reboots is one of the great things about Boxstarter. You can read more in the docs, but one thing to keep in mind is that it isn’t perfect. If a reboot is initiated without Boxstarter being aware of it, then it can’t do its thing to restart and continue.
One command I’ve found that can cause this is Enable-WindowsOptionalFeature. If the feature you’re turning on needs a restart, then Boxstarter won’t resume afterwards. The workaround here is to leverage Chocolatey’s support for the windowsfeatures source. So instead of this:

Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-All

use this:

choco install Microsoft-Hyper-V-All -source windowsfeatures $common
If you have a more intricate Boxstarter script, you may run into some problems that you need to diagnose. Don’t look in the usual Chocolatey.log as you won’t see anything there. Boxstarter logs all output to its own log, which by default ends up in $env:LocalAppData\Boxstarter\Boxstarter.log. This becomes even more useful when you consider that Boxstarter may automatically restart your machine multiple times, so having a persistent record of what happened is invaluable.
The other thing you might want to make use of is the Boxstarter-specific logging commands, like Write-BoxstarterMessage (which writes to the log file as well as the console output) and Log-BoxstarterMessage (which writes just to the log file). Find out more about these and other logging commands by running help.
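As a quick sketch of how these two commands fit into a Boxstarter script (the messages themselves are just examples I’ve made up):

```powershell
# Written to both the console and Boxstarter.log
Write-BoxstarterMessage "Installing developer tools..."

# Written to Boxstarter.log only - useful for extra diagnostic detail
Log-BoxstarterMessage "Finished install pass at $(Get-Date)"
```

Because the log survives the automatic restarts, liberally sprinkling these through a long script makes it much easier to work out which step a machine got up to before a reboot.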
I keep a few of my Boxstarter scripts at https://gist.github.com/flcdrg/87802af4c92527eb8a30. Feel free to have a look and borrow them if they look useful.
Find out more
If you’re really getting in to Chocolatey and Boxstarter, you might also be interested in Chocolatey Fest, a conference focusing on Windows automation being held in San Francisco on October 8th.
(Or why I should stick with software, rather than hardware!)
I'd ordered some larger SSDs from MATS Systems this week to upgrade a couple of laptops that were running out of room. I'd scanned through the list and saw Samsung EVO 500GB. Yep, "add to cart" x 2.
Job done (or so I thought).
They arrived promptly yesterday, and near the end of the day I disassembled the first laptop to extract the existing smaller-capacity SSD so I could put it in the disk duplicator. I then ripped open the box of the newly purchased Samsung SSD and to my horror, it didn't look anything like the old one!
In fact it looked a lot like this:
"But David", you say, "that's an M.2 SSD!"
Well yes, yes it is, and that's exactly what it turns out I ordered - not realising that "M.2" doesn't just mean "fast" or "better" but it's an indication of the actual form factor.
I now understood that what I should have ordered was the 2.5" model - not the M.2 one.
So what was I going to do? First step, post to Twitter and see if I get any responses - and I did get some helpful advice from friends:
Unfortunately I'd ripped open the box so it wasn't in a great state to return. Instead I sourced one of these Simplecom converter enclosures to see if I could use it in the 2.5" laptop slot after all.
As Adam had mentioned on Twitter, one important thing was to identify what kind of key the SSD I had was using. You can tell that by looking at the edge connector. Here's the one I had:
This is apparently a "B+M" key connector (as it has the two slots). The specs for the Simplecom enclosure say it's suitable for either "B" or "B+M" so I was good there.
Unpacking the enclosure, there's a tiny screw on one side to undo, then you can pry open the cover.
With the cover off, there are four more screws to extract before you can access the mounting board.
Now it's just a simple matter of sliding in the SSD and using the supplied screw to keep it in.
Then reassemble the enclosure and it's ready to test.
I tried it out in a spare laptop - pulling out the existing SSD and using the duplicator to image that onto the new SSD (and taking extra care to make sure I had them in the correct slots in the duplicator. It would be a disaster getting that wrong!)
Then pop the new SSD back in the laptop and see if it boots up… Yay, it did!
The great news is MATS were able to arrange to swap over the other SSD (the one I hadn't opened yet) with a proper EVO 860 2.5" model. And I learned that if I had been more careful opening the box on the first one, that probably could have been swapped with just a small restocking fee too.
So after feeling like I'd really messed up, things ended up not too bad after all :-)
I first received Microsoft's MVP award in October 2015. My most recent renewal just occurred on July 1st (aka the early hours of July 2nd here in Adelaide), which was a really nice way to start the week. My 4th consecutive year of being an MVP.
To quote the confirmation email, it was given "in recognition of your exceptional technical community leadership. We appreciate your outstanding contributions in the following technical communities during the past year: Visual Studio and Development Technologies".
For me, that's leading the Adelaide .NET User Group, occasional blogging here, speaking at user groups (and the odd conference) and open source contributions. I like to think that the things I do that have been recognised are things that I would be trying to do in any case.
It isn't something I take for granted. A number of MVPs I know didn't make the cut this year - and it's always a bit of a mystery why some continue and some don't.
I'm also aware that should my own (or Microsoft's) priorities change in the future, then it may no longer be for me. But for now, I really appreciate receiving the award and hope I can make the most of the opportunities it gives me.
Redmine is an open-source project management/issue tracking system. I wanted to copy issues out of Redmine and import them into a Visual Studio Team Services (VSTS) project.
Extracting issues can be done by using the "CSV" link at the bottom of the Issues list for a project in Redmine. This CSV file doesn't contain absolutely everything for each issue (eg. attachments and custom data from any plugins). Another alternative would be to query the database directly, but that wasn't necessary for my scenario.
To migrate the data to VSTS you can use a simple PowerShell script, making use of the VSTS REST API. You'll need to create a Personal Access Token. Be aware that all items will be created under the account linked to this token – there's no way that I'm aware of to set the "CreatedBy" field to point to another user.
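As a sketch of the kind of script involved (the account name, project, CSV column names and the tracker-to-work-item-type mapping here are all placeholders/assumptions – adjust them for your own Redmine export):

```powershell
$account = "youraccount"
$project = "YourProject"
$pat     = "your-personal-access-token"

# PATs are sent via Basic auth with an empty username
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# The CSV exported from Redmine's Issues list
$issues = Import-Csv "issues.csv"

foreach ($issue in $issues) {
    # Map the Redmine tracker to a VSTS work item type (assumed mapping)
    $type = if ($issue.Tracker -eq "Bug") { "Bug" } else { "Product Backlog Item" }

    # Bugs describe themselves in 'Repro Steps'; PBIs use 'Description'
    $descriptionField = if ($type -eq "Bug") { "Microsoft.VSTS.TCM.ReproSteps" } else { "System.Description" }

    $fields = @(
        @{ op = "add"; path = "/fields/System.Title"; value = $issue.Subject }
        @{ op = "add"; path = "/fields/$descriptionField"; value = $issue.Description }
    )

    # Not all Redmine issues had the 'Assignee' field set
    if ($issue.Assignee) {
        $fields += @{ op = "add"; path = "/fields/System.AssignedTo"; value = $issue.Assignee }
    }

    # Work item create is a PATCH of json-patch operations; note the $ before the type name
    $uri = "https://$account.visualstudio.com/DefaultCollection/$project/_apis/wit/workitems/`$$type`?api-version=1.0"
    Invoke-RestMethod -Uri $uri -Method Patch -Headers $headers `
        -ContentType "application/json-patch+json" -Body (ConvertTo-Json $fields)
}
```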
Notice in the script how we handle different fields for different work item types (eg. Product Backlog Items use the 'Description' field, whereas Bugs use 'Repro Steps'), and optional fields (eg. not all Redmine issues had the 'Assignee' field set).
The full set of fields (and which work item types they apply to) is documented here.
If you have more fields in Redmine that can be mapped to ones in VSTS then go ahead and add them.
I’m really interested in learning more about functional programming. It isn’t something I knew much about, but the benefits of reducing mutability (and shared state) promoted by functional languages and functional style are enticing.
To that end, I recently bought a copy of Isaac Abraham’s new book “Get Programming with F#: A Guide for .NET Developers”.
I have no background in functional languages at all, so I was looking for a “gentle” introduction to the F# language, without getting hung up on a lot of the functional terminology that seems to make learning this stuff a bit impenetrable for the newcomer. This book delivers.
The structure of the book is in 10 “units”, which in turn are broken down into separate “lessons” (each lesson is a separate chapter).
Here are my notes from each unit:
Unit 1 – F# and Visual Studio
- Introduces using the Visual Studio IDE for F# development, and recommended extensions. Surprisingly for a book published in 2018, most of the book is based on using Visual Studio 2015. I can only presume this is an artifact of the time it takes to write a book. I understand the initial release of 2017 did have some tooling regressions for F#, but I am under the impression those are now resolved, seeing as at the time of writing the 7th update for 2017 has just been released, including specific enhancements for F#.
- Throughout the book, comparisons are made to equivalent C# language constructs, and here too, the text is already a bit dated. An unfortunate downside of a printed book I guess.
- One thing to note that is different from many other languages – the file order in F# projects is significant. You can’t reference something before the compiler has seen it, and the compiler processes files in project order.
- The REPL is also a big part of F# development.
Unit 2 – Hello F#
- The ‘let’ keyword is introduced. It’s more like C#’s const than var, seeing as F# defaults to things being immutable rather than mutable.
- Scoping is based on whitespace indentation rather than curly braces.
- Diving into how the F# compiler is much stricter because of the way the F# type system works, and how that can be a good thing.
- A closer look at working with immutable data, and how you can opt in to mutable data when absolutely necessary, and how to handle state.
- C# is statement based, whereas F# likes to be expression based.
- The ‘unit’ type is introduced. It’s kind of like void, but is a way for expressions to always return a value (and means the use of those expressions is always consistent).
Unit 3 – Types and functions
- Tuples, records
- Composing functions, partial functions, pipelines
- How do you organise all these types and functions if you’re not using classes? Organising code through namespaces and modules
Unit 4 – Collections in F#
- Looking at the F#-specific collection types – List, Array and Seq, the functions you can use with those collections. Immutable dictionaries, Map and Sets. Aggregation and fold.
Unit 5 – The pit of success with the F# type system
- Conditional logic in F#, pattern matching
- Discriminated unions
Unit 6 – Living on the .NET platform
- How to use C# libraries from F#. Using Paket for NuGet package management
- How to use F# libraries from C#
Unit 7 – Working with data
- Introducing Type Providers. Specific use cases with JSON, SQL and CSV.
Unit 8 – Web programming
- Asynchronous language support
- Working with ASP.NET WebAPI 2
- Suave – F#-focussed web library
- Consuming HTTP data
Unit 9 – Unit testing
- The role of unit testing in F# applications
- Using common .NET unit testing libraries with F#
- Property-based testing and FsCheck
- Web testing
Unit 10 – Where next?
- Further reading and resources to take your next steps.