Migrating npm packages to Azure Artifacts

Thursday, 25 October 2018

Azure Artifacts is the new name for VSTS Package Management. It's a "one stop shop" for storing NuGet, npm, Maven, Gradle and "Universal" packages.

I'd previously been using another private npm registry server and wanted to shift over to the Azure Artifacts npm registry. As part of this move, I needed to grab the packages currently stored in the old registry and re-publish them to the Artifacts one. (Artifacts does support configuring 'upstream' sources, but that's not really a long-term solution for a migration.) Here's how I did it.

Downloading packages

npm pack "@myscope/mypackage@1.2.3456" --registry http://my.oldnpmserver

You'll now have a file with a name similar to myscope-mypackage-1.2.3456.tgz

Repeat this for all the packages you need.
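To batch this up, a small shell loop works. The package names and versions below are hypothetical placeholders; the echo makes this a dry run, so remove it to actually download:

```shell
# Dry-run sketch: download several packages from the old registry.
# Package names/versions are placeholders - substitute your own.
registry=http://my.oldnpmserver
packages="@myscope/mypackage@1.2.3456 @myscope/otherpackage@2.0.0"

for p in $packages; do
  cmd="npm pack $p --registry $registry"
  echo "$cmd"   # drop the echo to really run npm pack
done
```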

Re-publishing packages


First off, create a new file named .npmrc and enter the details of your Artifacts registry URL. If you have packages with scopes (like mine above), then add those in as well.

@myscope:registry=https://myorg.pkgs.visualstudio.com/_packaging/MyArtifacts/npm/registry/
registry=https://myorg.pkgs.visualstudio.com/_packaging/MyArtifacts/npm/registry/

always-auth=true

Azure Artifacts feeds require authentication, so you'll need credentials. Your options here are to either make use of the vsts-npm-auth tool or generate credentials that can be pasted into the .npmrc file. Click on the Connect to Feed button on the Azure Artifacts page in the DevOps portal to find out the details.
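For reference, the generated credentials end up as extra lines in .npmrc, looking something like the following. Treat this as an illustration only (the exact format the portal generates may differ), with BASE64_ENCODED_PAT standing in for your base64-encoded personal access token:

```
; begin auth token
//myorg.pkgs.visualstudio.com/_packaging/MyArtifacts/npm/registry/:username=myorg
//myorg.pkgs.visualstudio.com/_packaging/MyArtifacts/npm/registry/:_password=BASE64_ENCODED_PAT
//myorg.pkgs.visualstudio.com/_packaging/MyArtifacts/npm/registry/:email=any-value-npm-ignores-this
; end auth token
```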

Now use npm publish to push all the .tgz files up to the Artifacts repository (with a bit of help from PowerShell):

Get-ChildItem *.tgz | ForEach-Object { npm publish $_ }

A Jon Skeet Meetup retrospective

Saturday, 13 October 2018

This week we hosted Jon Skeet at the Adelaide .NET User Group. Jon demonstrated some of the new language features coming in C# 8, and it was probably the biggest attendance we've had in a really, really long time.

Because we had so many registrations, I lined up some extra help from my two oldest kids (conveniently on school holidays). They helped set up the room, liaised with the pizza delivery guy, and helped pack everything up. I think they even found some familiar ground in Jon's use of Fibonacci sequences in one example, and in the similarities between programming languages (my eldest daughter has been doing some Python coding at school).

Jon lives in the UK, and we're in Australia, so whilst we would have loved to have Jon in person, the next best thing was to have him present remotely. We've had remote presentations before using Google Hangouts, but this time I opted to use Skype (and made use of the new recording feature).
I don't use Skype a lot, but we got the call up and running without too much difficulty.

I set up a webcam in the meeting room, along with a boundary microphone (an MXL AC404 USB Conference Microphone). The intention is that the presenter can see the audience, and the boundary microphone allows people to comment and ask questions from a fair way away (e.g. right at the back of the room) and still be heard. This seemed to work pretty well - people up the back were able to ask questions and Jon seemed to hear them ok. Jon's video feed was pretty good. I think our video feed back to Jon might have been a bit jumpy, but for the most part I think it was ok.

I left it to the last minute to arrange for access to the WiFi network at our venue. I ended up using my phone's 4G data for the call, which worked well. It was only after we'd finished that I discovered that an email had come through just before the start of the meeting with the WiFi details. At least I've got them for next time.

We also picked up a meeting sponsor this month in Simon Cook from Encode Talent Management. Their support covered the cost of pizza so we didn't need to charge attendees, and hopefully this relationship will continue in the future.

Snapshot from Skype recording, showing Jon Skeet in top, audience in bottom


The recording of Jon's talk is up on YouTube. I won't be giving up my day job to become a YouTube broadcaster anytime soon, but it's nice to have a record of a great presentation.


Speaking at .NET Conf - Put your C#, VB and F# projects and packaging on a diet

Thursday, 13 September 2018

I'm really excited to have been selected as one of the community speakers for .NET Conf.

Title slide for .NET Conf talk
.NET Conf is a free “virtual” conference organised by Microsoft and the .NET developer community that is streamed live around the world. Being virtual, it means organising travel and accommodation is remarkably easy!

My talk is titled “Put your C#, VB and F# projects and packaging on a diet”. It drills in to the new project system for .NET, and how you can use it even with old projects that target .NET Framework. The talk starts at 04:00 UTC on Friday 14th September (check local times). Go to https://www.dotnetconf.net/ to watch the live stream.

All the demos from my talk and links to other resources can be found in the GitHub repo https://github.com/flcdrg/project-system-diet

Converting a SQL Server .bacpac to a .dacpac

Monday, 27 August 2018

Microsoft SQL Server has two related portable file formats - the DACPAC and the BACPAC. Quoting Data-tier Applications:
A DAC is a self-contained unit of SQL Server database deployment that enables data-tier developers and database administrators to package SQL Server objects into a portable artifact called a DAC package, also known as a DACPAC.
A BACPAC is a related artifact that encapsulates the database schema as well as the data stored in the database.
When they say related, they're not kidding! Both of these file formats are based on the Open Packaging Conventions (a fancy way of saying it's a .zip file with some other bits), and cracking them open you discover that a bacpac file is basically a dacpac with a few extra files and a couple of different settings. Knowing this, it should be possible to manually convert a bacpac to a dacpac.

First, unzip the .bacpac file (using 7-zip, or rename to .zip and use Windows File Explorer’s Extract Archive).

Now do the following actions (you could do these programmatically if this is something you need to do repeatedly):
  1. Edit model.xml
    1. Change the DataSchemaModel element's SchemaVersion attribute to 2.4
  2. Edit Origin.xml
    1. Change ContainsExportedData to false
    2. Change ModelSchemaVersion to 2.4
    3. Remove ExportStatistics
    4. Recalculate the SHA256 checksum for model.xml and update the value stored in Checksums/Checksum[@Uri='/model.xml']
  3. Remove the _rels and Data directories
Now re-zip up the remaining files and change the file suffix back to .dacpac
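The checksum step can be scripted. Here's a sketch of recalculating the value on Linux; a stand-in model.xml is created here for demonstration, so point it at the real extracted file instead. Origin.xml stores the hash as uppercase hex:

```shell
# Stand-in for the extracted bacpac contents (replace with the real directory).
mkdir -p extracted
printf '<DataSchemaModel SchemaVersion="2.4"/>' > extracted/model.xml

# SHA-256 of model.xml, uppercased to match how Origin.xml stores checksums.
checksum=$(sha256sum extracted/model.xml | awk '{print toupper($1)}')
echo "$checksum"

# Paste this value into the Checksum element for /model.xml in Origin.xml,
# then re-zip, e.g.: (cd extracted && zip -r ../converted.dacpac .)
```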

To verify that the .dacpac is valid, try using SSMS with the Upgrade Data-tier Application wizard. Run it against any database; if you can proceed without error to the "Review Upgrade Plan" step, you should be good to go.

Create a temporary file with a custom extension in PowerShell

Monday, 13 August 2018

Just a quick thing I wanted to record for posterity. The trick is using the -PassThru parameter with the Rename-Item cmdlet so that this ends up a one-liner thanks to PowerShell's pipeline:

$tempNuspec = Get-ChildItem ([IO.Path]::GetTempFileName()) | Rename-Item -NewName { [IO.Path]::ChangeExtension($_, ".nuspec") } -PassThru
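For comparison, on Linux the GNU coreutils version of mktemp can do this in a single step (note that --suffix is a GNU option; the BSD/macOS mktemp doesn't support it):

```shell
# Create a temporary file with a custom extension in one step (GNU mktemp).
tempNuspec=$(mktemp --suffix=.nuspec)
echo "$tempNuspec"
```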