Using the Developer PowerShell for Visual Studio with PowerShell 7

Monday, 27 April 2020

One of the nice new features introduced in Visual Studio 2019 16.2 was the Developer PowerShell for VS 2019 - a nice accompaniment to the existing cmd.exe-based Developer Command Prompt for VS 2019.

Windows Start Menu showing Developer PowerShell.

I use PowerShell as much as possible, and for a long time now I’ve made a habit of updating my profile.ps1 so that all the Visual Studio tools are available from the PowerShell command prompt. Previously this required running the old VsDevCmd.bat batch file and capturing the environment variables it set to then bring them into the PowerShell process. You can see an example here.

But now there’s first class support for integrating Visual Studio tooling into your PowerShell environment.

If you take a look at the Windows Start Menu shortcut that’s added, you’ll see it’s defined with a target similar to this:

C:\Windows\SysWOW64\WindowsPowerShell\v1.0\powershell.exe -noe -c "&{Import-Module """C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\Tools\Microsoft.VisualStudio.DevShell.dll"""; Enter-VsDevShell f9f5056f}"

With the release of PowerShell Core 6, and now PowerShell 7, I’m favouring these latest releases of PowerShell over the ‘legacy’ Windows PowerShell 5.1. The problem was that, until recently, the assembly you see referenced in the shortcut above only worked in Windows PowerShell - it wasn’t compatible with PowerShell Core. Pleasingly, this was fixed in Visual Studio 2019 16.5.

So now in your PowerShell 7 profile, you can add:

Import-Module "C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\Tools\Microsoft.VisualStudio.DevShell.dll"
Enter-VsDevShell -InstanceId 9034d7ab

And you’ll get the full DevShell experience:

Windows Terminal with PowerShell 7 and Visual Studio integration

There’s just one catch - notice that InstanceId? It’s unique for every machine. You can either grab the value out of the properties of the Start Menu shortcut, or run vswhere -latest -property instanceId, which will return the instanceId of the newest instance of Visual Studio.
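To avoid hard-coding the InstanceId in a profile that’s shared between machines, the lookup can be scripted. Here’s a sketch that assumes vswhere.exe is in its default installer location and that the DevShell module lives under the reported installation path:

```powershell
# Sketch for profile.ps1: look up the instance id at runtime rather than hard-coding it.
# Assumes vswhere.exe is in its default location (installed with VS 2017 15.2 and later).
$vswhere = "${env:ProgramFiles(x86)}\Microsoft Visual Studio\Installer\vswhere.exe"
if (Test-Path $vswhere) {
    $instanceId  = & $vswhere -latest -property instanceId
    $installPath = & $vswhere -latest -property installationPath
    Import-Module (Join-Path $installPath 'Common7\Tools\Microsoft.VisualStudio.DevShell.dll')
    Enter-VsDevShell -InstanceId $instanceId
}
```

The Test-Path guard means the same profile still loads cleanly on machines without Visual Studio installed.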

4 doesn't go into 3 part 2

Sunday, 26 April 2020

Last time, we’d figured out a strategy for getting our data up into Azure Artifacts. The problem was that we are using SemVer pre-release notation, and the Azure Artifacts command line and Azure Pipelines task don’t support querying pre-release versions using wildcards.

The CLI tools and Pipelines task will work if we know the exact pre-release version required, but how to find that out? The Azure DevOps Services REST API.

Here’s an example PowerShell script you could use to find out the latest version of a package named ‘mypackage’. It sets a pipeline variable that can be used in subsequent Azure Pipeline tasks.

# Query the Azure Artifacts feed for packages matching 'mypackage'
$url = "https://feeds.dev.azure.com/{organization}/{project}/_apis/packaging/Feeds/{feedId}/packages?api-version=5.1-preview.1&packageNameQuery=mypackage"

# Authenticate with the pipeline's access token ($env:SYSTEM_ACCESSTOKEN)
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f "",$env:SYSTEM_ACCESSTOKEN)))
$result = Invoke-RestMethod -Uri $url -Method Get -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}

# By default, only the latest version of each matching package is returned
$packageVersion = $result.value.versions.normalizedVersion
# Make the version available to subsequent pipeline tasks
Write-Host "##vso[task.setvariable variable=packageVersion]$packageVersion"

We’re making use of the packageNameQuery parameter to filter by the package name, but there are other filtering options available. You could also do more filtering on the JSON data that the REST API returns.
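For example, continuing from the script above, you could pick out the exact package by name and use each version’s isLatest flag - a sketch, assuming the response shape described in the REST API documentation:

```powershell
# Sketch: tighten up the selection in case packageNameQuery matched more than one
# package (the query is a 'contains' match, so 'mypackage2' would also come back).
# Property names (name, versions, isLatest, normalizedVersion) follow the REST API response.
$package = $result.value | Where-Object { $_.name -eq 'mypackage' }
$packageVersion = ($package.versions | Where-Object { $_.isLatest }).normalizedVersion
Write-Host "##vso[task.setvariable variable=packageVersion]$packageVersion"
```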

Now we can use the Universal Package task, requesting the specific version. eg.

# Download Universal Package
steps:
- task: UniversalPackages@0
  displayName: 'Universal download'
  inputs:
    downloadDirectory: Application
    vstsFeed: '00000000-0000-0000-0000-000000000000/00000000-0000-0000-0000-000000000001'
    vstsFeedPackage: imagemagick
    vstsPackageVersion: '$(packageVersion)'

4 doesn't go into 3

Saturday, 25 April 2020

There’s different ways of versioning things. Windows has had a 4-part version scheme that is used by itself and most Windows applications. The convention is <major version>.<minor version>.<build number>.<revision>. .NET supports this through classes like AssemblyVersionAttribute and Version.

SemVer is a 3-part scheme (MAJOR.MINOR.PATCH) with optional pre-release or metadata that has seen wide industry adoption, especially by package management tools (npm, NuGet etc).

Using SemVer-versioned components in an ecosystem that uses 4-part versioning is pretty straightforward. Just tack an extra “.0” on the end if it really wants to see 4 parts.

On the other hand, using 4-part versions in a SemVer ecosystem is hard. You can’t just drop one of the parts unless you know with certainty that it will never, ever be significant.

And that’s the problem. How do you preserve information from 4 things when you’ve only got 3 places to put it?

I hit this problem recently when looking to push some versioned data that’s generated on-prem up to an Azure Pipeline so it could be used in the build process. I decided that Azure Artifacts would be a good way to get the data safely into a place where the Pipeline could access it.

Azure Artifacts has built-in support for a number of package formats (NuGet, npm, Maven and others), but this data wasn’t any of those, so the ‘Universal Package’ type seemed most appropriate. But as I was about to discover, Universal Packages are versioned with SemVer 2 (and no metadata allowed).

As the third part of the 4-part version number was zero, I was thinking I could just drop that to convert to a 3-part version number. But after some investigation it turns out that assumption was wrong - the third part can change.

I’m not the first to encounter this problem, and the final comment on that article by Mitch Denny (an Australian developer who I’m honoured to know, and who was the Program Manager for Azure Artifacts at the time) gives an interesting workaround.

1.2.3.4 could be encoded in SemVer as 1.2.3-4

I’m more used to seeing the pre-release with text (eg. 1.2.3-beta.4), but reviewing the SemVer site (point 9), they actually give 1.0.0-0.3.7 as an example, so this should be ok. It is technically a ‘pre-release’ version, but it has preserved all the data.
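That encoding is easy to automate. Here’s a quick sketch (the function names are my own) that round-trips a 4-part version through the 1.2.3-4 form:

```powershell
# Encode a 4-part Windows version as a SemVer pre-release version, and decode it again.
# Note SemVer forbids leading zeros in numeric pre-release identifiers, which
# [version] parsing never produces anyway.
function ConvertTo-SemVerString([version]$v) {
    '{0}.{1}.{2}-{3}' -f $v.Major, $v.Minor, $v.Build, $v.Revision
}
function ConvertFrom-SemVerString([string]$s) {
    $core, $pre = $s -split '-', 2
    [version]"$core.$pre"
}
ConvertTo-SemVerString ([version]'1.2.3.4')   # 1.2.3-4
```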

There’s one final gotcha, and it relates to the version being pre-release.

I mentioned earlier that I was going to use the Universal Package in an Azure Pipeline. To download the package in the pipeline, you use the Universal Package task.

The problem is that this task, like the CLI tooling, doesn’t support pre-release versions using wildcards. This is explicitly called out in the documentation for the Universal Packages Quickstart: “Wildcard expressions do not currently support pre-release versions. It is not possible to get the latest pre-release version of a package.”

Next step: see if I can use the REST API to obtain the version and pass that to the Pipelines task.

Continued in part 2

Moving a user group online

Sunday, 19 April 2020

The Adelaide .NET User Group has been meeting in various incarnations since waaaaay back in 1993. The group has been through changes of name, venue and leadership, but 2020 and specifically COVID-19 has presented a new challenge to the continuity of the group.

Our last in-person meeting was in early March, and at that time COVID was in the news but there were no restrictions on public gatherings. I did however rush out to buy some hand sanitiser for people to use before eating their pizza. Concern was growing, and in fact that same day my friend and fellow MVP Rob decided to call off the Adelaide Data and Analytics User Group meeting that was scheduled for the following week. (By that weekend it had become clear that Rob was on the money).

April’s meeting had already been planned, but it had quickly become apparent that an in-person meeting would not be possible. The choice was to either shut down the group for the duration, or find a way to go virtual. I was keen to explore the latter!

Our group (like Rob’s) had hosted remote presenters before (not everyone is lucky enough to visit Adelaide in person), but this would involve everyone being remote which was a big change.

I did a lot of research into viable platforms to run the meeting on - reading blog posts and comments on Twitter, and listening to and asking questions of my fellow Microsoft MVPs in the ANZ region. Zoom was popular in the community, but it was around that time that it was getting all the wrong kind of press. Microsoft Teams was a service that I was familiar with (having used it for work and also for the recent MVP Summit). The aspect of Teams that I decided to use for our first fully virtual meeting was Microsoft Teams Live Events.

What swayed me to choose Live Events was that it had a fully-anonymous option to run public events which I felt would lower the barrier to entry for participation. I have a complimentary Office 365 tenancy provided as part of being a Microsoft MVP, so I made use of this to host the event.

My presenter for April was Andrew Best (who also happens to be my co-organiser for DDD Adelaide). We scheduled a test run the week before to try out the technology and also involved Simon Cook to sit in the ‘audience’ seat. For the test we actually tried out Teams Live Events and Teams Meetings, and from that we felt comfortable choosing Live Events.

Our group has been using Meetup.com for a number of years now to promote our meetings. They recently introduced the ability to add a link to online meetings. The problem I discovered was that they’d limited the link field to 250 chars, and a Teams Live Event URL was around 270! Using a URL shortener didn’t work either (I tried) as they were parsing the text to limit it to a fixed list of virtual event providers. I reported this to Meetup.com and pleasingly they increased the field size a few days before our event.

Show time

David ready to go

I was all set up (including bringing in some extra lighting) and 6pm was rapidly approaching. I then got a message from Andrew - he was having trouble connecting to the event! But we’d tested it the week before, so what had changed? As it happens, Andrew had changed jobs in that week, and it seems Teams was trying to use his old credentials.

Rebooting and restarting Teams wasn’t helping. Some frantic Googling turned up this post, and applying the suggested solution unblocked Andrew. It did mean we started a little later than planned, but we didn’t have to cancel. To make sure attendees realised we were running late, I hit the ‘Start’ button on the Live Event and let everyone know to sit tight while we troubleshot the problem (rather than have them think it wasn’t on).

Producing the event

Once the introduction and updates were out of the way, I handed over to Andrew to do his presentation. I was in the Producer seat, so I could choose whether to show Andrew’s screen share, his webcam, or both. For most of the talk I left it on his screen, but as we were taking questions during the talk, I tried to switch to a screen+webcam view for those.

You can see in the photo above how you can queue up the next layout on the left. When you’re ready to change the live stream you click on the Send Live button. The layout options are very basic compared to something like OBS, but they did the job nicely.

While the event was on we seemed to peak at around 47 concurrent viewers. Afterwards I downloaded the event usage report which said we had 114 unique people connect in that day. I think the difference could be some people might have connected in via different devices (or app vs web browser), but still 114 sounds impressive!

Post-event

The recording of the live event can be downloaded as an MPEG4 video file. My eldest daughter has access to Adobe’s Creative Cloud so I asked her if she could edit the video for me in Premiere (removing the big delay at the start). I uploaded two resulting videos to YouTube. The first was my ‘welcome, news and updates spiel’ that I normally do at the start of our events, the second was Andrew’s presentation.

The recordings are now published in the ADNUG YouTube channel.

What worked

  1. We have an Adelaide developer community Slack (called HeapsGoodDev - ‘Heaps Good’ being a particularly South Australian expression) that our group participates in. I’d promoted our Slack channel as a great place to have side and post-event conversations, and also as a means to support anyone who was having trouble connecting.
  2. Live Events worked just as it should have.

What to watch out for

  1. A Teams Live Event can only be started once - and once started, if you stop it, that’s it; there’s no option to restart.
  2. There’s a decent lag between the presentation and what viewers see - around 20-30 seconds, so take that into account if you’re asking for responses from attendees during the talk.
  3. Live Events have a Q&A facility which allows attendees to ask a question. A moderator can then answer privately or choose to publish the question to everyone. It’s a good way to manage questions, particularly from a larger group, but it isn’t the same as an online chat - attendees can’t chat with each other.
  4. As we found out, Teams Live Events doesn’t currently support sharing system sounds (unlike Teams Meetings).
  5. Live Events starts the recording automatically as soon as you hit Start to begin the stream.

What’s next

We’ve already got our next meeting planned, and will use Live Events for that.

I’m also thinking of running some shorter lunchtime meetings, and might try using Teams Meetings for these to make them a bit more informal.

One of the recommendations from the ANZ MVPs was to get your group its own Office 365 tenancy (rather than rely on an employer’s or MVP’s). I’d like to get the group our own Office 365 Business Essentials subscription.

EDITED TO ADD (21st April 2020): Office 365 Business Essentials includes Teams Meetings, but does not include Live Events. It looks like you’ll need at least an Enterprise E1 license for that.

Downloading an Azure VM

Friday, 3 April 2020

Yesterday I needed to get a copy of a virtual machine onto my local workstation. As I’m now working from home, that was going to mean downloading a lot of data, but first I had to find the VM. I remembered I had exported this particular VM up into Azure at one stage to experiment with using different hardware specs to find out how that would affect performance.

Lucky for me, the VM was still there (though de-allocated to reduce costs). Usually you want to migrate a VM up into the cloud, but I needed to go the other way! So how do you get a copy of that VM? It turns out it isn’t that tricky:

  1. Make sure the VM is shut down (mine was)
  2. Open up the VM in the Azure Portal
  3. Under Settings, click on Disks
  4. Click on the individual disk (if you have more than one, you’ll need to repeat the next few steps)
  5. Under Settings, click on Disk Export
  6. You’re prompted to enter a URL expiry time. The default is 3600 seconds (1 hour). If you have limited bandwidth, you should make this larger, otherwise your download may fail. I set mine to 36000 (10 hours)
  7. Click Generate URL and a URL will be displayed

Azure virtual machine disk export
  8. Download the .vhd file for this disk. Mine was 80GB and it took all day. It also failed a number of times, but I was able to restart the download and it did continue on from where it left off.
  9. The download defaulted to calling the file abcd, but it is a VHD file, so just rename the file to something useful.
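If you’d rather script this than click through the portal, the same export can be done with the Azure CLI, and AzCopy copes better with a flaky connection than a browser download (it can resume interrupted transfers). A sketch - the resource group and disk names here are made up, and it assumes az and azcopy are installed:

```powershell
# Generate a SAS URL for the (hypothetical) disk, valid for 10 hours
$sas = az disk grant-access --resource-group my-rg --name my-vm-osdisk `
    --duration-in-seconds 36000 --query accessSas --output tsv

# AzCopy downloads the underlying page blob and can resume a failed transfer
azcopy copy $sas 'C:\Temp\my-vm-osdisk.vhd'
```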