• GitHub Releases

    I want to make creating a release as simple as possible. Some projects adopt a continuous delivery approach where every commit to the main branch generates a new release. In my case, though, I prefer to let commits accrue until I decide that a new release should be created.

    My requirements for creating a new release include the following:

    • Only run on the main branch
    • Provide a useful set of release notes, outlining all changes since the last release
    • Attach vsix binaries from the automated build
    • Publish vsix to the Visual Studio Marketplace so it becomes available for users to install/upgrade


    The main.yml file in the Show Missing project is split into two jobs - build and update_release_draft. The latter job only runs when we’re building the main branch.

    The second workflow is publish.yml, which is run after a non-draft release is created.

    Update release draft job

    This job has an if: clause so that it only runs when we’re building the master branch.

        update_release_draft:
          name: Update release draft
          runs-on: ubuntu-latest
          needs: [build]
          if: github.ref == 'refs/heads/master'

    Release notes

    Browsing GitHub Actions, there are a few that help with release notes. I chose Release Drafter. It creates a draft release (automatically generating the release name based on the version) and each time it runs, it reviews the list of commits since the last release and generates formatted release notes. It is smart enough to update the draft release on subsequent runs.

    Release Drafter calls the GitHub APIs, so we need to set GITHUB_TOKEN.

    I use Nerdbank.GitVersioning to manage version numbers. I use the full notation to access the calculated version number from the previous build job.
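
    For context, here’s a sketch of how the build job might surface that value as a job output. The step id and the echo step are made up for this example; GitBuildVersionSimple is the variable that Nerdbank.GitVersioning calculates during the build.

```yaml
# Hypothetical sketch - exposing the NB.GV-calculated version as a job output.
# The step id 'version' is illustrative; GitBuildVersionSimple is the
# environment variable set by Nerdbank.GitVersioning during the build.
jobs:
  build:
    runs-on: windows-latest
    outputs:
      GitBuildVersionSimple: ${{ steps.version.outputs.GitBuildVersionSimple }}
    steps:
      # ... checkout and build steps here ...
      - name: Capture build version
        id: version
        run: echo "::set-output name=GitBuildVersionSimple::$env:GitBuildVersionSimple"
```

    With the output declared on the job, any downstream job that lists build in its needs: can read it via the needs.build.outputs context, as the release-drafter step below does.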

          - uses: release-drafter/release-drafter@v5
            id: create_release
            env:
              GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
            with:
              version: ${{ needs.build.outputs.GitBuildVersionSimple }}

    Here are the release notes (edited for this blog post) to show the kind of formatting that Release Drafter provides. It can use labels to group the issues under the different headings.

    Release Notes

    You configure Release Drafter by adding a file named .github/release-drafter.yml. Mine contains the following:

    name-template: 'v$RESOLVED_VERSION'
    tag-template: 'v$RESOLVED_VERSION'
    categories:
      - title: '🚀 Features'
        labels:
          - 'feature'
          - 'enhancement'
      - title: '🐛 Bug Fixes'
        labels:
          - 'fix'
          - 'bugfix'
          - 'bug'
      - title: '🧰 Maintenance'
        label: 'chore'
    change-template: '- $TITLE @$AUTHOR (#$NUMBER)'
    version-resolver:
      major:
        labels:
          - 'major'
      minor:
        labels:
          - 'minor'
      patch:
        labels:
          - 'patch'
      default: patch
    template: |
      ## Changes

      $CHANGES

    Release assets

    The Upload a Release Asset action is used to append the vsix from the build to the draft release.

          - name: Upload Release Asset
            id: upload-release-asset
            uses: actions/upload-release-asset@v1
            env:
              GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
            with:
              upload_url: ${{ steps.create_release.outputs.upload_url }} # This pulls from the CREATE RELEASE step above, referencing its ID to get its outputs object, which includes an `upload_url`. See this blog post for more info: https://jasonet.co/posts/new-features-of-github-actions/#passing-data-to-future-steps
              asset_path: ./bin/Release/Gardiner.VsShowMissing.VS2019.vsix
              asset_name: Gardiner.VsShowMissing.VS2019.vsix
              asset_content_type: application/octet-stream

    Publishing to the marketplace

    The publish.yml workflow triggers after the draft release is published (changes to non-draft).
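
    If you haven’t used a release-triggered workflow before, the trigger at the top of publish.yml looks something like this (a minimal sketch, showing only the on: block):

```yaml
# Sketch: trigger the workflow only when a release is published.
# The 'published' activity type fires when a draft release is published
# (or a non-draft release is created) - not when the draft is first created.
on:
  release:
    types: [published]
```
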

    It first grabs a copy of the vsix file that was attached to the release that triggered this workflow.

          - name: Download Assets
            uses: i3h/download-release-asset@v1
            with:
              owner: ${{ github.event.repository.owner.login }}
              repo: ${{ github.event.repository.name }}
              tag: ${{ github.event.release.tag_name }}
              file: Gardiner.VsShowMissing.VS2019.vsix
              token: ${{ secrets.GITHUB_TOKEN }}

    It then locates VsixPublisher.exe and runs it to publish the vsix to the Marketplace.

          - name: Script
            env:
              PersonalAccessToken: ${{ secrets.PersonalAccessToken }}
            run: |
              # Find VsixPublisher
              $Installation = & "${env:ProgramFiles(x86)}\Microsoft Visual Studio\Installer\vswhere.exe" -latest -format json | ConvertFrom-Json
              $Path = $Installation.installationPath
              Write-Host $Path
              $VsixPublisher = Join-Path -Path $Path -ChildPath "VSSDK\VisualStudioIntegration\Tools\Bin\VsixPublisher.exe" -Resolve
              & $VsixPublisher publish -payload ".\Gardiner.VsShowMissing.VS2019.vsix" -publishManifest ".\build\extension-manifest.json" -personalAccessToken $env:PersonalAccessToken -ignoreWarnings "VSIXValidatorWarning01,VSIXValidatorWarning02,VSIXValidatorWarning08"

    Creating a new release

    After enough changes have been made, it’s time to publish a new release!

    1. Browse to the Releases page.
    2. A draft release is shown. Click on the Edit button.
    3. Review (and optionally edit) the release notes.
    4. If you’re happy to proceed, click on Publish release.
    5. The publish workflow is automatically triggered.

    The release is now public (no longer in draft) and GitHub has attached additional files to it.

    Latest release

    Reviewing the Visual Studio Marketplace, you can see that the new vsix has been submitted and is being processed before being made available to the general public.



  • Dependabot

    Keeping dependencies up to date is useful. Even more so if the dependency has a security fix.

    I’ve been using Dependabot for a while now - initially with the preview integration, but now that Dependabot is part of GitHub (complete with a name change to ‘GitHub Dependabot’) the integration is even better.

    All you need to do is add a file named .github/dependabot.yml, and Dependabot integration will be enabled for your repository.

    Here’s the dependabot.yml file for Show Missing:

    version: 2
    updates:
      - package-ecosystem: nuget
        directory: "/"
        schedule:
          interval: daily
          time: '19:30'
        open-pull-requests-limit: 10
        assignees:
          - flcdrg

    It specifies the following:

    • Look for NuGet packages
    • Based in the root directory
    • Check for updates daily at 7.30pm (UTC)
    • Limit to 10 pull requests
    • Assign those pull requests to me (flcdrg)

    Dependabot will create a pull request to update each outdated dependency. If release notes are available, it will populate the pull request with those details, as well as the commit history between the old version and the new one.

    Dependabot-generated pull request

    There’s comprehensive documentation for using Dependabot on the GitHub Docs site, including many more configuration options.

    I let Dependabot create the pull requests, but I still decide whether to approve each one (or not). You could even hook up a GitHub Action to auto-merge your Dependabot pull requests!
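
    As a sketch of that last idea (this workflow is an illustration, not something from my repository), the GitHub CLI can enable auto-merge on Dependabot pull requests:

```yaml
# Hypothetical sketch: auto-merge Dependabot PRs via the GitHub CLI.
# Assumes auto-merge is enabled on the repository, with branch protection
# rules so the PR only merges once required checks pass.
name: Dependabot auto-merge
on: pull_request

jobs:
  automerge:
    # Only act on PRs raised by Dependabot itself
    if: github.actor == 'dependabot[bot]'
    runs-on: ubuntu-latest
    steps:
      - run: gh pr merge --auto --merge "$PR_URL"
        env:
          PR_URL: ${{ github.event.pull_request.html_url }}
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```
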

    Azure Pipelines

    The interesting thing about Dependabot is that the core engine is open source and hosted on GitHub as well. Andrew Craven has created an example of using the Dependabot engine with Azure DevOps. I’m not sure if he’s still updating that repo, but you might find some of the pull requests I’ve submitted there useful.

    You don’t get all the @dependabot bot behaviour like you see on GitHub (as that’s built on top of the core). I guess if you were keen you could build that functionality too!

    I’ve used his code to generate pull requests on some repositories hosted in Azure DevOps and then used Service Hooks to trigger some code in an Azure Function to update the pull requests to set auto-complete and assign a work item.

  • GitHub Action caching

    I’m always interested in making builds faster!

    If your builds run on self-hosted runners then you can persist files between builds, so caching is of limited value (or may even make builds slower). However, when using a GitHub-hosted runner (build agent), every build gets a brand new VM. It can take a while for dependencies to be restored (e.g. NuGet, npm or similar), and this has to happen every time a build runs. Being able to cache these dependencies and restore them quickly can potentially make a big difference.

    I’ve started adding the Cache task to my Azure Pipelines builds where I can. The equivalent for GitHub Actions is the Cache action.

    These both work in a similar way. You indicate a path whose contents you want to cache for future builds, and a key which is used to determine when the cache is stale.

    Here’s the cache action that I’m using for my Show Missing extension.

        - uses: actions/cache@v2
          with:
            path: ${{ github.workspace }}/.nuget/packages
            key: ${{ runner.os }}-nuget-${{ hashFiles('**/packages.lock.json') }}
            restore-keys: |
              ${{ runner.os }}-nuget-

    The first time you run a build with caching enabled, it won’t appear to run any faster. In fact it might take slightly longer, as just before the build completes, the cache action bundles up all the files underneath the specified path and saves them.

    Subsequent builds will then download and restore the dependencies. Because this is done efficiently (one tar.gz file to download and extract, and the cache presumably lives relatively close to the runner VM), it will usually be a lot faster than relying on the normal package restore process.

    For NuGet packages, you need a cache key based on files that change whenever the cache should be updated. Whilst Visual Studio extensions don’t yet support the new ‘SDK-style’ project format, you can still make use of PackageReference, and if you use nuget.exe 4.9 or above, then you can create and use packages.lock.json files. If I hadn’t updated to PackageReference, then hashing the old packages.config would probably work just as well.
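
    For reference, opting in to lock files for a PackageReference project is a single MSBuild property (this is the standard NuGet property, not something specific to my project):

```xml
<!-- In the .csproj: ask NuGet to generate and honour packages.lock.json -->
<PropertyGroup>
  <RestorePackagesWithLockFile>true</RestorePackagesWithLockFile>
</PropertyGroup>
```

    The next restore then writes a packages.lock.json file alongside the project, which is exactly the file the hashFiles() expression in the cache key above watches for changes.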

    Here’s a build with no caching. It took 1m 54s.

    Build without cache

     Committing restore...
    Generating MSBuild file D:\a\VsShowMissing\VsShowMissing\VS2019\obj\VS2019.csproj.nuget.g.props.
    Generating MSBuild file D:\a\VsShowMissing\VsShowMissing\VS2019\obj\VS2019.csproj.nuget.g.targets.
    Writing assets file to disk. Path: D:\a\VsShowMissing\VsShowMissing\VS2019\obj\project.assets.json
    Restored D:\a\VsShowMissing\VsShowMissing\VS2019\VS2019.csproj (in 8.74 sec).
    NuGet Config files used:
        C:\Program Files (x86)\NuGet\Config\Microsoft.VisualStudio.Offline.config
        C:\Program Files (x86)\NuGet\Config\Xamarin.Offline.config
    Feeds used:
        C:\Program Files (x86)\Microsoft SDKs\NuGetPackages\
        45 package(s) to D:\a\VsShowMissing\VsShowMissing\Gardiner.VsShowMissing\Gardiner.VsShowMissing.csproj
        109 package(s) to D:\a\VsShowMissing\VsShowMissing\VS2019\VS2019.csproj

    The first time we add the cache (2m)

    First build with cache

    The cache action logs that there is currently nothing to restore:

    Run actions/cache@v2
    Cache not found for input keys: Windows-nuget2-3881b0e254e4b0c4e40edd9efa8c26dfd9f5c93c42dad979f6c5869b765a72d0, Windows-nuget2-

    But notice there’s a second post-build step for the cache. Files to be cached are added to a tar file, which is then saved:

    Post Run actions/cache@v2
    Cache saved successfully
    Post job cleanup.
    C:\windows\System32\tar.exe -z -cf cache.tgz -P -C d:/a/VsShowMissing/VsShowMissing --files-from manifest.txt
    Cache saved successfully

    And now subsequent builds use the cache.

    Second build with cache

    You can see the cache action does a restore:

    Run actions/cache@v2
    Cache Size: ~116 MB (122108071 B)
    C:\windows\System32\tar.exe -z -xf d:/a/_temp/50db9096-3e6c-4681-8753-3e12e33854f1/cache.tgz -P -C d:/a/VsShowMissing/VsShowMissing
    Cache restored from key: Windows-nuget2-3881b0e254e4b0c4e40edd9efa8c26dfd9f5c93c42dad979f6c5869b765a72d0

    and the output from the nuget restore is a bit different:

     MSBuild auto-detection: using msbuild version '' from 'C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\MSBuild\Current\bin'.
    Restoring packages for D:\a\VsShowMissing\VsShowMissing\VS2019\VS2019.csproj...
    Restoring packages for D:\a\VsShowMissing\VsShowMissing\Gardiner.VsShowMissing\Gardiner.VsShowMissing.csproj...
    Committing restore...
    Committing restore...
    Generating MSBuild file D:\a\VsShowMissing\VsShowMissing\VS2019\obj\VS2019.csproj.nuget.g.props.
    Generating MSBuild file D:\a\VsShowMissing\VsShowMissing\Gardiner.VsShowMissing\obj\Gardiner.VsShowMissing.csproj.nuget.g.props.
    Generating MSBuild file D:\a\VsShowMissing\VsShowMissing\VS2019\obj\VS2019.csproj.nuget.g.targets.
    Generating MSBuild file D:\a\VsShowMissing\VsShowMissing\Gardiner.VsShowMissing\obj\Gardiner.VsShowMissing.csproj.nuget.g.targets.
    Writing assets file to disk. Path: D:\a\VsShowMissing\VsShowMissing\VS2019\obj\project.assets.json
    Writing assets file to disk. Path: D:\a\VsShowMissing\VsShowMissing\Gardiner.VsShowMissing\obj\project.assets.json
    Restored D:\a\VsShowMissing\VsShowMissing\VS2019\VS2019.csproj (in 844 ms).
    Restored D:\a\VsShowMissing\VsShowMissing\Gardiner.VsShowMissing\Gardiner.VsShowMissing.csproj (in 845 ms).
    NuGet Config files used:
        C:\Program Files (x86)\NuGet\Config\Microsoft.VisualStudio.Offline.config
        C:\Program Files (x86)\NuGet\Config\Xamarin.Offline.config
    Feeds used:
        C:\Program Files (x86)\Microsoft SDKs\NuGetPackages\

    But wait… that build took 2m 4s! What gives? That’s slower than the first time!

    Yeah, that is odd. So a couple of thoughts:

    • Do measure if adding a cache actually makes a difference.
    • The speed of the runner VMs does vary a bit. In that last run, notice that the restore was slightly faster but the build step was quite a bit slower.
    • It’s possible you’d get better results with SDK-style projects.

    I have a theory that in my case there aren’t a huge number of dependencies, so the time saved by downloading them separately isn’t dramatically different from the cache restoring them all. But when you have a lot of dependencies (NPM packages are likely to be a good example), or the download speed of all those dependencies is limited, then a single large download vs lots of smaller separate downloads should give a definite advantage.