The church of the future (from 1995)

Saturday, 12 September 2020

Way, way back in 1995 I’d been a part of a team that published a newspaper leading up to (and during) that year’s National Christian Youth Convention (aka NCYC 95). After the convention had finished, the New Times newspaper (which had been supporting us) asked us to continue writing pieces.

Recently my Dad gave me a collection of articles from these publications that he’d kept. This one stood out as being surprisingly relevant! If you know me well, you’ll be familiar with my unusual sense of humour, and I enjoyed sprinkling this into my written work.

Extract from New Times August 1995

The full text from my article, first published in the August 1995 edition of New Times follows:

YAWN, I don’t think I’ll ever get used to getting up early for church. It’s good to catch up with friends. But worship is the real reason we get together each week.

Of course our church, like most, is “virtual”. We all meet via our computer communications link. Sure, you miss out on actually being with other people. But there are advantages.

You can attend your own church, no matter where you are in the world. It allows people with disabilities to participate as much as anyone. What I consider the biggest advantage is that you can “tune-out” those noisy kids!

On reflection, I guess modern church life has changed a lot over the years.

Today, we had communion. I think my replicator is on the blink, because the wine/grape juice was an uncharacteristic luminescent green.

The minister’s message was pretty good. I do appreciate being able to fast forward over the boring bits. I don’t know how people used to cope when they actually had to sit through a whole sermon.

The use of hypertext scripture readings, multimedia and 3D real-time computer animation is commonplace in the sermons of today. They certainly add a new dimension to understanding the Bible in today’s society.

Like most churches, we are often struggling with our regular giving. Accepting all major credit cards has helped, though. But I’m not so sure about the floating of our church on the stock market. Next thing you know the CPI will stand for the “Consumer Prayer Index”.

And another thing. Call me old fashioned, but I do prefer those tried and true choruses - I can’t relate to all these modern techno-sampled tunes we have in church now.

Well, I guess things are always changing - technologies, language, people. But God never changes. God’s still as relevant today as in the 1990s.

Given what’s happened just this year, I’d say most of my predictions have been pretty close to the mark.

In the garden - August 2020

Sunday, 9 August 2020

August, and winter is hanging around. It’s been pretty cold overnight and first thing in the morning. Today was overcast but there has been the odd day where the sun comes out and warms you up a little.

I do enjoy it when the jonquils and daffodils appear. Splashes of colour that have lain forgotten for most of the year. Here are a few growing in our garden. I don’t know the names of all the varieties. Some we’ve planted but many were already in the garden when we moved here.

Daffodil - yellow and orange

Daffodil - pale white and yellow

Daffodil - pale white and frilly yellow

Daffodil - small yellow

Daffodil - small white and yellow

Daffodil - small white and orange

Jonquil - white

GitHub Releases

Thursday, 23 July 2020

I want to make creating a release as simple as possible. Some projects might adopt a continuous delivery approach where every commit to the main branch generates a new release. In my case, though, I want to allow commits to accrue until I decide that a new release should be created.

My requirements for creating a new release include the following:


The main.yml file in the Show Missing project is split into two jobs - build and update_release_draft. The latter job only runs when we’re building the master branch.

The second workflow is publish.yml, which is run after a non-draft release is created.
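In GitHub Actions terms, that means the workflow is triggered by the release event. Here’s a minimal sketch of what the trigger section of such a publish.yml can look like (the published event type is standard GitHub Actions syntax; my actual file may contain more than this):

```yaml
# Run this workflow whenever a release is published (i.e. it is no longer a draft)
on:
  release:
    types: [published]
```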

Update release draft job

This job has an if: clause that means it only runs when we’re building the master branch.

  update_release_draft:
    name: Update release draft
    runs-on: ubuntu-latest
    needs: [build]
    if: github.ref == 'refs/heads/master'

Release notes

Browsing GitHub Actions, there are a few that help with release notes. I chose Release Drafter. It creates a draft release (automatically generating the release name based on the version) and each time it runs, it reviews the list of commits since the last release and generates formatted release notes. It is smart enough to update the draft release on subsequent runs.

Release Drafter calls GitHub APIs so we set GITHUB_TOKEN.

I use Nerdbank.GitVersioning to manage version numbers. I use the full notation to access the calculated version number from the previous build job.

      - uses: release-drafter/release-drafter@v5
        id: create_release
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          version: ${{ }}
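For completeness, here’s a sketch of how the build job can expose the calculated version as an output for a later job to consume. The use of the dotnet/nbgv action, the step id `nbgv`, and the output name `SimpleVersion` are assumptions for illustration rather than my exact configuration:

```yaml
jobs:
  build:
    runs-on: windows-latest
    outputs:
      # Make the Nerdbank.GitVersioning result visible to downstream jobs
      version: ${{ steps.nbgv.outputs.SimpleVersion }}
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0 # nbgv needs full git history to calculate a version
      - uses: dotnet/nbgv@master
        id: nbgv
```

A downstream job that declares `needs: [build]` can then reference the value as `${{ needs.build.outputs.version }}`.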

Here are the release notes (edited for this blog post) to show the kind of formatting that Release Drafter provides. It can use labels to group the issues under the different headings.

Release Notes

You configure Release Drafter by adding a file named .github/release-drafter.yml. Mine contains the following:

name-template: 'v$RESOLVED_VERSION'
tag-template: 'v$RESOLVED_VERSION'
categories:
  - title: '🚀 Features'
    labels:
      - 'feature'
      - 'enhancement'
  - title: '🐛 Bug Fixes'
    labels:
      - 'fix'
      - 'bugfix'
      - 'bug'
  - title: '🧰 Maintenance'
    label: 'chore'
change-template: '- $TITLE @$AUTHOR (#$NUMBER)'
version-resolver:
  major:
    labels:
      - 'major'
  minor:
    labels:
      - 'minor'
  patch:
    labels:
      - 'patch'
  default: patch
template: |
  ## Changes

  $CHANGES

Release assets

The Upload a Release Asset action is used to append the vsix from the build to the draft release.

      - name: Upload Release Asset
        id: upload-release-asset
        uses: actions/upload-release-asset@v1
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          upload_url: ${{ steps.create_release.outputs.upload_url }} # References the create_release step above by its id to get its outputs object, which includes an `upload_url`
          asset_path: ./bin/Release/Gardiner.VsShowMissing.VS2019.vsix
          asset_name: Gardiner.VsShowMissing.VS2019.vsix
          asset_content_type: application/octet-stream

Publishing to the marketplace

The publish.yml workflow triggers after the draft release is published (changes to non-draft).

First it grabs a copy of the vsix file that was attached to the release that triggered this workflow.

      - name: Download Assets
        uses: i3h/download-release-asset@v1
        with:
          owner: ${{ github.event.repository.owner.login }}
          repo: ${{ }}
          tag: ${{ github.event.release.tag_name }}
          file: Gardiner.VsShowMissing.VS2019.vsix
          token: ${{ secrets.GITHUB_TOKEN }}

It then locates VsixPublisher.exe and runs it to publish the vsix to the marketplace.

      - name: Script
        env:
          PersonalAccessToken: ${{ secrets.PersonalAccessToken }}
        run: |
          # Find VsixPublisher
          $Installation = & "${env:ProgramFiles(x86)}\Microsoft Visual Studio\Installer\vswhere.exe" -latest -format json | ConvertFrom-Json
          $Path = $Installation.installationPath

          Write-Host $Path
          $VsixPublisher = Join-Path -Path $Path -ChildPath "VSSDK\VisualStudioIntegration\Tools\Bin\VsixPublisher.exe" -Resolve

          & $VsixPublisher publish -payload ".\Gardiner.VsShowMissing.VS2019.vsix" -publishManifest ".\build\extension-manifest.json" -personalAccessToken $env:PersonalAccessToken -ignoreWarnings "VSIXValidatorWarning01,VSIXValidatorWarning02,VSIXValidatorWarning08"

Creating a new release

After enough changes have been made, it’s time to publish a new release!

  1. Browse to the Releases page.
  2. A draft release is shown. Click on the Edit button.
  3. Review (and optionally edit) the release notes.
  4. If you’re happy to proceed, click on Publish release.
  5. The publish workflow is automatically triggered.

The release is now public (no longer in draft) and GitHub has attached additional files to it.

Latest release

Reviewing the Visual Studio Marketplace, you can see that the new vsix has been submitted and is being processed before being made available to the general public.

GitHub Dependabot

Friday, 17 July 2020

Keeping dependencies up to date is useful. Even more so if the dependency has a security fix.

I’ve been using Dependabot for a while now. Initially with the preview integration, but now that Dependabot is part of GitHub (complete with a name change to ‘GitHub Dependabot’) the integration is even better.

All you need to do is add a file under .github/dependabot.yml, and Dependabot integration will be enabled for your repository.

Here’s the dependabot.yml file for Show Missing:

version: 2
updates:
  - package-ecosystem: nuget
    directory: "/"
    schedule:
      interval: daily
      time: '19:30'
    open-pull-requests-limit: 10
    reviewers:
      - flcdrg

It specifies the following:

  - monitor NuGet dependencies in the repository root
  - check for updates daily at 19:30
  - allow up to 10 open pull requests at a time
  - add flcdrg to each pull request

Dependabot will create a pull request to update each outdated dependency. If release notes are available, it will populate the pull request with those details, as well as the commit history between the old version and the new one.

Dependabot-generated pull request

There’s comprehensive documentation for using Dependabot on the GitHub Docs site, including many more configuration options.

I let Dependabot create the pull requests, but I still decide whether to approve each request (or not). You could even hook up a GitHub Action to auto-merge your Dependabot pull requests!

Azure Pipelines

The interesting thing about Dependabot is that the core engine is open source and hosted on GitHub as well. Andrew Craven has created an example of using the Dependabot engine with Azure DevOps. I’m not sure if he’s still updating that repo, but you might find some of the pull requests I’ve submitted there useful.

You don’t get all the @dependabot bot behaviour like you see on GitHub (as that’s built on top of the core). I guess if you were keen you could build that functionality too!

I’ve used his code to generate pull requests on some repositories hosted in Azure DevOps and then used Service Hooks to trigger some code in an Azure Function to update the pull requests to set auto-complete and assign a work item.

GitHub Action caching

Saturday, 11 July 2020

I’m always interested in making builds faster!

If your builds run on self-hosted runners then you can persist files between builds so caching is of limited value (or may even make builds slower). However when using a GitHub-hosted runner (build agent) every build gets a brand new VM. It can take a while for dependencies to be restored (eg. NuGet, NPM or similar), and this has to happen every time a build runs. Being able to cache these dependencies and restore them quickly can potentially make a big difference.

I’ve started adding the Cache task to my Azure Pipelines builds where I can. The equivalent for GitHub Actions is the Cache action.

These both work in a similar way. You indicate a path whose contents you want to cache for future builds, and a key which is used to determine when the cache is stale.

Here’s the cache action that I’m using for my Show Missing extension.

    - uses: actions/cache@v2
      with:
        path: ${{ github.workspace }}/.nuget/packages
        key: ${{ runner.os }}-nuget-${{ hashFiles('**/packages.lock.json') }}
        restore-keys: |
          ${{ runner.os }}-nuget-

The first time you run a build with caching enabled, it won’t appear to run any faster. In fact it might take slightly longer, as just before the build completes, the cache action bundles up all the files underneath the specified path and saves them.

Subsequent builds will then download and restore the dependencies. Because this is done efficiently (one tar.gz file to download and extract, and the cache presumably lives relatively close to the runner VM), it will usually be a lot faster than relying on the normal package restore process.

For NuGet packages, you need a key based on files that change when the cache should be updated. Whilst Visual Studio extensions don’t yet support the new ‘SDK-style’ project format, you can still make use of PackageReference, and if you use nuget.exe 4.9 or above, you can create and use packages.lock.json files. If I hadn’t updated to PackageReference, then the old packages.config would probably work just as well.
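Lock file generation is opt-in. One way to enable it (assuming you’re using PackageReference) is the RestorePackagesWithLockFile MSBuild property in each project file. This is a sketch of the relevant fragment, not my exact csproj:

```xml
<PropertyGroup>
  <!-- Ask NuGet restore to generate and consume packages.lock.json -->
  <RestorePackagesWithLockFile>true</RestorePackagesWithLockFile>
</PropertyGroup>
```

After the next restore, a packages.lock.json file appears next to the project file. Commit it to source control so that `hashFiles` can see it in the build.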

Here’s a build with no caching. It took 1m 54s.

Build without cache

 Committing restore...
Generating MSBuild file D:\a\VsShowMissing\VsShowMissing\VS2019\obj\VS2019.csproj.nuget.g.props.
Generating MSBuild file D:\a\VsShowMissing\VsShowMissing\VS2019\obj\VS2019.csproj.nuget.g.targets.
Writing assets file to disk. Path: D:\a\VsShowMissing\VsShowMissing\VS2019\obj\project.assets.json
Restored D:\a\VsShowMissing\VsShowMissing\VS2019\VS2019.csproj (in 8.74 sec).

NuGet Config files used:
    C:\Program Files (x86)\NuGet\Config\Microsoft.VisualStudio.Offline.config
    C:\Program Files (x86)\NuGet\Config\Xamarin.Offline.config

Feeds used:
    C:\Program Files (x86)\Microsoft SDKs\NuGetPackages\

    45 package(s) to D:\a\VsShowMissing\VsShowMissing\Gardiner.VsShowMissing\Gardiner.VsShowMissing.csproj
    109 package(s) to D:\a\VsShowMissing\VsShowMissing\VS2019\VS2019.csproj

Here’s the first build after adding the cache. It took 2m.

First build with cache

The cache action logs that there is currently nothing to restore:

Run actions/cache@v2
Cache not found for input keys: Windows-nuget2-3881b0e254e4b0c4e40edd9efa8c26dfd9f5c93c42dad979f6c5869b765a72d0, Windows-nuget2-

But notice there’s a second post-build step for the cache. Files to be cached are added to a tar file, which is then saved.

Post Run actions/cache@v2
Cache saved successfully
Post job cleanup.
C:\windows\System32\tar.exe -z -cf cache.tgz -P -C d:/a/VsShowMissing/VsShowMissing --files-from manifest.txt
Cache saved successfully

And now subsequent builds use the cache.

Second build with cache

You can see the cache action does a restore:

Run actions/cache@v2
Cache Size: ~116 MB (122108071 B)
C:\windows\System32\tar.exe -z -xf d:/a/_temp/50db9096-3e6c-4681-8753-3e12e33854f1/cache.tgz -P -C d:/a/VsShowMissing/VsShowMissing
Cache restored from key: Windows-nuget2-3881b0e254e4b0c4e40edd9efa8c26dfd9f5c93c42dad979f6c5869b765a72d0

and the output from the nuget restore is a bit different:

 MSBuild auto-detection: using msbuild version '' from 'C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\MSBuild\Current\bin'.
Restoring packages for D:\a\VsShowMissing\VsShowMissing\VS2019\VS2019.csproj...
Restoring packages for D:\a\VsShowMissing\VsShowMissing\Gardiner.VsShowMissing\Gardiner.VsShowMissing.csproj...
Committing restore...
Committing restore...
Generating MSBuild file D:\a\VsShowMissing\VsShowMissing\VS2019\obj\VS2019.csproj.nuget.g.props.
Generating MSBuild file D:\a\VsShowMissing\VsShowMissing\Gardiner.VsShowMissing\obj\Gardiner.VsShowMissing.csproj.nuget.g.props.
Generating MSBuild file D:\a\VsShowMissing\VsShowMissing\VS2019\obj\VS2019.csproj.nuget.g.targets.
Generating MSBuild file D:\a\VsShowMissing\VsShowMissing\Gardiner.VsShowMissing\obj\Gardiner.VsShowMissing.csproj.nuget.g.targets.
Writing assets file to disk. Path: D:\a\VsShowMissing\VsShowMissing\VS2019\obj\project.assets.json
Writing assets file to disk. Path: D:\a\VsShowMissing\VsShowMissing\Gardiner.VsShowMissing\obj\project.assets.json
Restored D:\a\VsShowMissing\VsShowMissing\VS2019\VS2019.csproj (in 844 ms).
Restored D:\a\VsShowMissing\VsShowMissing\Gardiner.VsShowMissing\Gardiner.VsShowMissing.csproj (in 845 ms).

NuGet Config files used:
    C:\Program Files (x86)\NuGet\Config\Microsoft.VisualStudio.Offline.config
    C:\Program Files (x86)\NuGet\Config\Xamarin.Offline.config

Feeds used:
    C:\Program Files (x86)\Microsoft SDKs\NuGetPackages\

But wait.. that build took 2m 4s! What gives? That’s slower than the first time!

Yeah, that is odd. So a couple of thoughts:

I have a theory that in my case there isn’t a huge number of dependencies, so the time saved downloading them separately isn’t dramatically different from the cache restoring them all. But when you have a lot of dependencies (and NPM packages are likely to be a good example), or the download speed of all those dependencies is limited, then a single large download vs lots of smaller separate downloads should give a definite advantage.