• Docker run from an Azure Pipeline Container Job with a volume mount

    This caught me out today. I was trying to run a Docker container directly from a script task, where the pipeline job was already running in a container (as a Container Job), similar to this:

      - job: MyJob
        container:
          image: my-container-job-image:latest
    
        steps:
          - script: |
              docker run --mount type=bind,source="$(pwd)",target=/home/src --rm -w /home/src my-container:latest
    

    The bit that was failing was the --mount, with the following error message:

    docker: Error response from daemon: invalid mount config for type "bind": bind source path does not exist: /__w/3/s.
    

    Eventually, I realised the problem: by default, when a job is running as a Container Job, all the tasks are also running in the context of that container. So $(pwd) was resolving to /__w/3/s. That happens to be the default working directory, and also where your source code is mapped to (via a volume mount that you can see by viewing the output of the “Initialize containers” step).

    But when you invoke docker run, Docker doesn’t try to run the new container inside the existing Container Job container; rather, it will run alongside it! So any paths you pass to Docker need to be relative to the host machine, not relative to the inside of the Container Job container.

    In my case, the solution was to add a target: host property to the script task, so that the entire script is now run in the context of the host, rather than the container. eg.

          - script: |
              docker run --mount type=bind,source="$(pwd)",target=/home/src --rm -w /home/src my-container:latest
            target: host
    

    Now when the pipeline runs, $(pwd) will resolve to /agent/_work/3/s (which is the actual directory on the host machine), and the mount will work correctly!

  • Pasting Markdown into Confluence

    This is one of those blog posts which is mainly for my benefit 😀.

    To paste Markdown text into Atlassian Confluence wiki:

    1. Use Ctrl-Shift-V (to paste without formatting)
    2. Click on the Paste Options menu in Confluence and select Use Markdown (it isn’t the default)

    Confluence's Paste Options menu

    If you prepared your Markdown in Visual Studio Code (as I often do), then pasting without formatting is important, as by default VS Code copies text with a ‘plain text/code’ formatting flavour, which means you aren’t offered the paste options menu.

  • Automatic updating of Chocolatey packages with .NET

    I maintain quite a few Chocolatey packages. The source for these packages lives in https://github.com/flcdrg/au-packages/, and until recently I used the AU PowerShell module to detect and publish updated versions of the packages.

    Chocolatey logo

    The first issue was that unfortunately, the original maintainer of the AU module archived the project on GitHub. The Chocolatey Community stepped in and is now maintaining a fork, chocolatey-au.

    The second issue was a compatibility problem with newer versions of PowerShell 7. AU was originally written for Windows PowerShell 5, but I have made extensive use of PowerShell 6 and 7 features in my update scripts. That didn’t seem to cause problems until the GitHub Actions agents were updated from PowerShell 7.2 to 7.4 in January.

    The specific problem would reveal itself like this:

    Chocolatey had an error occur: System.ArgumentException: File specified is either not found or not a .nupkg file. 'D:\a\au-packages\au-packages\microsoft-teams.install\microsoft-teams.install.1.7.0.3653.nupkg '
    

    For some reason, the AU module was able to generate a new version of a package, but when it called the Chocolatey CLI (choco.exe) and passed the path to the nupkg file, it appeared that there was a trailing space in the filename!

    I spent hours trying to debug this to no avail. This was not made any easier by the fact that AU uses PowerShell Jobs to spin up separate processes for each package so they can be processed in parallel. I could not get debugging to work inside a Job when using the Visual Studio Code PowerShell debugger. Even the old-style debugging approach of Write-Host "I got here" didn’t work very well, as all output of the job is captured and isn’t easy to extract (let alone being able to inspect the original variables as proper objects, rather than serialised strings).

    Eventually, I decided I was wasting my time trying to solve this, and maybe if I rewrote the updating logic myself I could mitigate the issue.

    There are essentially two parts to the AU module: the bits that support updating an individual package, and the bits that run over all your packages. It’s that second part that makes use of PowerShell Jobs, and which I suspected was the source of the problem.

    I figured rewriting that part in C#/.NET would mean I had a much nicer debugging experience (should I need it). I wanted to leave the individual package updating alone - it would be a significant effort to migrate all the custom update.ps1 scripts to something else.

    au-dotnet

    And so au-dotnet was born.

    It is a reasonably simple .NET 8 console application that iterates over all the packages in my au-packages repository, and then calls the PowerShell update.ps1 script in each to see if there is a new version to generate and publish.

    Rather than just call out to the operating system to run each update.ps1 script, I decided to embed PowerShell in the application. This gives me a bit more control over how the scripts are run and the ability to capture any script output (and errors) from each run.
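    Conceptually, the outer loop is very simple. Here is a sketch in shell — the directory names are made up, and plain `sh` scripts stand in for the update.ps1 scripts (which au-dotnet actually runs via hosted PowerShell) so that the sketch runs anywhere:

```shell
# Standalone sketch of the outer loop over package directories.
set -e
root=$(mktemp -d)
mkdir -p "$root/pkg-a" "$root/pkg-b"
printf 'echo "updating $(basename "$PWD")"\n' > "$root/pkg-a/update.sh"
printf 'echo "updating $(basename "$PWD")"\n' > "$root/pkg-b/update.sh"

cd "$root"
for dir in */; do
    # In au-dotnet, this step is the hosted-PowerShell invocation of update.ps1
    if [ -f "$dir/update.sh" ]; then
        (cd "$dir" && sh ./update.sh)
    fi
done
```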

    Hosting PowerShell

    Figuring out how to host PowerShell in a .NET 8 application took a little bit of research. Many of the articles you find (and even some of the official documentation) are still aimed at Windows PowerShell.

    The key was to reference these three NuGet packages (and use the same version of each package):

    • Microsoft.PowerShell.Commands.Diagnostics
    • Microsoft.PowerShell.SDK
    • System.Management.Automation
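    In the project file, that looks something like this (the version number below is just a placeholder — the important part is that all three packages share the same version):

```xml
<!-- Sketch of the relevant ItemGroup in the .csproj; version is a placeholder -->
<ItemGroup>
  <PackageReference Include="Microsoft.PowerShell.Commands.Diagnostics" Version="7.4.0" />
  <PackageReference Include="Microsoft.PowerShell.SDK" Version="7.4.0" />
  <PackageReference Include="System.Management.Automation" Version="7.4.0" />
</ItemGroup>
```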

    You can then create a PowerShell class instance like this:

    using System.Management.Automation;
    using System.Management.Automation.Runspaces;
    
    var iss = InitialSessionState.CreateDefault2();
    iss.ExecutionPolicy = Microsoft.PowerShell.ExecutionPolicy.RemoteSigned;
    
    var ps = PowerShell.Create(iss);
    

    You can capture any output via the Streams property. eg. Here I am logging any errors from PowerShell as a GitHub Action error:

    ps.Streams.Error.DataAdded += (_, args) =>
    {
        core.WriteError(ps.Streams.Error[args.Index].ToString());
    };
    

    Running specific PowerShell cmdlets can be done via the AddCommand method. eg.

    ps.AddCommand("Set-Location").AddParameter("Path", directory).Invoke();
    

    Whereas running arbitrary PowerShell scripts is done via the AddScript method. eg.

    ps.AddScript("$ErrorView = 'DetailedView'").Invoke();
    

    If the script is in a separate .ps1 file, the only way I’ve found so far is to load that file into a string and pass it in. It would be nicer if you could point it at the file (so debugging/errors could include line numbers) but I have yet to find a way to do that.

    var output = ps.AddScript(File.ReadAllText(Path.Combine(directory, "update.ps1")))
        .Invoke();
    

    One thing to remember is you must call the Invoke method to actually run the scripts or commands you’ve just added.

    GitHub Action logging and summary

    Because I know the application will be run in a GitHub Actions workflow, I made use of the excellent GitHub.Actions.Core NuGet package for formatting output, as well as generating a nice build summary that lists all packages that were updated in the current run.

    Screenshot of GitHub Actions build summary, showing 17 packages updated and a table with the package names and versions

    Commit and publish

    If a new package is created (eg. a .nupkg file now exists) then we assume this file can be submitted to the Chocolatey Community Repository. choco push is then called to upload the package. Remember this was where we hit that error with the trailing space? Pleasingly the .NET version doesn’t exhibit this behaviour, so that problem is solved.

    Assuming the package is submitted successfully, we then call git to stage any modified files from this package and add a tag indicating the package that was updated.

    After all packages have been processed, we will commit all staged files and push the commit back to the repo, so that we get a version history of all the package changes.
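    As a rough sketch, the git flow looks like this — the package name and version are hypothetical, a throwaway repository stands in for au-packages so the sketch runs anywhere, the push back to the remote is omitted, and the exact ordering of commit vs. tag in the real tool may differ:

```shell
# Standalone sketch of the stage/commit/tag flow.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "demo@example.com"
git config user.name "demo"
git commit -q --allow-empty -m "initial"

# Hypothetical package that choco push just accepted.
pkg="example.package"
version="1.2.3"
mkdir -p "$pkg"
echo "updated" > "$pkg/$pkg.nuspec"

# Per package: stage the modified files.
git add "$pkg"

# After all packages are processed: one commit covering every staged
# change, plus a tag recording which package was updated.
git commit -q -m "Update $pkg to $version"
git tag "$pkg-$version"
```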

    Enhancements

    I am currently using my fork of the chocolatey-au module, which has one minor enhancement. It adds a Files collection property to the AUPackage PowerShell class. This collection is populated with the paths of all the files that were downloaded (and had their checksums calculated).

    I make use of this for some of my packages to pre-emptively upload the files to VirusTotal. This can help fast-track the packages being approved by Chocolatey as it means the Chocolatey virus scanning step is already completed. Because I use the VirusTotal CLI tool for this, it also means I can upload files up to 650MB (compared to Chocolatey’s current 200MB limit due to using an older API).

    I have submitted the Files property enhancement to chocolatey-au.

    Summary

    You can see this in action in the latest workflow runs at https://github.com/flcdrg/au-packages/actions.
