• Using GitHub Actions to update packages.lock.json for Dependabot PRs

I like using Dependabot to keep my package dependencies up to date. But it does have one problem if you’re using packages.lock.json files with NuGet packages - it doesn’t update them. So your csproj will be modified but the packages.lock.json file won’t, which can lead to broken builds.

    Build failure

    Here’s one approach to working around this. Hopefully GitHub will fix this properly in the future.


    I’m going to make use of Peter Evans’ Slash Command Dispatch GitHub Action to enable triggering an update by entering /lockfiles as a comment on the pull request. This action is extensible and can be used to create all kinds of ‘slash’ commands.

    First up, I created a new workflow that uses this action:

    name: Slash Command Dispatch

    on:
      issue_comment:
        types: [created]

    jobs:
      slashCommandDispatch:
        runs-on: ubuntu-latest
        steps:
          - uses: xt0rted/pull-request-comment-branch@v1
            id: comment-branch
          - name: Slash Command Dispatch
            uses: peter-evans/slash-command-dispatch@v2
            id: slash-command
            with:
              token: ${{ secrets.PAT_REPO_FULL }}
              commands: |
                lockfiles
              permission: write
              issue-type: pull-request
              dispatch-type: workflow
              static-args: ref=${{ steps.comment-branch.outputs.head_ref }}

    Things to note:

    • We’re triggering on a new comment being added to a pull request
    • We use Pull Request Comment Branch Action to obtain the name of the branch that is linked to the pull request for the triggering comment.
    • The dispatch-type is set to workflow as we want the secondary workflow to run against the pull request branch (not the default branch)
    • We set the ref argument to the branch name. This will be picked up by the second workflow.
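
    A related detail worth knowing: with dispatch-type: workflow, the ref argument is handled specially (it controls which branch the target workflow is dispatched on), while any other named arguments are passed through as workflow_dispatch inputs that the receiving workflow must declare. As a sketch, if you added a hypothetical extra argument such as reason=manual to static-args, the second workflow would need something like:

    ```yaml
    on:
      workflow_dispatch:
        inputs:
          reason:   # hypothetical extra argument; 'ref' itself needs no input declaration
            description: 'Why the lockfile update was requested'
            required: false
    ```

    Since this example only passes ref, the second workflow below can use a bare workflow_dispatch trigger.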

    The second workflow is named lockfiles-command.yml. It needs to follow the convention of commandname-command.yml.

    name: Update lockfiles

    on:
      workflow_dispatch:

    jobs:
      updateLockfiles:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v2
            with:
              fetch-depth: 0
              token: ${{ secrets.PAT_REPO_FULL }}
          - name: Setup .NET 5
            uses: actions/setup-dotnet@v1
            with:
              dotnet-version: 5.0.x
          - name: Restore dependencies
            run: dotnet restore --force-evaluate
          - uses: stefanzweifel/git-auto-commit-action@v4
            with:
              commit_message: Update lockfiles

    Things to note:

    • This workflow uses the workflow_dispatch trigger.
    • The checkout action notices that the ref value was set in the first workflow and so will check out the pull request branch.
    • We use the git-auto-commit Action to commit and push any changes made by the earlier dotnet restore command.
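
    One refinement worth considering: by default the git-auto-commit action commits every file the restore modified. If you want to be certain only lock files end up in the commit, the action supports a file_pattern input (a sketch; the glob here is an assumption about your repository layout):

    ```yaml
    - uses: stefanzweifel/git-auto-commit-action@v4
      with:
        commit_message: Update lockfiles
        file_pattern: '**/packages.lock.json'  # assumes lock files sit alongside each csproj
    ```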

    To trigger the workflow, add a new comment to a pull request with /lockfiles. eg.

    GitHub pull request comment

    You can see a complete repo with example pull request over at https://github.com/flcdrg/dependabot-lockfiles/pull/1

    Future ideas

    It could be possible to have this workflow trigger automatically after Dependabot creates the pull request if you wanted to completely automate this approach, rather than needing to add the comment manually.
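
    A minimal sketch of that idea, assuming you’re happy for the restore to run on every Dependabot pull request (the actor check stops it running for anyone else). Note that GitHub restricts which secrets are available to workflows triggered by Dependabot, so the PAT may need to be configured as a Dependabot secret for this to work:

    ```yaml
    on:
      pull_request:

    jobs:
      lockfiles:
        if: github.actor == 'dependabot[bot]'   # only run for Dependabot PRs
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v2
            with:
              ref: ${{ github.event.pull_request.head.ref }}
              token: ${{ secrets.PAT_REPO_FULL }}
          - uses: actions/setup-dotnet@v1
            with:
              dotnet-version: 5.0.x
          - run: dotnet restore --force-evaluate
          - uses: stefanzweifel/git-auto-commit-action@v4
            with:
              commit_message: Update lockfiles
    ```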

  • Synology Active Backup for G Suite

    This is part 3 in a series of posts reviewing the DiskStation DS1621xs+.

    DiskStation DS1621xs+

    In part 1, I unboxed the Synology and configured storage. In part 2, I set up Cloud Sync. Now I’m going to use the Active Backup for G Suite tool to do a full backup of a G Suite domain.

    I have a domain that’s hosted by Google’s G Suite. Synology offer an on-premise backup solution for G Suite - Active Backup for G Suite.

    The first time you open Active Backup for G Suite, you’ll be prompted to activate the package by entering your Synology account.

    Active Backup for G Suite Activation prompt

    Enter your Synology account email address and password.

    Active Backup for G Suite Activation account details

    And you’re good to go.

    Active Backup for G Suite Activation complete

    We’re now ready to start creating a backup task. First up, enter your domain, email address and service key. I wasn’t sure what a ‘Service Key’ was, but you’ll notice there’s a hyperlink right there in the description that takes you to a step-by-step tutorial on configuring G Suite and creating the service key.

    I followed the tutorial through and created the service key JSON file that I then selected for this step.

    Authorize Active Backup for G Suite

    Next enter a task name and confirm the destination folder. Unlike previous configurations, the only choice is to pick the top-level directory. The task will create a subdirectory under this location for you.

    Configure task settings

    You can then select services to be automatically added should new G Suite accounts be created in the future.

    Enable auto-discovery services

    You can choose whether to run the backup continuously, or on a regular schedule. You can also set whether to keep all versions of files, or just a specific number of days.

    Set up backup and retention policy

    Finally, there’s a summary of all the settings to review before you create the task.

    View task summary

    And then your task is created! Pretty easy really.

    If you configured it to run on a schedule, you can choose to start your first backup immediately (otherwise I assume it will start at the specified time).

    You can monitor the backup progress from the Task List page.

    Here’s an example of Active Backup for G Suite running on a nightly schedule. The first couple of backups were quite large, but the most recent one was quite small, which makes sense if it’s just the changes since the previous backup. Looks like I possibly had a network outage the previous night which interrupted that backup, but everything is green now.

    Active Backup for G Suite - Overview screen

    Now backups are great, but they’re essentially useless unless you can restore from them.

    You can explore the backups using the Active Backup for G Suite Portal.

    Active Backup for G Suite Portal

    It is possible to allow multiple users to have access to the portal, but in my case I’m just using the admin account. You can select which Google account’s services you’re inspecting and (as per the screenshot) you can switch between the different services (eg. Drive, Mail, Contacts and Calendar).

    The timeline across the bottom allows you to choose the point in time to restore from.

    To restore an item, you simply select it, and then choose the ‘Restore’ button. There’s also the option to restore the entire service (eg. mailbox).

    You get a chance to confirm the restore action.

    Active Backup for G Suite Portal - Restore email

    And then watch the restore progress.

    Active Backup for G Suite Portal - Restore progress.

    Checking the inbox of the relevant account, I could then see the email ‘magically’ restored!

    Handily it also adds a label on the email so you can easily identify it was restored.

    GMail showing restored email.

    In summary, I think this is a really useful feature that I plan to keep using. I can also imagine it would be valuable for organisations that need to retain data to comply with legal requirements.

  • Synology Cloud Sync

    DiskStation DS1621xs+

    Now that the Synology is up and running with disks configured we can start installing some extra features. First on my list to take a look at is Cloud Sync.

    I think this could be useful for a small business that wants to have a second backup of content that employees might have in OneDrive or DropBox, or as a way of aggregating that content locally. The aggregation idea could also make sense if for some reason you had a mix of services in use too. The list of supported services is impressive (and to be honest I haven’t even heard of some of these):

    • Alibaba Cloud OS
    • Azure storage
    • Backblaze B2
    • Baidu Cloud
    • Box
    • Dropbox
    • Google Cloud Storage
    • Google Drive
    • Google Shared Drive
    • hicloud S3
    • HiDrive
    • JD Cloud OSS
    • MegaDisk
    • Microsoft OneDrive
    • Microsoft OneDrive for Business
    • Microsoft SharePoint
    • OpenStack Swift
    • Rackspace
    • S3 storage
    • SFR NAS Backup
    • Tencent Cloud COS
    • WebDAV
    • Yandex Disk

    Synology Cloud Sync - Overview

    I’ve already added a OneDrive account. You can see it shows a nice summary that everything is up to date and even shows how your usage for that specific account is going. In this case my OneDrive has 1 TB capacity and I’m currently using 22.66 GB.

    Adding a service is as easy as clicking on the “+” button from the Cloud Sync app.

    Then I select the specific provider I want to add. I’m going to add a Dropbox account.

    Synology Cloud Sync - Overview

    A new browser window opens which allows me to enter my credentials and authorise my Synology device to connect to my Dropbox account.

    Synology Cloud Sync - Overview

    I’m now given an opportunity to specify where on the Synology I want to store the synchronised files. I clicked on the folder icon in the “Local Path” field.

    Synology Cloud Sync - Overview

    I can select an existing folder or create a new subfolder. I tried to create a new folder:

    Synology Cloud Sync - Overview

    But I got this error message. I’m not sure why this is the case as my account is a full administrator on the Synology server.

    Synology Cloud Sync - Overview

    I was able to work around the problem by going to the File Station app and successfully creating the folder there. Once it was created then I could select it.

    I can also specify what kind of synchronisation to perform. The default is bidirectional, but you might also want to pick another option like ‘Download remote changes only’, which would make sense if you’re planning to use the Synology as a secondary backup for your cloud files.

    Synology Cloud Sync - Overview

    Finally a chance to review all the settings before you hit Apply to finish setting up the new sync task.

    Synology Cloud Sync - Overview

    This is a really simple, straightforward process that I was able to repeat for multiple different cloud providers.