-
Azure DevOps PowerShell Scripts - List all Git repositories
If you want to list all the Git repositories for all projects in an Azure DevOps organisation, this script will return all the remote URLs.
See Personal access tokens for instructions on how to create the personal access token.
```powershell
param (
    [string] $organisation,
    [string] $personalAccessToken
)

# Azure DevOps REST APIs accept the PAT as the password in a Basic auth header
$base64AuthInfo = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($personalAccessToken)"))
$headers = @{ Authorization = ("Basic {0}" -f $base64AuthInfo) }

# Get all projects in the organisation
$result = Invoke-RestMethod -Uri "https://dev.azure.com/$organisation/_apis/projects?api-version=6.0" -Method Get -Headers $headers
$projectNames = $result.value.name

# For each project, list its Git repositories and return their remote URLs
$projectNames | ForEach-Object {
    $project = $_
    $result = Invoke-RestMethod -Uri "https://dev.azure.com/$organisation/$project/_apis/git/repositories?api-version=6.0" -Method Get -Headers $headers
    $result.value.remoteUrl
} | Sort-Object
```
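A minimal usage sketch, assuming you've saved the script as `Get-GitRepoUrls.ps1` (the file name and organisation name here are just placeholders):

```powershell
# Lists the remote URL of every Git repository across all projects in the organisation
.\Get-GitRepoUrls.ps1 -organisation "my-org" -personalAccessToken "<your PAT>"
```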
It makes use of the Repositories - List REST API, so you could ask for any of the other properties instead of, or in addition to, `remoteUrl` as well.
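For example, a small variation that projects a few of the other properties for a single project might look like this (a sketch only; it assumes the same `$organisation` and `$headers` variables as the script above, a `$project` variable holding a project name, and that the property names match the Repositories - List response):

```powershell
# Return more than just the remote URL for each repository in a project
$repos = Invoke-RestMethod -Uri "https://dev.azure.com/$organisation/$project/_apis/git/repositories?api-version=6.0" -Method Get -Headers $headers
$repos.value | Select-Object name, defaultBranch, remoteUrl, webUrl | Sort-Object name
```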
-
Using GitHub Actions to update packages.lock.json for Dependabot PRs
I like using Dependabot to keep my package dependencies up to date. But it does have one problem if you're using `packages.lock.json` files with NuGet packages - it doesn't update them. So your csproj will be modified but the `packages.lock.json` file won't, which can lead to broken builds.

Here's one approach to working around this. Hopefully GitHub will fix this properly in the future.
ChatOps
I'm going to make use of Peter Evans' Slash Command Dispatch GitHub Action to enable triggering by entering `/lockfiles` as a comment on the pull request. This action is extensible and can be used to create all kinds of 'slash' commands.

First up, I created a new workflow that uses this action:
```yaml
name: Slash Command Dispatch

on:
  issue_comment:
    types: [created]

jobs:
  slashCommandDispatch:
    runs-on: ubuntu-latest
    steps:
      - uses: xt0rted/pull-request-comment-branch@v1
        id: comment-branch

      - name: Slash Command Dispatch
        uses: peter-evans/slash-command-dispatch@v2
        id: slash-command
        with:
          token: ${{ secrets.PAT_REPO_FULL }}
          commands: |
            lockfiles
          permission: write
          issue-type: pull-request
          dispatch-type: workflow
          static-args: ref=${{ steps.comment-branch.outputs.head_ref }}
```
Things to note:
- We're triggering on a new comment being added to a pull request
- We use the Pull Request Comment Branch Action to obtain the name of the branch that is linked to the pull request for the triggering comment.
- The `dispatch-type` is set to `workflow`, as we want the secondary workflow to run against the pull request branch (not the default branch)
- We set the `ref` argument to the branch name. This will be picked up by the second workflow.
The second workflow is named `lockfiles-command.yml`. It needs to follow the convention of `commandname-command.yml`.

```yaml
name: Update lockfiles

on:
  workflow_dispatch:

jobs:
  lockfiles:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0
          token: ${{ secrets.PAT_REPO_FULL }}

      - name: Setup .NET 5
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: 5.0.x

      - name: Restore dependencies
        run: dotnet restore --force-evaluate

      - uses: stefanzweifel/git-auto-commit-action@v4
        with:
          commit_message: Update lockfiles
```
Things to note:
- This workflow uses the `workflow_dispatch` trigger.
- The checkout action notices that the ref value was set in the first workflow and so will check out the pull request branch.
- We use the git-auto-commit Action to commit and push any changes made by the earlier `dotnet restore` command (the same command can also be run locally, as shown below).
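If you'd rather fix up a particular branch by hand instead of via the workflow, the same command works from a local checkout. A minimal sketch, assuming the .NET SDK is installed and the Dependabot branch is already checked out:

```powershell
# Force NuGet to re-evaluate the dependency graph and rewrite packages.lock.json
dotnet restore --force-evaluate

# The lock files are already tracked, so -a stages the modified files
git commit -am "Update lockfiles"
git push
```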
To trigger the workflow, add a new comment to a pull request containing `/lockfiles`.

You can see a complete repo with an example pull request over at https://github.com/flcdrg/dependabot-lockfiles/pull/1
Future ideas
It could be possible to have this workflow trigger automatically after Dependabot creates the pull request if you wanted to completely automate this approach, rather than needing to add the comment manually.
-
Synology Active Backup for G Suite
This is part 3 in a series of posts reviewing the DiskStation DS1621xs+.
In part 1, I unboxed the Synology and configured storage. In part 2, I set up Cloud Sync. Now I'm going to use the Active Backup for G Suite tool to do a full backup of a G Suite domain.
I have a domain that's hosted by Google's G Suite. Synology offer an on-premise backup solution for G Suite - Active Backup for G Suite.
The first time you open Active Backup for G Suite, you'll be prompted to activate the package by signing in with your Synology account.
Enter your Synology account email address and password.
And you're good to go.
We're now ready to start creating a backup task. First up, enter your domain, email address and service key. I wasn't sure what a 'Service Key' was, but you'll notice there's a hyperlink right there in the description that takes you to a step-by-step tutorial on configuring G Suite and creating the service key.
I followed the tutorial through and created the service key JSON file that I then selected for this step.
Next enter a task name and confirm the destination folder. Unlike previous configurations, the only choice is to pick the top-level directory. The task will create a subdirectory under this location for you.
You can then select services to be automatically added should new G Suite accounts be created in the future.
You can choose whether to run the backup continuously or on a regular schedule. You can also set whether to keep all versions of files, or only the versions from a specific number of days.
Finally, you're shown a summary of all the settings before you create the task.
And then your task is created! Pretty easy really.
If you configured it to run on a schedule, you can choose to start your first backup immediately (otherwise I assume it will start at the scheduled time).
You can monitor the backup progress from the Task List page.
Here's an example of Active Backup for G Suite running on a nightly schedule. The first couple of backups were quite large, but the most recent one was quite small, which makes sense if it's just the changes since the previous backup. It looks like I possibly had a network outage the previous night which interrupted that backup, but everything is green now.
Now backups are great, but they're essentially useless unless you can restore from them.
You can explore the backups using the Active Backup for G Suite Portal.
It is possible to allow multiple users to have access to the portal, but in my case I'm just using the admin account. You can select which Google account's services you're inspecting and (as per the screenshot) you can switch between the different services (eg. Drive, Mail, Contacts and Calendar).
The timeline across the bottom allows you to choose the point in time to restore from.
To restore an item, you simply select it, and then choose the 'Restore' button. There's also the option to restore the entire service (eg. mailbox).
You get a chance to confirm the restore action.
And then watch the restore progress.
Checking the inbox of the relevant account, I could then see the email 'magically' restored!
Handily, it also adds a label to the email so you can easily identify that it was restored.
In summary, I think this is a really useful feature that I plan to keep using. I can also imagine it would be valuable for organisations that need to retain data to comply with legal requirements.