-
7 years being a Microsoft MVP
A few days later than usual, but this morning I got an email to confirm I'm again a recipient of the Microsoft Most Valuable Professional (MVP) award!
Dear David Gardiner,
We’re once again pleased to present you with the 2022-2023 Microsoft Most Valuable Professional (MVP) award in recognition of your exceptional technical community leadership. We appreciate your outstanding contributions in the following technical communities during the past year:
- Developer Technologies
This is my 7th award since 2015.
What's the big deal?
Well, if nothing else, it is nice to be recognised, acknowledged, and thanked for the things you do in the developer community. That's not to say that recognition, acknowledgement, and appreciation haven't been forthcoming in other ways. And it's not to say that the reason I do the things I do is that I want those from the community or Microsoft.
Having said that, if the only feedback I got was negative (not that I actually get much of that), then it would probably cause me to reconsider whether it was worth doing the things I do. So thanks, Microsoft, for your support!
What's in it for me?
There are some tangible benefits that Microsoft give me, including:
- Early access to Microsoft products
- Direct communication channels with product teams
- An invitation to the Global MVP Summit (usually held in Redmond, USA, but online for the last few years)
- An executive recognition letter
- A Visual Studio technical subscription
- An Office 365 subscription
- A certificate and glass trophy
And not forgetting the occasional bit of swag! (the hoodie arrived in the mail earlier this week, which was a nice surprise)
Several third-party software vendors also offer discounts or free licences for their products to MVPs. JetBrains is one I do take advantage of.
Now it is true that some of these things I could purchase myself, or they might be provided by my employer, but having them available for use on my own personal and community projects is a bonus.
The other big benefit for me is the networking connections I make both within Microsoft and the broader MVP community. Leveraging those connections has been a real advantage in organising speakers for the Adelaide .NET User Group.
What's in it for Microsoft?
You might think that with all those nice things, I'd feel obligated to constantly sing Microsoft's praises. Not necessarily. I do re-share public updates that I think are of interest, and I'm sure they appreciate that, but what I think Microsoft benefit from the most is me (and other MVPs) giving them candid, open and honest feedback. Because most MVPs sign a non-disclosure agreement, they can give that feedback confidentially. Let me tell you, MVPs can be pretty passionate and honest at times!
What now?
If you're curious, check out my MVP Profile at https://mvp.microsoft.com/en-us/PublicProfile/5001655.
Regardless of whether I get another MVP award or not, I'll just keep on doing what I'm doing.
-
Passing variables between Azure Pipelines stages, jobs and deployment jobs
Azure Pipelines, the continuous integration and continuous deployment feature of Azure DevOps, has the concept of variables. For scripts and tasks, they behave just like environment variables. There's a bunch that are predefined that you might have used before, like `System.AccessToken`, `System.DefaultWorkingDirectory` and `Build.ArtifactStagingDirectory`.
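For instance, you can echo a couple of these from a script step using macro syntax (a minimal, purely illustrative sketch):
```yaml
steps:
- bash: |
    echo "Default working directory: $(System.DefaultWorkingDirectory)"
    echo "Artifact staging directory: $(Build.ArtifactStagingDirectory)"
  displayName: Show some predefined variables
```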
In YAML pipelines, you can define your own variables in:
- A `variables` block in YAML:
```yaml
variables:
  one: initialValue
```
This variable block can also reference previously defined variable groups. Variable groups are managed in the Library tab under the Pipelines menu of your Azure DevOps project.
```yaml
variables:
- group: "Contoso Variable Group"
- name: anothervariable
  value: 'Hi there'
```
- The UI for the YAML pipeline.
- A script using `task.setvariable`:
```yaml
- bash: |
    echo "##vso[task.setvariable variable=myVar;]foo"
- bash: |
    echo "You can use macro syntax for variables: $(myVar)"
```
It's this last case (using `task.setvariable`) that can be interesting, as by default that variable is only available to subsequent tasks in the same job. To access the variable from other jobs or stages, you need to add `isoutput=true` and give the step a name. e.g.
```yaml
- bash: |
    echo "##vso[task.setvariable variable=myOutputJobVar;isoutput=true]this is the same job too"
  name: setOutput
```
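To consume that output variable from a different job in the same stage, you map it in with a `dependencies` expression. A sketch, assuming the step above lives in a job named `JobA` (both job names here are hypothetical):
```yaml
- job: JobB
  dependsOn: JobA
  variables:
    myVarFromJobA: $[ dependencies.JobA.outputs['setOutput.myOutputJobVar'] ]
  steps:
  - bash: echo "$(myVarFromJobA)"
```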
And here's where it gets interesting. Depending on whether the task that is setting the variable (often referred to as an 'output variable') is in a regular job or a deployment job, and whether you're wanting to reference the variable from another job in the same stage or a job in a subsequent stage, the syntax varies.
If you're not familiar with them, a deployment job is the YAML equivalent of the GUI-based 'Classic' Release pipelines. They are similar to regular pipeline jobs, but there are some differences too.
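For reference, a minimal deployment job looks something like this (the job and environment names are illustrative):
```yaml
jobs:
- deployment: DeployWeb
  environment: Environment1
  strategy:
    runOnce:
      deploy:
        steps:
        - bash: echo "Deploying..."
```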
Most of the existing examples for referencing output variables are listed in the documentation Set variables in scripts, but as deployment jobs have a different syntax, there's another section under Deployment jobs. Unfortunately, between these two pages, not all the possibilities seem to be covered.
So I came up with a pipeline workflow that correctly demonstrates these options:
- Deployment Job to another Job in the same stage
- Deployment Job to another Job in a different stage
- Deployment Job to another Deployment Job in a different stage
- Job to another Job in a different stage
Here's the entire YAML pipeline definition (source on GitHub):
```yaml
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: Stage1
  displayName: Stage 1
  jobs:
  - deployment: Stage1DeploymentJob1
    displayName: Stage 1 Deployment Job 1
    environment: Environment1
    strategy:
      runOnce:
        deploy:
          steps:
          - bash: echo "##vso[task.setvariable variable=my_Stage1DeploymentJob1_OutputVar;isOutput=true]Variable from $(Agent.JobName)"
            name: stepVar_Stage1DeploymentJob1
            displayName: Set my_Stage1DeploymentJob1_OutputVar

- stage: Stage2
  displayName: Stage 2
  dependsOn: Stage1
  jobs:
  - deployment: Stage2DeploymentJob1
    displayName: Stage 2 Deployment Job 1
    environment: Environment2
    strategy:
      runOnce:
        deploy:
          steps:
          - bash: echo "##vso[task.setvariable variable=my_Stage2DeploymentJob1_OutputVar;isOutput=true]Variable from $(Agent.JobName)"
            name: stepVar_Stage2DeploymentJob1
            displayName: Set my_Stage2DeploymentJob1_OutputVar
  - job: Stage2Job2
    displayName: Stage 2 Job 2
    dependsOn: Stage2DeploymentJob1
    variables:
      varFrom_Stage2DeploymentJob1: $[ dependencies.Stage2DeploymentJob1.outputs['Stage2DeploymentJob1.stepVar_Stage2DeploymentJob1.my_Stage2DeploymentJob1_OutputVar'] ]
    steps:
    - checkout: none
    - bash: echo $(varFrom_Stage2DeploymentJob1)
      displayName: Display varFrom_Stage2DeploymentJob1

- stage: Stage3
  displayName: Stage 3
  dependsOn: Stage1
  variables:
    varFrom_Stage1DeploymentJob1: $[ stageDependencies.Stage1.Stage1DeploymentJob1.outputs['Stage1DeploymentJob1.stepVar_Stage1DeploymentJob1.my_Stage1DeploymentJob1_OutputVar'] ]
  jobs:
  - deployment: Stage3DeploymentJob1
    displayName: Stage 3 Deployment Job 1
    environment: Environment3
    strategy:
      runOnce:
        deploy:
          steps:
          - bash: echo $(varFrom_Stage1DeploymentJob1)
            displayName: Display varFrom_Stage1DeploymentJob1
          - bash: printenv
            displayName: printenv
  - job: Stage3Job2
    displayName: Stage 3 Job 2
    steps:
    - checkout: none
    - bash: echo "##vso[task.setvariable variable=my_Stage3Job2_OutputVar;isOutput=true]Variable from $(Agent.JobName)"
      name: stepVar_Stage3Job2
      displayName: Set my_Stage3Job2_OutputVar

- stage: Stage4
  displayName: Stage 4
  dependsOn: # Need to mention stage here to be able to reference variables from it, even though the dependency is implied by Stage2 and Stage3
  - Stage1
  - Stage3
  - Stage2
  jobs:
  - job: Stage4Job1
    displayName: Stage 4 Job 1
    variables:
      varFrom_Stage1DeploymentJob1: $[ stageDependencies.Stage1.Stage1DeploymentJob1.outputs['Stage1DeploymentJob1.stepVar_Stage1DeploymentJob1.my_Stage1DeploymentJob1_OutputVar'] ]
      varFrom_Stage2DeploymentJob1: $[ stageDependencies.Stage2.Stage2DeploymentJob1.outputs['Stage2DeploymentJob1.stepVar_Stage2DeploymentJob1.my_Stage2DeploymentJob1_OutputVar'] ]
      varFrom_Stage3Job2: $[ stageDependencies.Stage3.Stage3Job2.outputs['stepVar_Stage3Job2.my_Stage3Job2_OutputVar'] ]
    steps:
    - checkout: none
    - bash: |
        echo "varFrom_Stage1DeploymentJob1: $(varFrom_Stage1DeploymentJob1)"
        echo "varFrom_Stage2DeploymentJob1: $(varFrom_Stage2DeploymentJob1)"
        echo "varFrom_Stage3Job2: $(varFrom_Stage3Job2)"
      displayName: Display variables
    - bash: printenv
      displayName: printenv
```
The stage dependencies result in the following flow: Stage 1 runs first; Stage 2 and Stage 3 both depend on Stage 1, so they run after it (potentially in parallel); and Stage 4 runs once Stage 2 and Stage 3 have completed.
Here's the output from `Stage4Job1`'s Display variables step:
```
Starting: Display variables
==============================================================================
Task         : Bash
Description  : Run a Bash script on macOS, Linux, or Windows
Version      : 3.201.1
Author       : Microsoft Corporation
Help         : https://docs.microsoft.com/azure/devops/pipelines/tasks/utility/bash
==============================================================================
Generating script.
========================== Starting Command Output ===========================
/usr/bin/bash /home/vsts/work/_temp/8ce87e72-545d-4804-8461-3910a1412c90.sh
varFrom_Stage1DeploymentJob1: Variable from Stage 1 Deployment Job 1
varFrom_Stage2DeploymentJob1: Variable from Stage 2 Deployment Job 1
varFrom_Stage3Job2: Variable from Stage 3 Job 2
Finishing: Display variables
```
Some things to note:
- The format for referencing a deployment job from a different stage is `$[ stageDependencies.<StageName>.<DeploymentJobName>.outputs['<DeploymentJobName>.<StepName>.<VariableName>'] ]`
- I've seen some online posts suggesting the environment name be used instead of the second `<DeploymentJobName>`, but I think that is incorrect; this format works for me.
- As the comment in the YAML mentions, if you want to reference a variable from a job or stage, then you have to add that stage to your `dependsOn` list. The fact that Stage 1 is 'upstream' from Stage 4 (because Stage 4 explicitly depends on Stage 2 and Stage 3) isn't enough.
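For contrast, here are the two cross-stage formats side by side (placeholders in angle brackets; note the deployment job name appears twice, whereas a regular job name appears only once):
```
# Regular job in another stage:
$[ stageDependencies.<StageName>.<JobName>.outputs['<StepName>.<VariableName>'] ]

# Deployment job in another stage:
$[ stageDependencies.<StageName>.<DeploymentJobName>.outputs['<DeploymentJobName>.<StepName>.<VariableName>'] ]
```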
Summary
In this post, we saw how to reference variables from previous Jobs or Deployment Jobs in either the same or different stages of an Azure Pipeline.
-
UniFi USW Flex Mini 5-port switch
My work-from-home desk has a great outlook, but unfortunately it's on the opposite side of the room from the network wall sockets. For this reason, I run a single cable around the room perimeter (mostly hidden behind sofas).
In my work with SixPivot, clients sometimes provide a laptop for me to use. In those cases I'd been having to use wireless, but I'll always prefer wired over wireless if possible. I could run a second cable, but that would get messy, so the tidier option was to install a small switch on my desk to allow adding extra devices there. 5-port switches are pretty cheap, but you often pay a premium if you want a 'managed' switch. My colleague Shaw pointed out that Ubiquiti has a managed 5-port switch for around AU$50, and even better, it can be powered by Power over Ethernet (PoE).
The UniFi USW Flex Mini 5-port switch is (no surprise) a Gigabit switch with 5 ports. It can be powered either via a USB-C adapter, or 802.3af PoE (via port 1).
You can buy it from resellers on Amazon AU or Amazon US, but I had to hunt around to find a reseller with stock in Australia. I ended up going with UBWH Australia. The web page specifically calls out that the product does not ship with a power supply (which was fine as I was intending to use PoE).
It took a few days to arrive (I was starting to wonder if it was walking across the Nullarbor). Despite the warning on the product page, I was surprised to find that it did indeed include a power supply, though curiously it has a US-style plug (so not that useful here).
I made sure that the network cable coming to the UniFi switch had PoE enabled and then plugged it in. The device status LEDs illuminated, giving me confidence that it was working. I added extra cables to connect the two laptops on my desk, and moments later they were both online.
It's a managed switch, and as I already run the UniFi Network Application (formerly UniFi Controller) software in a Docker container on my Synology server, I was able to add the device to the management application.
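If you're curious how that's hosted, a docker-compose sketch along these lines is all it takes (the image, ports and volume path here are assumptions based on a popular community image, not necessarily my exact setup):
```yaml
services:
  unifi:
    image: jacobalberty/unifi:latest   # community-maintained UniFi Network Application image (assumed)
    ports:
      - "8443:8443"       # web UI
      - "8080:8080"       # device inform/adoption
      - "3478:3478/udp"   # STUN
    volumes:
      - ./unifi:/unifi    # persistent config/data (assumed path)
    restart: unless-stopped
```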
I followed these steps to manage the switch:
- Open UniFi Controller
- Navigate to the 'UniFi Devices' tab
- Click on Click to Adopt
- In the details tab for the device, click on Adopt Device
- Wait for the device status to change to Online
- You may see an optional firmware update available. You can apply it by clicking on Click to Update in the device list, or on Apply Update in the device details panel.
Here's the Overview panel for the device after it has been in operation for a few days:
So far, so good!