-
The biggest problem with CI/CD pipelines
Do you know the most common problem I encounter when creating or updating build or deployment pipelines?
Finding the correct path to specific files!
I think that's why I find myself littering my pipelines with extra steps like this:
```yaml
- script: ls -alR
  displayName: "Script: List files"
  workingDirectory: $(Build.ArtifactStagingDirectory)
```
I think the real problem is that the tools you build software with, and the tasks you use in your pipelines, are so inconsistent in how they generate and reference files and paths.
For example:
- Does the file generated by a tool get created in a single directory, or does it create a child directory for the file to live in?
- If you use wildcards with the tool, does that change how it generates the output file(s)?
- Does the tool default to looking for files in the current working directory or somewhere else?
- If you use relative paths for tool parameters, are they relative to the current working directory, or to another specific parameter?
- If you're creating a zip file by pointing to a directory, does it include that directory in the zip, or just the child files and directories? (See the sketch after this list.)
- Does the tool or task support wildcards? And are they 'normal' wildcards (aka globbing), or are they actually regular expressions?
- If the tool supports wildcards, can you specify multiple patterns, or only one?
I'm sure there are other variations. You get the idea at least.
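To make the zip question concrete, here's a minimal sketch as two Azure Pipelines script steps (the `app` directory and the archive names are made up for illustration):

```yaml
steps:
  # Archives paths *including* the parent directory:
  #   app/index.html, app/main.js, ...
  - script: zip -r with-parent.zip app
    displayName: "Zip including the directory itself"

  # Changing into the directory first archives only its contents:
  #   index.html, main.js, ...
  - script: cd app && zip -r ../contents-only.zip .
    displayName: "Zip just the children"
```

Two invocations that look almost identical, and two different archive layouts. Exactly the sort of thing that only surfaces when a deployment fails further down the pipeline.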
And once you've finished banging your head against the wall and got all your path 'ducks' in a row, do you leave all those `ls -alR` tasks in your pipelines just in case you might need to refer to them in the future, or do you remove them to get rid of the extra noise (and make things a tiny bit faster)?

Using a tool like Cake as an abstraction over all your tools may help to a certain extent, as it provides a more consistent interface to the tools you use. But even then I've found myself having to add extra code to my .cake files to list the files it can see, to troubleshoot when things are not working as I think they should.
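One possible middle ground for those diagnostic steps: leave them in, but gate them behind the built-in `System.Debug` variable so they only run when you queue the pipeline with diagnostics enabled. A minimal sketch:

```yaml
# Runs only when the pipeline is queued with "Enable system diagnostics"
# checked, which sets System.Debug to 'true'.
- script: ls -alR
  displayName: "Script: List files"
  workingDirectory: $(Build.ArtifactStagingDirectory)
  condition: eq(variables['System.Debug'], 'true')
```

That keeps the noise out of everyday runs, while leaving the troubleshooting output just one checkbox away.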
It's such a trivial thing, but it continues to trip me up, and I suspect I'm not alone 😀
-
2024 in review
It's the last day of 2024, and tomorrow (no surprise) is the first day of 2025. I remember thinking when I was pretty young: wouldn't it be fun when the year 2000 came around, and would anyone else notice? Well yes, I think they kind of did! But now we're 25 years further along. It's certainly true that time seems to move so slowly when you're young, and much quicker as you get older.
This year has been quite an interesting mix.
Work things
Working remotely from home is such a wonderful thing and I don't take it for granted. Even more so when I read about other companies now reversing their flexible work arrangements. Unfortunately I think that's largely driven by old-fashioned senior managers who either think you're only productive if you're all colocated in an office, or who are worried about all the lease/asset costs of empty office buildings.
Yes, there are specific times when it is valuable to all be together in the same room, but for the rest of the time if you're stuck in an open office which is so noisy that most people have to wear noise cancelling headphones to be able to get any work done, well that makes no sense to me. Let alone the time it takes to commute to the office.
I really value the trust that SixPivot places in its staff, and the way we help each other out. Having that "reservoir of expertise on tap" just a Slack message away is so helpful, especially on that unusual day when everything seems to be going crazy at once. Having the support of your peers, knowing they're there for you, really matters.
Working remotely makes it all the sweeter when I do get to catch up with my SixPivot colleagues in person, as I did at our annual Summit in July.
I've been focusing a lot on "DevOps" kinds of things lately. I'd honestly prefer not to use "DevOps" in a job title, but I think I'm probably fighting a losing battle. Most people know what you mean when you describe managing build and deployment pipelines and infrastructure as code as "DevOps stuff".
Professional things
I was able to attend the Microsoft MVP Summit in person over in Redmond for the first time since the pandemic. A definite privilege to be there, and I still get a huge kick out of visiting a different culture (and while there are similarities, there are also enough differences between Australia and the USA to be interesting).
We ran the DDD Adelaide conference, and I think that was pretty successful. And then two weeks later I got to speak and help out at DDD Brisbane. That was a real bonus.
The Adelaide .NET User Group met regularly throughout the year. It's been great having a team helping organise this with me too.
On the blogging side, I was hoping to get at least one post out each month. It looks like I missed a few months, though some months did have multiple posts. Not too bad.
I am thinking of changing the blog engine I use. Currently using Jekyll, but I'd really like to switch to a .NET-based engine (as I think I'd find that much easier to customise). We'll see if I get something working well enough to migrate to.
I left X/Twitter. I'd stopped using it for quite a few months and didn't really miss it, but decided enough was enough with the antics of the current owner. Account deleted. I'm now on Mastodon and Bluesky.
Technology things
That Apple TV I got the family for Christmas 2023 turned out to be quite a hit. I think at first there was a bit of "what's the point of this?" but it has just become the most common way we watch TV now by all members of the family.
The Synology NAS is just a really useful thing. The backup features came into their own when my parents were travelling overseas and managed to lose a bunch of contacts from their email account. Because I had backups configured I was able to restore the contacts for them.
I think we've finally found a working combination of bits to enable us to record and live stream our ADNUG meetups. It's taken a while, and if there was a single device I could use to replace all the separate things plugged into my laptop, that would make it so much easier, but I haven't found that yet (and I suspect it wouldn't be cheap).
My laptop has been behaving itself since those overheating issues earlier in the year. I've been pretty loyal to Dell in the past, but those kind of reliability issues have raised some questions in my mind. I'll be taking a very close look at the Framework for my next hardware refresh to see how it compares.
Personal things
Probably the biggest highlight of 2024 would have to be my son getting married. He and his fiancée decided they'd prefer a shorter engagement, so getting everything organised for the big day was challenging, but it came off. We're especially grateful to their friends, who put in so much effort to ensure the church and reception were all ready to go. The day was wonderful, everyone had a great time, and it's a really special event we'll always remember.
The garden also got a bit of an overhaul (largely in preparation for the wedding). This included rebuilding a large part of the irrigation system. Originally we had someone lined up to do this but they pulled out at the last minute so I ended up doing it all myself.
Health continues to be a challenge, particularly for family members. I had a few niggles of my own (not at all comparable, to be clear) that put my usual before-work walks and the odd bike ride on pause for a couple of months, but hopefully those are mostly resolved now.
We had a few Sevenfold band gigs this year. To be able to play music with special friends who have known you for ages and are so encouraging is just the best thing.
I've also gotten more involved in the song leading side of things at church (in addition to being on the roster as video camera operator and producer). It's different being in the worship band compared to Sevenfold. With Sevenfold you play with the same group of people all the time, and you can spend a lot of time working on the arrangements for songs. The people in the worship band change week to week, so each time you're rostered on you'll likely be with a different combination of musicians and singers. And usually you only get an hour to practise together early on Sunday morning before the first service starts. I think I'm still learning to be comfortable up the front when we're leading the singing. Given it's a church service, most of the time I feel a responsibility to adopt an attitude of reverence (and keep my 'silly' side in check), but there have been a couple of times when it's been ok to let loose a little bit, and that is fun.
I'm sure I've forgotten some significant things. I'll have to include those in the next post.
Anyway, goodbye 2024. See you all next year!
-
.NET Code Coverage in Azure DevOps and SonarCloud
Sonar offer some really useful products for analysing the quality of your application's source code. There's a great mix of free and paid products, including SonarQube Cloud (formerly known as SonarCloud), SonarQube Server (for on-prem), and SonarQube for IDE (formerly SonarLint) static code analysers for IntelliJ, Visual Studio, VS Code and Eclipse.
I was looking to integrate an Azure DevOps project containing a .NET application with SonarQube Cloud, and in particular to include code coverage data both in Azure Pipelines (so you can view the coverage in the pipeline run) and in SonarQube Cloud.
This process is quite similar if you're using the self-hosted SonarQube Server product, though note that there are different Azure Pipeline tasks provided by a different extension for SonarQube Server.
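For what it's worth, a sketch of the equivalent prepare step from the SonarQube Server extension might look something like this (the task major version and the service connection name here are assumptions on my part; check the extension's documentation for the current details):

```yaml
- task: SonarQubePrepare@5
  inputs:
    # 'SonarQubeServer' is a hypothetical service connection name
    SonarQube: "SonarQubeServer"
    scannerMode: "MSBuild"
    projectKey: "Gardiner_SonarCloudDemo"
    projectName: "SonarCloudDemo"
```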
A sample project can be found at https://dev.azure.com/gardiner/SonarCloudDemo
Prerequisites
- You have a SonarQube Cloud account.
- You've configured it to be integrated with your Azure DevOps organisation.
- You've installed the SonarQube Cloud extension (or the SonarQube Server extension if you're using SonarQube Server).
- You've created a service connection in the Azure DevOps project pointing to SonarQube Cloud.
I've created a .NET solution which contains a simple ASP.NET web application and an xUnit test project.
By default, when you add a new xUnit test project, it includes a reference to the `coverlet.collector` NuGet package. This implements a 'Data Collector' for the VSTest platform. Normally you'd run this via:

```
dotnet test --collect:"XPlat Code Coverage"
```
You would then end up with a `TestResults` subdirectory which contains a `coverage.cobertura.xml` file. But the problem here is that the xml file is one level deeper: VSTest creates a GUID-named subdirectory under `TestResults`. So you will need to go searching for the file; there's no way to ensure it gets created in a known location.

It turns out that's a problem for Sonar, as the SonarCloudPrepare task needs to be told where the code coverage file is located, and unfortunately that property doesn't support wildcards!
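For example, after a test run the coverage file ends up somewhere like this (the GUID below is made up; a new one is generated on every run):

```
Tests/
  TestResults/
    0b4f3c1a-8f2e-4a5d-9c7b-1d2e3f4a5b6c/
      coverage.cobertura.xml
```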
We can solve that problem by removing the reference to `coverlet.collector`, and instead adding a package reference to `coverlet.msbuild`:

```
dotnet remove package coverlet.collector
dotnet add package coverlet.msbuild
```
To collect code coverage information with this package, you run it like this:

```
dotnet test /p:CollectCoverage=true
```

But more importantly, it supports additional parameters, so we can now fix the location of the output files. The `CoverletOutput` property lets us define the directory (relative to the test project) where output files will be written:

```
dotnet test /p:CollectCoverage=true /p:CoverletOutput='./results/coverage' /p:CoverletOutputFormat=cobertura
```
Notice that I've not just set `CoverletOutput` to the directory (`results`), but also the first part of the coverage filename (`coverage`).

In the pipeline task, you can let SonarQube know where the file is by setting `sonar.cs.opencover.reportsPaths` like this:

```yaml
- task: SonarCloudPrepare@3
  inputs:
    SonarQube: "SonarCloud"
    organization: "gardiner"
    scannerMode: "dotnet"
    projectKey: "Gardiner_SonarCloudDemo"
    projectName: "SonarCloudDemo"
    extraProperties: |
      sonar.cs.opencover.reportsPaths=$(Build.SourcesDirectory)/Tests/results/coverage.opencover.xml
```
SonarQube and Azure Pipelines coverage
So now we've solved the problem of where the coverage file will be saved. Can we also deliver the coverage data to both SonarQube and Azure Pipelines?
Let's review what we need to make that happen.
According to the docs for coverlet.msbuild, it supports generating the following formats:
- json (default)
- lcov
- opencover
- cobertura
- teamcity*
(The TeamCity format just generates special service messages in the standard output that TeamCity will recognise; it doesn't create a file.)
According to the docs for SonarCloud, it supports the following formats for .NET code coverage:
- Visual Studio Code Coverage
- dotnet-coverage Code Coverage
- dotCover
- OpenCover
- Coverlet (OpenCover format)
- Generic test data
The docs for the Azure Pipelines PublishCodeCoverageResults@2 task don't actually mention which formats are supported (hopefully this will be fixed soon). But in the blog post that announced the availability of the v2 task the following formats were mentioned (including ones from the v1 task):
- Cobertura
- JaCoCo
- .coverage
- .covx
So unfortunately there isn't a single format that all three components understand. Instead, we will have to ask `coverlet.msbuild` to generate two output files: OpenCover for SonarQube, and Cobertura for Azure Pipelines.

We want to generate two outputs, but there is a known problem with passing a comma-separated list of formats to dotnet test on Linux. The workaround is to set the property in the csproj file instead:
```xml
<PropertyGroup>
  <CoverletOutputFormat>opencover,cobertura</CoverletOutputFormat>
</PropertyGroup>
```
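With `CoverletOutput` set to `results/coverage` and both formats requested, coverlet writes two files side by side, appending the format name to the configured filename:

```
Tests/results/coverage.opencover.xml
Tests/results/coverage.cobertura.xml
```

These are exactly the paths the pipeline below hands to SonarQube and to the PublishCodeCoverageResults task respectively.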
Our Azure Pipeline should look something like this:
```yaml
steps:
  - checkout: self
    fetchDepth: 0

  - task: SonarCloudPrepare@3
    inputs:
      SonarQube: "SonarCloud"
      organization: "gardiner"
      scannerMode: "dotnet"
      projectKey: "Gardiner_SonarCloudDemo"
      projectName: "SonarCloudDemo"
      extraProperties: |
        # Additional properties that will be passed to the scanner, put one key=value per line
        # Disable Multi-Language analysis
        sonar.scanner.scanAll=false
        # Configure location of the OpenCover report
        sonar.cs.opencover.reportsPaths=$(Build.SourcesDirectory)/Tests/results/coverage.opencover.xml

  - task: DotNetCoreCLI@2
    inputs:
      command: build

  - task: DotNetCoreCLI@2
    inputs:
      command: test
      projects: "Tests/Tests.csproj"
      arguments: "/p:CollectCoverage=true /p:CoverletOutput=results/coverage"

  - task: SonarCloudAnalyze@3
    inputs:
      jdkversion: "JAVA_HOME_17_X64"

  - task: SonarCloudPublish@3
    inputs:
      pollingTimeoutSec: "300"

  - task: PublishCodeCoverageResults@2
    inputs:
      summaryFileLocation: "$(Build.SourcesDirectory)/Tests/results/coverage.cobertura.xml"
      failIfCoverageEmpty: true
```
A few things to point out:
- We're doing a full Git clone (not shallow) so that SonarQube can do a proper analysis. This avoids you seeing warnings like this:
- [INFO] SonarQube Cloud: Analysis succeeded with warning: Could not find ref 'main' in refs/heads, refs/remotes/upstream or refs/remotes/origin. You may see unexpected issues and changes. Please make sure to fetch this ref before pull request analysis.
- [INFO] SonarQube Cloud: Analysis succeeded with warning: Shallow clone detected during the analysis. Some files will miss SCM information. This will affect features like auto-assignment of issues. Please configure your build to disable shallow clone.
- Set `sonar.scanner.scanAll=false` to avoid this warning:
- [INFO] SonarQube Cloud: Analysis succeeded with warning: Multi-Language analysis is enabled. If this was not intended and you have issues such as hitting your LOC limit or analyzing unwanted files, please set "/d:sonar.scanner.scanAll=false" in the begin step.
And now we can view our code coverage in SonarQube:
And in Azure Pipelines!
Check out the example project at https://dev.azure.com/gardiner/_git/SonarCloudDemo, and you can view the SonarQube analysis at https://sonarcloud.io/project/overview?id=Gardiner_SonarCloudDemo