A Bitbucket pipeline for PowerShell scripts
Using Bitbucket Pipelines to automate the publishing of PowerShellGet script packages to multiple release channels.
Published in Articles on by Michiel van Oosterhout ~ 4 min read
This article demonstrates an implementation of the Git-based workflow described in an earlier article. The workflow, implemented using a Bitbucket pipeline, will publish a PowerShell script package to a prerelease and a release channel. We will use the same script described previously, with a minor modification.

Git repository layout
The code for the script package will be hosted in a dedicated Bitbucket repository. The script file and its unit tests file should be added to the root of the repository. A Bitbucket pipeline is defined in a YAML file named bitbucket-pipelines.yml, which must be saved in the root of the repository.
bitbucket-pipelines.yml
Example.ps1
Example.Tests.ps1
Publish-PowerShellScript.ps1
(The Publish-PowerShellScript.ps1 file is described in detail in a previous article.)
Bitbucket pipeline
Docker image
Bitbucket pipelines run inside a Docker container, which by default is based on the atlassian/default-image:latest image. But we need PowerShell, so we use Microsoft's official PowerShell image.
image: mcr.microsoft.com/powershell:7.2-ubuntu-22.04
You could also create a custom Docker image that contains all the software required by the script, and that even uses pwsh as the default shell. But that is outside the scope of this article.
Checkout
We configure the checkout to clone the Git repository with --depth 2, a requirement of the script.
clone:
  depth: 2
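A quick sketch of what depth: 2 buys us: the shallow clone contains HEAD and its parent, which is what a script needs to inspect the most recent commit (an assumption about why the publish script requires it; the details are in the earlier article). This can be reproduced locally:

```shell
# Create a throwaway "remote" with three commits.
git init -q upstream
git -C upstream -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "one"
git -C upstream -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "two"
git -C upstream -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "three"

# A shallow clone with --depth 2 (the file:// URL is required for shallow local clones)
# contains exactly HEAD and its parent.
git clone -q --depth 2 "file://$PWD/upstream" shallow
git -C shallow rev-list --count HEAD   # prints "2"
```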
Publish
Bitbucket Pipelines supports YAML anchors as a way to compose complex pipelines out of reusable parts. Each pipeline will reuse this publish step:
definitions:
  steps:
    - step: &publish
        script:
          - apt-get update && apt-get install -y git dotnet6
          - |
            pwsh -c '
            Set-PackageSource PSGallery -Trusted > $null
            Install-Module Pester
            Install-Module PSScriptAnalyzer'
          - |
            pwsh -c '
            . ./Publish-PowerShellScript.ps1
            Publish-PowerShellScript `
              -Name "Example" `
              -Ref ($Env:BITBUCKET_TAG ?? $Env:BITBUCKET_BRANCH) `
              -Build $Env:BITBUCKET_BUILD_NUMBER `
              -NuGetApiKey "$Env:NUGET_API_KEY" `
              -NuGetUrl "$Env:NUGET_URL"'
Since our Docker image is missing a few required software packages and PowerShell modules, the first two script entries install these requirements. Because our image is Ubuntu-based, we can use apt-get to install Git and the .NET CLI. Notice that we need to invoke pwsh to run PowerShell commands, since the Docker image uses Bash as its default shell.
The last entry executes the script. Similar to the Azure DevOps pipeline described in the previous article, we must pass a value to the -Name parameter, because otherwise the parameter would take its value from the name of the current working directory, which in a Bitbucket pipeline is /opt/atlassian/pipelines/agent/build/.
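This is easy to verify: deriving a name from that working directory would yield "build", not the script's name.

```shell
# The last path segment of the pipeline's working directory is "build",
# which is why -Name must be passed explicitly.
basename /opt/atlassian/pipelines/agent/build/   # prints "build"
```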
The value of the -Ref parameter is the current version tag or current branch (see the section on triggers below). The script uses it to determine the correct versioning strategy.
The value of the -Build parameter is an incrementing number that uniquely identifies the run; the script uses it to generate an incrementing prerelease label for each iteration of a pull request.
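A hypothetical sketch of how such a label can be derived from the build number (the actual label format is defined by Publish-PowerShellScript.ps1 in the earlier article): zero-padding keeps the alphanumeric prerelease labels sorting in build order.

```shell
# BITBUCKET_BUILD_NUMBER is set by Bitbucket Pipelines; simulated here.
# The label name "beta" and the padding width are illustrative assumptions.
BITBUCKET_BUILD_NUMBER=42
printf 'beta%04d\n' "$BITBUCKET_BUILD_NUMBER"   # prints "beta0042"
```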
The values for both -NuGet* parameters come from the environment configured for each trigger (see the section on triggers below). This way the script does not need to know about any specific release channel.
Failed tests report
Bitbucket Pipelines has built-in support for reporting failed unit tests, which requires test data in the JUnit format.
after-script:
  - |
    pwsh -c '
    New-Item test-results -Force -Type Directory > $null
    $transform = [System.Xml.Xsl.XslCompiledTransform]::new();
    $transform.Load("https://raw.githubusercontent.com/jenkinsci/nunit-plugin/ec8e9079/src/main/resources/hudson/plugins/nunit/nunit-to-junit.xsl");
    $transform.Transform("./artifacts/tests.xml", "./test-results/tests.xml")'
artifacts:
  - test-results/tests.xml
The after-script, which runs even if the preceding script failed, transforms the NUnit test data to the JUnit format and places it in the test-results directory. By declaring this file an artifact, Bitbucket Pipelines will automatically report any failed tests.

Triggers
The pipeline should be triggered when commits are pushed to the main branch, when a pull request is (re)opened or its source branch is updated, and when a semantic version tag (e.g. v1.2.3) is created.
pipelines:
  branches:
    main:
      - step:
          <<: *publish
          name: Publish to MyGet
          deployment: MyGet
  pull-requests:
    '**':
      - step:
          <<: *publish
          name: Publish to MyGet
          deployment: MyGet
  tags:
    v*.*.*:
      - step:
          <<: *publish
          name: Publish to PowerShell Gallery
          deployment: PowerShell Gallery
Bitbucket Pipelines requires us to declare a separate pipeline per trigger. The use of YAML anchors allows us to re-use the publish step declared earlier in each pipeline.
Each step selects a predefined environment, similar to environments in GitHub repositories. Each environment can have (secret) variables associated with it, which become available to steps as environment variables. This is how we use environments to represent release channels.
We use wildcards (called glob patterns) to limit the tags that may trigger our pipeline. Wildcard support for triggers in Bitbucket Pipelines is limited compared to GitHub Actions: where GitHub supports a subset of regular expressions (e.g. v[0-9]+.[0-9]+.[0-9]+), Bitbucket only supports * to match zero or more characters except /, and ** to also match /.
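The v*.*.* trigger can be approximated with a bash case pattern (an approximation: it behaves the same for tag names, which contain no /, but bash's * would also match / where Bitbucket's refuses to):

```shell
# Which refs would the v*.*.* trigger match?
matches() { case "$1" in v*.*.*) echo match ;; *) echo "no match" ;; esac; }
matches v1.2.3   # prints "match"
matches v1.2     # prints "no match" (only two segments)
matches main     # prints "no match"
```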
Summary
Triggers and pipelines in Bitbucket Pipelines can be used to automate the publishing of PowerShellGet script packages to multiple release channels. YAML anchors allow reuse and composition of steps. By targeting a specific environment, each pipeline ensures that the script remains unaware of release channels.