Published in Articles by Michiel van Oosterhout ~ 6 min read

Recent articles demonstrated how to publish a PowerShell script package using the automation features (typically called pipelines) of 4 popular Git hosting services: GitHub, Azure DevOps, Bitbucket, and GitLab. The approach assumed a dedicated Git repository for the PowerShell script package. But what if you want to use a single Git repository for multiple PowerShell script packages? This article explains the changes to the pipeline definitions that are required to make that work. The publish script remains unchanged.

General

Regardless of the platform that hosts the Git repository and runs the pipelines, a general approach is recommended for the repository layout and for namespacing version tags.

Git repository layout

Each PowerShell script package will be published from its own (preferably top-level) directory in the Git repository. So given 2 scripts, Example1.ps1 and Example2.ps1, the repository layout should look like this:

  • Example1/
    • Example1.ps1
    • Example1.Tests.ps1
  • Example2/
    • Example2.ps1
    • Example2.Tests.ps1

A pipeline for one of these scripts should filter its triggers to the corresponding path, and ensure the publish script runs in that directory.
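
For reference, each platform-specific pipeline below effectively performs the equivalent of the following local run. This is a minimal sketch: it assumes the shared Publish-PowerShellScript.ps1 from the original articles sits at the repository root (which is why the pipelines dot-source it from the parent directory), and it uses placeholder parameter values:

# Local equivalent of what a pipeline does for Example1 (placeholder values)
Set-Location Example1                  # run from the package's own directory
. ../Publish-PowerShellScript.ps1      # dot-source the shared publish script from the repository root
Publish-PowerShellScript `
    -Ref v1.2.3 `
    -Build 1 `
    -NuGetApiKey '<api key>' `
    -NuGetUrl '<NuGet feed URL>' `
    -InformationAction Continue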

Tag prefixes

Filtering by changes made to a certain directory is only relevant for pipelines triggered by commits. Pipelines triggered by tags require the tag itself to carry the information needed to determine which PowerShell script the tag is meant for. Any naming pattern can work, but for consistency with the repository layout a pattern with a path-like prefix is probably the most intuitive: Example1/v1.2.3.
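
As an illustration (the tag and package names are just examples), such a tag is created and pushed like any other Git tag, and the pipelines later strip the prefix again to recover the plain semantic version, as shown in the Replace calls further below:

# Create and push a prefixed version tag for the Example1 package
git tag Example1/v1.2.3
git push origin Example1/v1.2.3

# The pipelines strip the prefix to get back the plain version tag
'Example1/v1.2.3'.Replace('Example1/', '')   # -> 'v1.2.3'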

GitHub Actions

We will need one workflow per PowerShell script package (the caller workflows), and one re-usable workflow (the callee workflow) that is called by the caller workflows. The workflow files should be added to the repository like this:

  • .github/
    • workflows/
      • Example1.yaml
      • Example2.yaml
      • publish-powershell-script.yaml

The caller workflows Example1.yaml and Example2.yaml define the triggers and filters such that each caller workflow only triggers for one specific PowerShell script package. The only job of a caller workflow is to call the re-usable workflow, passing the path from which to publish the PowerShell script package, and any secrets.

on:
  push:
    branches:
      - main
    tags:
      - Example1/v[0-9]+.[0-9]+.[0-9]+
    paths:
      - Example1/**
  pull_request:
    paths:
      - Example1/**

jobs:
  publish-powershell-script:
    uses: ./.github/workflows/publish-powershell-script.yaml
    with:
      path: Example1
    secrets: inherit

The re-usable workflow is almost the same as the one shown in the original article, except for 2 changes. The first change is the replacement of its triggers by a workflow_call trigger that requires a path input:

on:
  workflow_call:
    inputs:
      path:
        required: true
        type: string

The second change is in the Publish step, where the value of the path input is used to set the working-directory (the publish script is sourced from its parent directory), and to remove the prefix from the semantic version tag (e.g. Example1/v1.2.3 becomes v1.2.3):

      - name: Publish
        working-directory: ${{ inputs.path }}
        run: |
          . ../Publish-PowerShellScript.ps1
          Publish-PowerShellScript `
            -Ref ( "${{ github.event_name }}" -eq "pull_request" ? "${{ github.head_ref }}" : "${{ github.ref_name }}".Replace("${{ inputs.path }}/", "")) `
            -Build ${{ github.run_number }} `
            -NuGetApiKey "${{ secrets.NUGET_API_KEY }}" `
            -NuGetUrl "${{ secrets.NUGET_URL }}" `
            -InformationAction Continue

Azure DevOps pipelines

We will need one pipeline per PowerShell script package, and an extendable template. The files should be added to the repository like this:

  • Example1/
    • azure-pipeline.yaml
  • Example2/
    • azure-pipeline.yaml
  • publish-powershell-script.yaml

With Azure DevOps, pipelines have to be created manually through the user interface. It is recommended for the pipelines to be added to a folder that corresponds with the repository name, and for each pipeline to be named after the directory it is in.
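
The same result can also be scripted. This is only a hedged sketch using the Azure DevOps CLI extension (az devops): the repository and folder names are placeholders, and the exact flag names should be verified against the extension's documentation:

# Assumed az devops CLI usage; verify command and flag names before relying on this
az pipelines folder create --path '\my-repository'
az pipelines create `
    --name Example1 `
    --folder-path '\my-repository' `
    --repository my-repository `
    --repository-type tfsgit `
    --branch main `
    --yml-path Example1/azure-pipeline.yaml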

The pipelines define the triggers and filters such that each pipeline only triggers for one specific PowerShell script package. Each pipeline extends from a template, setting the template's path parameter to the path from which to publish the PowerShell script package.

trigger:
  branches:
    include:
      - main
  tags:
    include:
      - Example1/v*.*.*
  paths:
    include:
      - Example1/**

extends:
  template: ../publish-powershell-script.yaml
  parameters:
    path: Example1

The template is almost the same as the pipeline shown in the original article, except for 3 changes. The first change is the replacement of its triggers by a parameters section that declares the path parameter:

parameters:
  - name: path
    type: string

The second change is in the variables section, where the value of the path parameter is used to remove the prefix from the semantic version tag (e.g. Example1/v1.2.3 becomes v1.2.3):

  - ${{ elseif startsWith(variables['Build.SourceBranch'], 'refs/tags/') }}:
    - group: PowerShell Gallery
    - name: Ref
      value: ${{ replace(variables['Build.SourceBranchName'], format('{0}/', parameters.path), '') }}

The 3rd and last change is in the Publish step, where the value of the path parameter is used to set the workingDirectory. The publish script is then sourced from its parent directory.

      - pwsh: |
          . ../Publish-PowerShellScript.ps1
          Publish-PowerShellScript `
              -Ref "$(Ref)" `
              -Build $(Build.BuildId) `
              -ArtifactsPath "$(Build.ArtifactStagingDirectory)" `
              -NuGetApiKey "$(NuGetApiKey)" `
              -NuGetUrl "$(NuGetUrl)" `
              -InformationAction Continue
        displayName: Publish
        workingDirectory: ${{ parameters.path }}

Bitbucket pipelines

Bitbucket does not support multiple pipelines for different subdirectories.

GitLab CI/CD

GitLab supports hierarchical pipeline execution, where one pipeline can trigger another pipeline. The triggered pipeline is called a downstream pipeline, and when the downstream pipeline is defined in the same repository it is a child pipeline. The .gitlab-ci.yml file contains the parent pipeline, with one job for each PowerShell script package:

stages:
  - triggers

Example1:
  stage: triggers
  trigger:
    include: publish-powershell-script.yml
    strategy: depend
  rules:
    - changes:
      - Example1/**
  variables:
    POWERSHELL_PACKAGE: Example1
    TAG_PREFIX_PATTERN: /^Example1\/v\d+\.\d+\.\d+$/

Example2:
  needs:
    - job: Example1
      optional: true
  stage: triggers
  trigger:
    include: publish-powershell-script.yml
    strategy: depend
  rules:
    - changes:
      - Example2/**
  variables:
    POWERSHELL_PACKAGE: Example2
    TAG_PREFIX_PATTERN: /^Example2\/v\d+\.\d+\.\d+$/

The original .gitlab-ci.yml file has been renamed to publish-powershell-script.yml and is included by each trigger job in the parent pipeline using the trigger:include keyword. The use of strategy:depend and needs makes the jobs in the parent pipeline run sequentially, which prevents race conditions when multiple child pipelines use the same deployment environment (the needs reference is marked optional because the Example1 job is not created when only Example2 changed). The rules:changes keyword ensures each job in the parent pipeline is only created when there are changes under the corresponding path. The POWERSHELL_PACKAGE and TAG_PREFIX_PATTERN variables tell the child pipeline which package it should run for. This approach is similar to the parent-child pipeline architecture described in GitLab's documentation.

The included file is almost the same as the one shown in the original article, except that the POWERSHELL_PACKAGE and TAG_PREFIX_PATTERN variables are used to ensure the child pipeline runs for the correct PowerShell script package.

First, the POWERSHELL_PACKAGE variable is used to change the current working directory before running the publish script:

.publish: &publish
  stage: publish
  script:
    - cd $POWERSHELL_PACKAGE

Second, the POWERSHELL_PACKAGE variable is used to change the current working directory before running the script that creates the test report for GitLab CI/CD:

  after_script:
    - cd $POWERSHELL_PACKAGE

Third, the POWERSHELL_PACKAGE variable is used to specify the path to the test report artifact:

  artifacts:
    reports:
      junit: $POWERSHELL_PACKAGE/artifacts/junit.xml

Fourth, the TAG_PREFIX_PATTERN variable is used to limit the tag job to version tags that are specific to one PowerShell script package:

tag:
  <<: *publish
  environment: PowerShell Gallery
  rules:
    - if: $CI_COMMIT_TAG =~ $TAG_PREFIX_PATTERN

And finally, in script the POWERSHELL_PACKAGE variable is used to remove the prefix from the semantic version tag (e.g. Example1/v1.2.3 becomes v1.2.3):

    - |
      pwsh -c '
          . ../Publish-PowerShellScript.ps1
          Publish-PowerShellScript `
              -Ref ($Env:CI_MERGE_REQUEST_SOURCE_BRANCH_NAME ?? $Env:CI_COMMIT_BRANCH ?? $Env:CI_COMMIT_TAG.Replace("$Env:POWERSHELL_PACKAGE/", "")) `
              -Build $Env:CI_PIPELINE_IID `
              -NuGetApiKey "$Env:NUGET_API_KEY" `
              -NuGetUrl "$Env:NUGET_URL"'

Summary

A single repository can be used to maintain multiple PowerShell scripts. Publishing script packages from such a repository requires a slightly different approach than publishing from a dedicated repository. The automation features of GitHub and Azure DevOps lend themselves quite naturally to this scenario, with caller/callee workflows and extendable templates respectively. Bitbucket does not support this scenario at all, and GitLab requires a comparatively elaborate parent-child pipeline setup that is not an ideal fit.