In my previous post I explained the basic steps of building a CI pipeline for a Go library with Dagger. Let’s move on to the next level: Go CLI applications.
TL;DR: Check out this repository for a complete example.
Important: As in the previous post, I’m going to focus on Go CLI specific details and will not explain basic Dagger concepts. Please check out the documentation for an introduction to Dagger and my previous post for Go-related examples.
Recap
In my previous post I defined four common categories of CI steps:
- Running various builds and tests in a build matrix (e.g. for multiple Go versions)
- Static analysis and linters
- Publishing analysis results (e.g. code coverage) to a third-party service
- Publishing artifacts
One could argue that the last two categories are actually the same, but they usually serve different audiences, so I prefer listing them separately.
Most steps of the Go library use case fit into the first three categories. While they might still be present in the pipeline for a CLI tool, the most important step for this use case is building and publishing artifacts.
Prior art: GoReleaser
GoReleaser is the standard tool in the Go ecosystem for building and publishing Go CLI applications. It’s capable of building, packaging and publishing CLI applications in several package formats to several artifact stores.
GoReleaser itself is a CLI application (written in Go), which makes it a good candidate for integration into a Dagger pipeline.
To do that, create a new CUE file for the action definition (in ci/goreleaser/release.cue):
// Release Go binaries using GoReleaser
#Release: {
    // Source code
    source: dagger.#FS

    // GoReleaser version
    version: *"1.10.1" | string

    _image: docker.#Pull & {
        source: "index.docker.io/goreleaser/goreleaser:v\(version)"
    }

    go.#Container & {
        name: "goreleaser"
        entrypoint: []
        "source": source
        input:    _image.output

        command: {
            name: "goreleaser"
        }
    }
}
Similarly to other Go-related actions, we use the go.#Container definition to make use of all the caches and mounts set up in it.
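The file itself also needs a package clause and imports for the definitions used above. A minimal sketch (the package name is my assumption; the import paths are the standard Dagger and Universe ones):

package goreleaser

import (
    "dagger.io/dagger"

    "universe.dagger.io/docker"
    "universe.dagger.io/go"
)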
Next, create a new action in your Dagger plan:
release: goreleaser.#Release & {
    source: client.filesystem["."].read.contents

    env: {
        if client.env.GITHUB_TOKEN != _|_ {
            GITHUB_TOKEN: client.env.GITHUB_TOKEN
        }
    }
}
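For context, here is a minimal sketch of the plan this action lives in (the module import path is hypothetical, and declaring GITHUB_TOKEN as an optional dagger.#Secret is my assumption, mirroring the != _|_ check above; see the previous post for a detailed walkthrough of the plan structure):

package main

import (
    "dagger.io/dagger"

    // Hypothetical import path: adjust it to your own module and package layout
    "github.com/example/dagger-go-cli/ci/goreleaser"
)

dagger.#Plan & {
    client: {
        // Read the project source from the current directory
        filesystem: ".": read: contents: dagger.#FS

        // Optional token, so the plan still evaluates when it is not set
        env: GITHUB_TOKEN?: dagger.#Secret
    }

    actions: {
        // The release action from above
        release: goreleaser.#Release & {
            source: client.filesystem["."].read.contents

            env: {
                if client.env.GITHUB_TOKEN != _|_ {
                    GITHUB_TOKEN: client.env.GITHUB_TOKEN
                }
            }
        }
    }
}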
Finally, add and commit all pending changes, tag a new version and run GITHUB_TOKEN=<TOKEN> dagger do release to publish a new release to GitHub.
Custom release pipeline
Looking closer at what GoReleaser does under the hood, we can see that it’s possible to replicate it with a “pure” Dagger pipeline:
- Build the application for multiple targets/platforms (build matrix)
- Package the binaries and additional files (e.g. README.md) into archives/distro packages
- Push the resulting packages to artifact stores (e.g. GitHub Releases)
Build the application
Let’s start with the build step. Similarly to running tests for a library, you can use CUE templating to set up a build matrix that supports multiple OSes and architectures:
build: {
    "linux/amd64":   _
    "darwin/amd64":  _
    "windows/amd64": _

    [platform=string]: go.#Build & {
        source:  client.filesystem["."].read.contents
        package: "."

        os:   strings.Split(platform, "/")[0]
        arch: strings.Split(platform, "/")[1]

        ldflags: "-s -w"

        env: {
            CGO_ENABLED: "0"
        }
    }
}
You can build the application (all variants) with dagger do build.
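The output of each build variant is a dagger.#FS containing the compiled binary, which we will rely on later when assembling the archives. (Note that strings.Split comes from CUE’s built-in strings package, so it has to be imported alongside universe.dagger.io/go.) If you want to inspect the binaries locally, you can write each variant to the filesystem from the client section of the plan; a sketch, with ./build as an arbitrary example target directory:

// Write each compiled binary to ./build/<os>/<arch>
client: filesystem: {
    "./build/linux/amd64":   write: contents: actions.build."linux/amd64".output
    "./build/darwin/amd64":  write: contents: actions.build."darwin/amd64".output
    "./build/windows/amd64": write: contents: actions.build."windows/amd64".output
}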
Packaging
The next step is packaging the binaries and other files (e.g. README.md) into archives.
Since there is no built-in archive action in Dagger Universe, we have to build our own.
First, we need to create an image with the necessary tools (in ci/archive/image.cue):
// Build an archive base image
#Image: {
    alpine.#Build & {
        packages: {
            bash:      _
            coreutils: _
            p7zip:     _
        }
    }
}
Next, we need a definition for an action that creates an archive from a list of files. The action has to be able to create .tar.gz files for Linux and Darwin, and .zip files for Windows:
// Create a new archive
#Create: {
    // Source files for the archive
    source: dagger.#FS

    // Archive name
    name: string

    _image: #Image

    _sourcePath: "/src"

    bash.#Run & {
        input: _image.output

        script: contents: """
            case "\(name)" in
            *.tar.gz | *.tgz)
                tar -czvf \(name) *
                ;;
            *.zip)
                7z a \(name) *
                ;;
            *)
                echo "Unsupported archive type"
                exit 1
                ;;
            esac

            mkdir -p /result
            mv \(name) /result
            """

        workdir: _sourcePath

        mounts: {
            "source": {
                dest:     _sourcePath
                contents: source
            }
        }

        export: directories: "/result": _
    }
}
As a next step, let’s add package building to our Dagger plan!
Binary archives often contain additional files, like README.md, so prepare those files first:
package: {
    _files: core.#Copy & {
        input:    dagger.#Scratch
        contents: client.filesystem["."].read.contents
        include: [
            "README.md",
            "LICENSE",
        ]
    }

    // ...
}
Create the actual archives using the archive.#Create action we defined earlier. The input for each archive is the binary from the build action and the additional files we prepared earlier:
package: {
    // ...

    _archives: {
        "linux/amd64":   _
        "darwin/amd64":  _
        "windows/amd64": _

        [platform=string]: archive.#Create & {
            _source: core.#Merge & {
                inputs: [
                    _files.output,

                    if platform == "darwin/amd64" {
                        build."darwin/amd64".output
                    },
                    if platform == "linux/amd64" {
                        build."linux/amd64".output
                    },
                    if platform == "windows/amd64" {
                        build."windows/amd64".output
                    },
                ]
            }

            source: _source.output

            _os:   strings.Split(platform, "/")[0]
            _arch: strings.Split(platform, "/")[1]

            _type: string | *"tar.gz"
            if _os == "windows" {
                _type: "zip"
            }

            name: "dagger-go-cli_\(_os)_\(_arch).\(_type)"
        }
    }

    // ...
}
Finally, define the output of the package action:
package: {
    // ...

    _packages: core.#Merge & {
        inputs: [
            package._archives."linux/amd64".export.directories."/result",
            package._archives."darwin/amd64".export.directories."/result",
            package._archives."windows/amd64".export.directories."/result",
        ]
    }

    output: _packages.output
}
package.output will contain all three archives for the three supported platforms. You can build them by running dagger do package.
In case you want to check the content of the resulting archives, you can write them to the filesystem by adding the following to the Dagger plan:
client: filesystem: "./_build": write: contents: actions.package.output
Publishing the release on GitHub
The final step is creating a release on GitHub and uploading the archives. The easiest way to do that is using the GitHub CLI.
First (as usual), we need to create an image definition (in ci/github/image.cue):
#Image: {
    version: string | *"2.13.0"

    docker.#Build & {
        steps: [
            docker.#Pull & {
                source: "index.docker.io/alpine:3.15.0@sha256:21a3deaa0d32a8057914f36584b5288d2e5ecc984380bc0118285c70fa8c9300"
            },

            docker.#Run & {
                command: {
                    name: "apk"
                    args: ["add", "bash", "curl", "git"]
                    flags: {
                        "-U":         true
                        "--no-cache": true
                    }
                }
            },

            bash.#Run & {
                script: contents: """
                    apk add curl tar
                    curl -L https://github.com/cli/cli/releases/download/v\(version)/gh_\(version)_linux_amd64.tar.gz | tar -zOxf - gh_\(version)_linux_amd64/bin/gh > /usr/local/bin/gh
                    chmod +x /usr/local/bin/gh
                    """
            },
        ]
    }
}
Then we can create the definition for the release action (in ci/github/release.cue):
#Release: {
    // Source files
    source: dagger.#FS

    // Artifact files
    artifacts: dagger.#FS

    // Create release from this specific tag
    tag: string

    _image: #Image

    _sourcePath:   "/src"
    _artifactPath: "/artifacts"

    bash.#Run & {
        input: *_image.output | docker.#Image

        script: contents: "gh release create --title '\(tag)' \(tag) /artifacts/*"

        workdir: _sourcePath

        mounts: {
            "source": {
                dest:     _sourcePath
                contents: source
            }

            "artifacts": {
                dest:     _artifactPath
                contents: artifacts
            }
        }
    }
}
Finally, we can integrate the new action into our Dagger plan:
release: github.#Release & {
    source:    client.filesystem["."].read.contents
    artifacts: package.output
    tag:       client.env.GITHUB_REF_NAME

    env: {
        if client.env.GITHUB_TOKEN != _|_ {
            GITHUB_TOKEN: client.env.GITHUB_TOKEN
        }
    }
}
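For these lookups to resolve, the client section of the plan also has to declare both environment variables. A minimal sketch (marking GITHUB_TOKEN as an optional dagger.#Secret is my assumption, matching the != _|_ check above):

client: env: {
    // The tag to release: set by GitHub Actions, or manually for local runs
    GITHUB_REF_NAME: string

    // Token for the GitHub CLI; optional so the plan still evaluates without it
    GITHUB_TOKEN?: dagger.#Secret
}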
You can run GITHUB_REF_NAME=SOME_TAG GITHUB_TOKEN=<TOKEN> dagger do release locally to create a new release, or run it on GitHub Actions.
Conclusion
GoReleaser is the standard tool for publishing Go CLI applications and it can easily be integrated into a Dagger pipeline. Depending on the complexity of the pipeline, it might not be worth the effort compared to just running GoReleaser on GitHub Actions, but Dagger can still be useful when it comes to debugging pipelines.
Alternatively, you can choose to build your pipeline without GoReleaser which gives you more control, but requires (a lot) more work that may not be worth the investment for small projects.
Personally, I’m looking forward to seeing which pattern becomes more popular in the Dagger community: using ready-made tools (like GoReleaser, MegaLinter, etc.) integrated into Dagger, or writing “pure” Dagger pipelines using lower-level tools.
Make sure to check out the complete example for more details.