- Release process
- Context
- Prepare your checklist
- TRAP: whenever you go AFK for long times
- Requirements
- Environment
- Pre-freeze
- Sanity check
- Freeze
- Set base branch
- Manual testing coordination
- Update included files
- If preparing a final release
- Update other base branches
- Update more included files
- Enable OpenPGP signing
- Build the almost-final images
- SquashFS file order
- Build images
- Generate the OpenPGP signatures and Torrents
- Get the internal manual testing started
- Prepare incremental upgrades
- Check that incremental upgrades are working
- Done with OpenPGP signing
- Upload images
- Run the automated test suite
- Public testing
- Lead the internal manual testing to conclusion
- Notify the Trusted Reproducer
- Prepare announcements and blog posts
- Go wild!
- Prepare for the next development cycle
- Related pages
See the release schedule.
Context
Release types
There are exactly 3 types of releases:
A bugfix release
A major release
A release candidate. A release candidate can only be made for a major release.
We will not cover beta releases in this document.
Both a bugfix release and a major release are called "final". A release candidate is not a final release.
Sections that only apply to some release types say so clearly in their title.
If you want to know whether the release you are doing is a major or not, you need to check our release calendar, and use these criteria:
if nothing is specified about being a major, then it's most probably a bugfix release
if it is a major, there has to be a release candidate before the final
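A sketch of how these criteria map onto the `$DIST` and `$MAJOR_RELEASE` variables used by the sanity checks later in this checklist (the function name is invented for illustration):

```shell
# Sketch: classify the release type from the DIST and MAJOR_RELEASE
# variables that the per-release-type sanity checks below rely on.
release_type() {
  if [ "${DIST:?}" = alpha ]; then
    echo "release candidate"
  elif [ "${MAJOR_RELEASE:?}" -eq 1 ]; then
    echo "major"
  else
    echo "bugfix"
  fi
}
DIST=stable MAJOR_RELEASE=0 release_type  # prints: bugfix
```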
Plan your journey
The release process is long. With proper planning and some luck, it's almost enjoyable.
Here are some useful tips that will hopefully help you have a pleasant release process:
if time allows, try to unblock other people's work. For example, reaching the "Get the internal manual testing started" step on Monday would be great, because it allows manual testers to plan their job better.
when you are clocking off, or just spending some time AFK, consider keeping your CPU busy with some long-running jobs. Here are some good candidates:
- Building the final image
- Building IUKs
- Running the automated test suite
cross your fingers 🤞
Prepare your checklist
Merge the current `master` branch into the branch used to prepare the release (`stable` for a bugfix release, `testing` for a major release, `devel` for a RC).

Copy this very file from the branch used to prepare the release to some place outside of Git.
Close the version of this file that's in Git.
From now on, follow the instructions in your own local copy, using it as a checklist: delete steps once you've successfully completed them. This ensures you won't mistakenly skip a step, without relying on memory, so "oops, I forgot X" cannot happen.
Remove useless sections. Some sections of this checklist only apply to specific release types (RC, Major, Bugfix, etc.). While you can skip them just-in-time, it's easier and less error-prone to do that now that you have a fresh mind. You can search for
^##* If
.
TRAP: whenever you go AFK for long times
Keep this section at the top of this copy of the release process, so you don't forget
When you clock off, or take any other kind of long break, please communicate this on XMPP. Clarify:
when do you expect to come back
what's the status right now
try to give an ETA (or clarify that you cannot) for reaching these steps:
When will the final image be available?
When will manual testers be able to start their work?
Requirements
Packages
To release Tails you'll need some packages installed:
oathtool gitlab-cli j2cli intltool jq tidy libnotify-bin mktorrent podman pwgen python3-bs4 python3-debian python3-git python3-gitlab python3-jinja2 python3-html2text python3-sh python3-toml python3-voluptuous transmission-cli moreutils wl-clipboard rename i18nspector
Note: if the Debian version you are running does not include all required Go packages, such as podman, you should be able to install them from a more recent Debian release with APT pinning.
- `perl5lib` dependencies
- `iuk` dependencies
- `squashfs-tools-ng` and `libsquashfs1` from Bookworm
- `po4a` 0.62:
  - Different versions extract Markdown headings in a different way, which makes tons of strings fuzzy.
  - Different versions emit different sets of message flags, some of which are not supported by `i18nspector` yet.
- packages to build a local version of the website
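The `po4a` version pin above can be sanity-checked with a sketch like this one (the `dpkg-query` invocation mirrors the `squashfs-tools-ng` check later in this process; treating any 0.62 Debian revision as acceptable is an assumption):

```shell
# Warn when the installed po4a is not the pinned 0.62
# (the "0.62*" match pattern is an assumption).
check_po4a_version() {
  case "$1" in
    0.62*) echo "po4a version OK" ;;
    *)     echo "WARNING: po4a $1, expect fuzzy strings and flag mismatches" ;;
  esac
}
check_po4a_version "$(dpkg-query --showformat '${Version}' --show po4a 2>/dev/null || echo unknown)"
```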
System configuration
Enable unprivileged user namespaces
If you're running a Debian kernel older than 5.10.1-1~exp1, set the `kernel.unprivileged_userns_clone` sysctl to 1:

    echo 'kernel.unprivileged_userns_clone=1' \
      | sudo tee /etc/sysctl.d/unprivileged-user-namespaces.conf \
      && sudo sysctl --system
Configuration files
To release Tails you need:
- The correct version of `python3-gitlab`.
- `~/.python-gitlab.cfg`, with at least this content:

      [global]
      ssl_verify = true

      [TailsRM]
      url = https://gitlab.tails.boum.org
      per_page = 100
      private_token = XXX

      [Tails]
      url = https://gitlab.tails.boum.org
      per_page = 100
      private_token = XXX
Then:

- In the `TailsRM` section, set the value of the `private_token` option to the `role-release-automation` GitLab user's API token, which you'll find in `rm.git`'s keyringer. Test it with:

      python-gitlab --gitlab TailsRM -o json project-issue get --project-id 3 --iid 18122 | jq .title

- In the `Tails` section, set the value of the `private_token` option to a GitLab API token for your own user. Test it with:

      python-gitlab --gitlab Tails -o json project-issue get --project-id 3 --iid 18122 | jq .title
`~/.config/tails/release_management/local.yml`

- If you have no such file yet, copy `config/release_management/examples/local.yml` there and adjust it to your local environment.
- Optional: configure the automailer by running `./bin/automailer.py --long-help`, which will give you tips on how to configure it. This will make sending emails a bit easier.
Environment
To be able to copy'n'paste the code snippets found on this page, you need to set a bunch of environment variables.
Generate the base environment
To generate the `~/.config/tails/release_management/current.yml` template, run:

    ./bin/rm-config generate-boilerplate

Edit `~/.config/tails/release_management/current.yml` and replace `FIXME`:s:

    "${EDITOR:?}" ~/.config/tails/release_management/current.yml

Ensure there's a `iuks/v2` directory under the directory you specified for `isos` in `~/.config/tails/release_management/local.yml`:

    python3 <<EOF
    import os
    import yaml
    from pathlib import Path
    fpath = Path(os.path.expanduser("~/.config/tails/release_management/local.yml"))
    isos = Path(yaml.safe_load(fpath.open())["isos"])
    iuks = isos / "iuks/v2"
    if not iuks.exists():
        raise Exception("IUKS dir '%s' not found" % iuks)
    EOF
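If that check fails because the directory is missing, creating it is enough. A minimal sketch (assumption: `ISOS` holds the `isos` path from `local.yml`; here it falls back to a temporary directory so the snippet is self-contained):

```shell
# In practice ISOS is the "isos" directory from local.yml; the mktemp
# fallback only makes this sketch runnable anywhere.
ISOS="${ISOS:-$(mktemp -d)}"
mkdir -p "${ISOS:?}/iuks/v2"
```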
Generate the resulting environment variables and export them into your environment:
. $(./bin/rm-config generate-environment --stage base)
Notes
Unless the release process explicitly instructs you to change the value of one such variable, treat it as a constant; otherwise inconsistency will arise, which can cause trouble later on.

Regarding version numbers, what follows supports just fine the case when we do something other than alternating bugfix and major releases consistently. For example, if the next two releases are bugfix ones, do not set `$NEXT_PLANNED_MAJOR_VERSION` to one of these bugfix releases. Instead, set it to the version number of the next major release.

The `$NEXT*VERSION` constants are used for only two types of operations: preparing upgrade-description files and adding changelog entries. These two types of operations have to be consistent with each other: for example, if one adds a dummy entry for version X in a changelog, an UDF must exist for version X as well… hence the use of shared constants to encode the values that must be the same on both sides :)
Pre-freeze
Check the Release Manager role
Make sure all the shift preparation steps, in the release manager role documentation, have been done. Fill the gaps if needed, otherwise the release process will be blocked later.
Coordinate with Debian security updates
Look at the list of upcoming Debian Security Advisories (DSA) found in the Debian security tracker's Git repository and in the debian-security-announce mailing list.
Rationale: we don't want to release Tails and learn 2 days later about important security issues fixed by Debian security updates. In some cases, it may be worth delaying a Tails release a bit, while waiting for a DSA to happen.
Sanity check
Visit the Jenkins RM view and check that the jobs for the release branch have good enough results:

- the last build and reproducibility testing builds have succeeded for:
  - if you're preparing a final release: the release branch
  - else, if you're preparing a release candidate: the `devel` branch
- look at the last build of the corresponding `test_Tails_ISO_*` job; if anything looks unusual or fishy, check the few previous builds to see if it's a one-off robustness issue, a serious regression, or what not. In particular:
  - Check the results of Scenario: All packages are up-to-date. If it does not pass, evaluate whether the problems fixed in the missing package update are worth delaying the release. If not, create an issue to track them.
Freeze
This section ensures freezing is done. Freezing is relevant for Release Candidates and for bugfix releases. You can skip this if you're releasing a final major version.
If preparing a RC
Check that you are indeed preparing a RC:
[ "${DIST:?}" = alpha ]
Follow this part ONLY if you are preparing the release candidate for a major release.
Merge the `master` Git branch into `devel`:

    git checkout devel && git fetch origin && git merge origin/devel && git merge --no-ff origin/master

Merge each APT overlay suite listed in the `devel` branch's `config/APT_overlays.d/` into the `devel` APT suite:

    ./bin/merge-APT-overlays devel

Merge the `devel` Git branch into the `testing` one:

    git checkout testing && git merge origin/testing && git merge devel

… and check that the resulting `config/APT_overlays.d/` in the `testing` branch is empty.

Hard reset the `testing` custom APT suite to the current state of the `devel` one:

    ./bin/reset-custom-APT-suite testing devel

Freeze the time-based APT repository snapshots that shall be used during the code freeze, and those used by our builder VM:

    ./bin/freeze-all-APT-snapshots

Make it so the time-based APT repository snapshots are kept around long enough, by bumping their `Valid-Until` half a year from now:

    git checkout "$RELEASE_BRANCH" && \
      ./bin/bump-APT-snapshots-expiration-date 183
If preparing a Bugfix release
Check that you are indeed preparing a Bugfix:
[ "${DIST:?}" = stable ] && [ "${MAJOR_RELEASE:?}" -eq 0 ]
Follow this part ONLY if you are preparing a bugfix release.
Merge the `master` Git branch into `stable`:

    git checkout stable && git fetch && git merge origin/stable && git merge --no-ff origin/master

Merge each APT overlay suite listed in the `stable` branch's `config/APT_overlays.d/` into the `stable` APT suite:

    ./bin/merge-APT-overlays stable
In any case
Look at tails/tails-private. If anything was merged there during the current development cycle, you need to merge it as well.
Set base branch
Reset the release branch's `config/base_branch`:
(
echo "${RELEASE_BRANCH:?}" > config/base_branch
git commit config/base_branch \
-m "Restore ${RELEASE_BRANCH:?}'s base branch." || :
)
Manual testing coordination
Bootstrap internal manual testing coordination:

- To create a new pad, visit https://pad.tails.net/new
- To give the pad a clear title, add this section on top:

      cat <<EOF
      ---
      title: Testing ${VERSION:?}
      ---
      EOF

- Set the `PAD` environment variable to the URL of this pad.
- Copy the manual test suite into this pad.
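For instance (the pad name below is a placeholder, not a real pad):

```shell
# Placeholder URL: substitute the pad you just created.
export PAD="https://pad.tails.net/p/XXXXX"
```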
Generate boilerplate for the call for manual testing:
    ./bin/generate-call-for-manual-testers \
      --dist "${DIST:?}" \
      --pad "${PAD:?}" \
      --version "${VERSION:?}" \
      | ./bin/automailer.py
Send the call for manual testing:
- encrypt and sign your email
- use the headers and body generated above
- adjust the body as needed
Update included files
Upgrade Tor Browser
If you did not import the new Tor Browser yet, do so now.
Update PO files
Expected time: 15-30 minutes depending on how many errors you'll get.
Pull updated translations for languages translated in Transifex, refresh the code PO files, and commit the result, including new PO files:
(
cd "${RELEASE_CHECKOUT:?}"
set -x
err() { echo "Error $*"; exit 1; }
./import-translations || err "importing translations"
./refresh-translations || err "refreshing translations"
./submodules/jenkins-tools/slaves/lint_po || err "in lint"
./bin/check-po-msgfmt || err "in msgfmt"
git add po
git commit -m 'Update PO files.' || :
)
If `lint_po` complains:

- roll back the offending PO files and retry; worst case, delete them
- send a note to tails-l10n@boum.org [public] so that they get in touch with whoever can fix them.
Regenerate the localization list:
./bin/locale-descriptions generate > config/chroot_local-includes/usr/share/tails/browser-localization/descriptions
Take a look at error lines suggesting that you add more languages to `po/po-to-mozilla.toml`. Copy-paste everything into a new issue (title: "Update mozilla locale descriptions", content: https://tails.net/contribute/release_process/update_locale_descriptions/) and move on.
Manually inspect the diffs:
git diff config/chroot_local-includes/usr/share/tails/browser-localization/descriptions
- Line reordering is fine.
- The only lines added or removed should reflect the addition or removal of a file with a matching name in `po/`. For example, if you see `zz-YY:YY` added in `config/chroot_local-includes/usr/share/tails/browser-localization/descriptions`, it must be because of a `po/zz_YY.po` which includes a line `"Language: zz_YY\n"`, or a `po/zz.po` with `"Language: zz\n"` if a `zz="zz_YY"` mapping exists in `po/po-to-mozilla.toml`.
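The `zz_YY` to `zz-YY` correspondence described above amounts to swapping the underscore for a hyphen. An illustrative sketch (the real logic, including the `po/po-to-mozilla.toml` overrides, lives in `bin/locale-descriptions`):

```shell
# Convert a PO language code (zz_YY) into the Mozilla-style locale
# (zz-YY) used in the descriptions file. Overrides from
# po/po-to-mozilla.toml are not handled here.
po_to_mozilla_locale() {
  echo "$1" | sed -E 's/^([a-z]+)_([A-Z]+)$/\1-\2/'
}
po_to_mozilla_locale zz_YY  # prints: zz-YY
po_to_mozilla_locale fr     # prints: fr
```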
Commit the changes (if any):
git commit -m "Refresh localization list" config/chroot_local-includes/usr/share/tails/browser-localization/descriptions || :
If preparing a final release
If we're about to prepare the images for a final (non-RC) release, then follow these instructions:
If preparing a Major release
Check that you are indeed preparing a major release:
[ "${DIST:?}" = stable ] && [ "${MAJOR_RELEASE:?}" -eq 1 ]
Merge each APT overlay suite listed in the `testing` branch's `config/APT_overlays.d/` into the `testing` custom APT suite:
./bin/merge-APT-overlays testing
If preparing a Bugfix release
Check that you are indeed preparing a bugfix release:
[ "${DIST:?}" = stable ] && [ "${MAJOR_RELEASE:?}" -eq 0 ]
Merge each APT overlay suite listed in the `stable` branch's `config/APT_overlays.d/` into the `stable` custom APT suite:
./bin/merge-APT-overlays stable
Update other base branches
Merge the release branch into `devel`:

    ./bin/merge-main-branch "${RELEASE_BRANCH:?}" devel
If you did not proceed with the merge, it means you're in a non-standard situation, and need to investigate what's appropriate to do.
Thaw, on the devel branch, the time-based APT repository snapshots being used during the freeze:
    git checkout devel && \
      ./auto/scripts/apt-snapshots-serials thaw && \
      git commit \
        -m "Thaw APT snapshots." \
        config/APT_snapshots.d/*/serial || :
It's fine if that results in a no-op (it depends on how exactly previous operations were performed).
Look at the APT overlays for `$RELEASE_BRANCH` and `devel`:

    for branch in ${RELEASE_BRANCH} devel; do
      git switch ${branch}
      echo "{{{ ${branch}"
      ls config/APT_overlays.d/
      echo "}}}"
    done
Verify that the `$RELEASE_BRANCH` and `devel` branches have nothing in `config/APT_overlays.d/`. If there still are unmerged APT overlays at this point (which is odd), fix the situation by merging them using `bin/merge-APT-overlays`.

Push the modified branches to Git:

    git push origin \
      "${RELEASE_BRANCH:?}:${RELEASE_BRANCH:?}" \
      devel:devel
Update more included files
Changelog
Update the Changelog entry for the release you're preparing:
git checkout "${RELEASE_BRANCH:?}" && \
./bin/update-changelog --version "${MILESTONE:?}"
If you're preparing a release candidate:

    [ "${DIST:?}" = alpha ]

Set the version for the new entry in `debian/changelog` to `$VERSION`: the above code has incorrectly set it to `$MILESTONE`.

If you're preparing a release candidate or a major release:

    [ "${DIST:?}" = alpha ] || [ "${MAJOR_RELEASE:?}" -eq 1 ]

For each merge request listed in the new Changelog entry, copy the "Closes: #NNNNNN" string from the MR description into the relevant section of the Changelog.
In any case
Then, gather other useful information from the diff between the previous version's `.packages` file and the one from the to-be-released images.

Generate the diff:

    if [ "${DIST:?}" = stable ]; then
      NEW_BRANCH="$RELEASE_BRANCH"
    else
      NEW_BRANCH=devel
    fi
    diff --color=never -u \
      "wiki/src/torrents/files/tails-amd64-${PREVIOUS_STABLE_VERSION}.packages" \
      <(curl --silent "https://nightly.tails.boum.org/build_Tails_ISO_${NEW_BRANCH}/lastSuccessful/archive/latest.packages") \
      | wdiff --diff-input --terminal \
      | grep -v --perl-regexp '^[~.:+a-zA-Z_0-9 \t\s-]*$'
In the diff, look for:

- new upstream releases of applications mentioned in features
- new upstream releases of other important components such as the Linux kernel

… and record those changes in `debian/changelog`.
Sanity check the version, commit, and push:
    if [ "$(dpkg-parsechangelog -SVersion)" = "${VERSION:?}" ]; then
      git commit debian/changelog -m "Update changelog for ${VERSION:?}."
    else
      echo "Error: version mismatch: please compare ${VERSION:?} with the last entry in debian/changelog"
    fi && \
    git push origin "${RELEASE_BRANCH:?}"
Let Technical Writers know that the changelog is ready
Open the GitLab issue about the release notes
Notify `@technical-writers` that the changelog is ready, and provide them a link to the latest relevant images built by Jenkins, so they can use those to take screenshots or write documentation:

    (
      echo "Dear @technical-writers,"
      echo "- The changelog entry is ready: https://gitlab.tails.boum.org/tails/tails/-/blob/${RELEASE_BRANCH:?}/debian/changelog"
      echo "- The latest corresponding images built by Jenkins are there: https://nightly.tails.boum.org/build_Tails_ISO_${NEW_BRANCH}/lastSuccessful/archive/build-artifacts"
    ) | pee cat wl-copy
Included website
./bin/prepare-included-website-for-release
Expect this to take 5-15 minutes. While the website is building, if this is not a good time for taking a break, you can follow the "Enable OpenPGP signing" instructions.
Ignored versions for automatic upgrades
Add to `$IGNORED_TAGS`, in `./bin/iuk-source-versions`, any recent alpha or beta
release that gets no security support, and as such, shall not get automatic
upgrades. Here is a useful heuristic: if we did not run our manual test suite on
a release, it does not get security support.
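The effect of `$IGNORED_TAGS` can be sketched as follows (the function name and version numbers are invented for illustration; the real logic lives in `./bin/iuk-source-versions`):

```shell
# Filter ignored tags out of a candidate list of IUK source versions.
# Both lists below are hypothetical examples.
IGNORED_TAGS="5.0-beta1"
iuk_source_versions() {
  for tag in "$@"; do
    case " ${IGNORED_TAGS} " in
      *" ${tag} "*) ;;           # ignored: gets no automatic upgrade path
      *) printf '%s\n' "$tag" ;;
    esac
  done
}
iuk_source_versions 4.36 5.0-beta1 5.0   # prints 4.36 and 5.0
```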
Enable OpenPGP signing
If you have an OpenPGP smart card
If you have an OpenPGP smart card with a Tails signing subkey, go fetch it.
Until you reach "Done with OpenPGP signing":
You may leave it plugged.
You may unplug it, when you are not signing anything with it, if that's more convenient for you.
Otherwise: importing the signing key
This is only relevant when the master key has been reassembled, e.g. for signing a Tails emergency release where none of the usual release managers are available.
You should never import the Tails signing key into your own keyring, and a good practice is to import it to a tmpfs to limit the risks that the private key material is written to disk:
export GNUPGHOME=$(mktemp -d)
sudo mount -t ramfs ramfs "${GNUPGHOME:?}"
sudo chown $(id -u):$(id -g) "${GNUPGHOME:?}"
sudo chmod 0700 "${GNUPGHOME:?}"
gpg --homedir ${HOME:?}/.gnupg --export ${TAILS_SIGNATURE_KEY:?} | gpg --import
gpg --import path/to/private-key
Let's also ensure that strong digest algorithms are used for our signatures, like the defaults we set in Tails:
cp config/chroot_local-includes/etc/skel/.gnupg/gpg.conf "${GNUPGHOME:?}"
Build the almost-final images
Build ISO and USB images from the release branch, with `$TAILS_BUILD_OPTIONS` set like this:

- Set `defaultcomp`, so we can more accurately optimize our SquashFS file ordering.
- Do not set `keeprunning`, nor `ignorechanges`, nor `rescue`.

Expected time: 15-45 minutes depending on your computer.
Keep the resulting build artifacts until the end of this release process.
Record where the manifest of needed packages is stored:

To update the `~/.config/tails/release_management/current.yml` template with newly required variables, run:

    ./bin/rm-config generate-boilerplate --stage built-almost-final

Edit `~/.config/tails/release_management/current.yml` and replace `FIXME`:s:

    "${EDITOR:?}" ~/.config/tails/release_management/current.yml

Generate the resulting environment variables and export them into your environment:

    . $(./bin/rm-config generate-environment --stage built-almost-final)
SquashFS file order
Expected time: 20 minutes
Feel free to skip this section when preparing an emergency release.
Install the almost-final USB image to a USB stick.
Boot this USB stick a first time to trigger re-partitioning.
Shut down this Tails.
If possible, set up a wired connection to avoid having to deal with wireless settings.
Boot this USB stick on bare metal again.
Add `profile login` to the kernel command-line.

Unless you've set up a wired connection, connect to Wi-Fi.
Automatically connect to Tor with Tor Connection.
Wait for Tor Connection to report success.
Start Tor Browser.
Backup the old sort file:
cp config/binary_rootfs/squashfs.sort{,.old}
3 minutes after the start of the GNOME session, the `boot-profile` process will be killed. Retrieve the new sort file from `/var/log/boot-profile`.

Copy the new sort file to `config/binary_rootfs/squashfs.sort`.

To remove:

- runtime-generated files that don't exist in the rootfs, in order to avoid confusing noise in the build output
- the bits about `kill-boot-profile` at the end: they're only useful when profiling the boot
- some hardware-dependent files

… run this command:

    ./bin/clean-squashfs-sort-file config/binary_rootfs/squashfs.sort
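Illustratively, the cleaning boils down to dropping sort entries that match such patterns (the sample entries and filter patterns below are invented; the actual rules are in `bin/clean-squashfs-sort-file`):

```shell
# Invented sample sort entries and filter patterns, for illustration only:
# drop runtime-generated files and the kill-boot-profile bits.
result=$(grep -v -e '^var/log/' -e 'kill-boot-profile' <<'EOF'
usr/bin/tor-browser 100
var/log/boot-profile 90
usr/local/sbin/kill-boot-profile 10
EOF
)
echo "$result"   # prints only the tor-browser line
```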
Inspect the Git diff (including diff stat), apply common sense:
    diff -NaurB \
      <( cut -d' ' -f1 config/binary_rootfs/squashfs.sort.old | sort ) \
      <( cut -d' ' -f1 config/binary_rootfs/squashfs.sort | sort ) \
      | less
Commit the new SquashFS sort file and clean up:

    git commit -m 'Updating SquashFS sort file' \
      config/binary_rootfs/squashfs.sort && \
      rm -f config/binary_rootfs/squashfs.sort.old
Build images
Expected time: 50-95 minutes depending on your computer and the available Jenkins workers.
Sanity check
Verify that the Tor Browser release used in Tails is still the most recent. Also look if there's a new `-buildX` tag (e.g. `tor-browser-60.3.0esr-8.0-1-build1`) for the Firefox version the Tor Browser we want to ship is based on, in these Git repositories:

- tor-browser-build activity (`maint-X.Y` branch)
- tor-browser activity
A new tag may indicate that a new Tor Browser release or rebuild is imminent.
Better catch this before people spend time doing manual tests.
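For reference, such a tag name can be decomposed like this (the pattern is an assumption inferred from the single example above):

```shell
# Extract the Firefox ESR version and rebuild number from a
# tor-browser-build tag name (pattern inferred from the example above).
parse_tb_tag() {
  echo "$1" | sed -E 's/^tor-browser-([0-9.]+esr)-.*-build([0-9]+)$/ESR \1, rebuild \2/'
}
parse_tb_tag tor-browser-60.3.0esr-8.0-1-build1  # prints: ESR 60.3.0esr, rebuild 1
```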
Finalize and tag the release
Mark the version as "released" in the changelog:
    dch --release --no-force-save-on-release --maintmaint && \
      git commit -m "Mark Tails ${VERSION:?} as released." debian/changelog
To update the `~/.config/tails/release_management/current.yml` template with newly required variables, run:

    ./bin/rm-config generate-boilerplate --stage finalized-changelog

Edit `~/.config/tails/release_management/current.yml` and replace `FIXME`:s:

    "${EDITOR:?}" ~/.config/tails/release_management/current.yml

Generate the resulting environment variables and export them into your environment:

    . $(./bin/rm-config generate-environment --stage finalized-changelog)
Tag the release:
    git tag -u "${TAILS_SIGNATURE_KEY:?}" \
      -m "tagging version ${VERSION:?}" "${TAG:?}" && \
      git push origin "${TAG:?}" "${RELEASE_BRANCH:?}"
Prepare the versioned APT suites
Expected time: 15 minutes
Versioned APT suite in our custom APT repository
Within a few minutes after pushing the new release's Git tag, a cronjob creates a new APT suite in our custom APT repository. This new APT suite is called `$TAG` and is initially empty.
Wait for this APT suite to be created and initialize it with the packages currently found in the APT suite corresponding to the branch used to prepare the release:
while ! ssh reprepro@incoming.deb.tails.boum.org reprepro list "${TAG:?}" >/dev/null 2>&1; do
sleep 5
done && \
ssh reprepro@incoming.deb.tails.boum.org \
tails-merge-suite "${RELEASE_BRANCH:?}" "${TAG:?}"
Tagged snapshots of upstream APT repositories
Create tagged snapshots of upstream APT repositories:
./bin/tag-apt-snapshots "${ALMOST_FINAL_BUILD_MANIFEST:?}" "${TAG:?}"
Note:
This command takes a while (about a dozen minutes).
It's expected that the packages that were pulled from our custom APT repository are listed under "some packages were not found anywhere" (because we are currently not using time-based snapshots for our custom APT repository). However, no other package should be on that list. There is also a safety net in case you don't notice such a problem: if other packages are missing, the next build (which will use the newly created partial, tagged APT repository) will fail.
Build the final images
Expected time: 30-75 minutes depending on your computer and the available Jenkins workers.
Build the final images:

    rake build

Do not set `keeprunning` nor `rescue` in `$TAILS_BUILD_OPTIONS`. Our build system will apply the correct compression settings automatically, so don't bother setting them yourself.

If the build fails and you commit some last-minute fix, then you must redo the previous step where you re-tag and force-push to origin!
Manually start a build of the release branch on Jenkins, removing other builds from the Build Queue if needed so it can start immediately. Take note of which job number was used, for later (see `matching_jenkins_images_build_id` below).

Note that Jenkins automatically started a build earlier, when you pushed the tag; but since the versioned APT suite wasn't finished by then, we avoid using that build and rely on the manually started one, to be safe.

Make sure the Jenkins build is right: you can recognize this is the case when you see Jenkins start building the website (which should happen in ~2-3 minutes, at least on dragon). Jenkins will take 20-65 minutes to complete the build, depending on the `isoworkerN` that is used. This could be a good time for a break!
Compare the new build manifest with the one from the previous, almost-final build:
    diff -Naur \
      "${ALMOST_FINAL_BUILD_MANIFEST:?}" \
      "${ARTIFACTS:?}/tails-amd64-${VERSION:?}.build-manifest"

They should be identical, except that the `debian-security` serial might be higher.
Verify that Jenkins reproduced your images
Verify that Jenkins reproduced your images:
(
set -eu
cd "${ARTIFACTS:?}"
url="https://nightly.tails.boum.org/build_Tails_ISO_${RELEASE_BRANCH:?}/lastSuccessful/archive/build-artifacts/tails-build-artifacts.shasum"
curl "$url" > SHA512SUMS-jenkins.txt
grep --perl-regexp '[.]img|iso' SHA512SUMS-jenkins.txt | sha512sum -c -
)
If the ISO and USB image hashes match

Yay, we're good to go! The `.build-manifest` may differ; that's OK.
Then:
To update the `~/.config/tails/release_management/current.yml` template with newly required variables, run:

    ./bin/rm-config generate-boilerplate --stage reproduced-images

Edit `~/.config/tails/release_management/current.yml` and set the `matching_jenkins_images_build_id` value to the ID of this job (an integer):

    "${EDITOR:?}" ~/.config/tails/release_management/current.yml

Generate the resulting environment variables and export them into your environment:

    . $(./bin/rm-config generate-environment --stage reproduced-images)
If there is a hash mismatch
Now we are in a tricky situation: on the one hand it seems like a poor idea to block users from benefiting from this release's security updates, but on the other hand the failure might imply that something nefarious is going on. At this stage, no matter what, immediately fetch Jenkins' image, compare it with yours, and try to rule out a build system compromise:
sudo diffoscope \
--text diffoscope.txt \
--html diffoscope.html \
--max-report-size 262144000 \
--max-diff-block-lines 10000 \
--max-diff-input-lines 10000000 \
path/to/your/tails-amd64-${VERSION:?}.iso \
path/to/jenkins/tails-amd64-${VERSION:?}.iso
Do the same for the USB image as well.
Then carefully investigate the `diffoscope` report:
If you cannot rule out that the difference is harmful: let's take a step back; we might be compromised, so we are in no position to release. Halt the release, involve the rest of foundations@tails.net, and then try to re-establish trust in all build machines and infra involved, etc. Have fun!
Otherwise, if the change is definitely harmless:
If the source of non-determinism is identified quickly and is easy and fast to fix, and the QA of the current images has not gone very far (so at least that time is not wasted), then you should consider abandoning the current version, and immediately start preparing an emergency release with:
the reproducibility fix,
a new changelog entry,
adjustments to the release notes so they are re-purposed for this emergency release (the abandoned release gets none, since it effectively never will be released publicly).
Otherwise, if the fix looks time-consuming or difficult, let's release anyway. But let's add a known issue about "This Tails release does not build reproducibly" to the release notes, linking to the issue where the nature of the reproducibility failure is clearly described.
Initialize the website release branch
The final steps towards publishing the release will be done in a new, dedicated branch.
if [ "${DIST:?}" = alpha ]; then
git checkout -b "${WEBSITE_RELEASE_BRANCH:?}" origin/master && \
git push -u origin "${WEBSITE_RELEASE_BRANCH:?}"
else
git checkout -b "${WEBSITE_RELEASE_BRANCH:?}" "${TAG:?}" && \
git push -u origin "${WEBSITE_RELEASE_BRANCH:?}"
fi
Generate the OpenPGP signatures and Torrents
Run the following command to create a directory with a suitable name, go there, move the built images to this brand new directory, generate detached OpenPGP signatures for the images to be published (in the same directory as the images and with a `.sig` extension), then go up to the parent directory, create a `.torrent` file, and check the generated `.torrent` files' metadata:
./bin/generate-images-signatures-and-torrents
Get the internal manual testing started
Tell testers that they can start testing. Remember to encrypt and sign your email!
./bin/automailer.py <<EOF
To: rm@tails.net, sajolida@pimienta.org
Subject: Testing Tails ${VERSION:?}
X-Attach: $(/bin/ls -1 "$ISOS/tails-amd64-${VERSION:?}/tails-amd64-${VERSION:?}.packages" "$ISOS/tails-amd64-${VERSION:?}/"tails-amd64-${VERSION:?}.{iso,img}.sig "$ISOS/tails-amd64-${VERSION:?}".{iso,img}.torrent | tr '\n' ,)
You can now download the images, verify them using the attached signatures, and start testing:
https://nightly.tails.boum.org/build_Tails_ISO_${RELEASE_BRANCH:?}/builds/${MATCHING_JENKINS_IMAGES_BUILD_ID:?}/archive/build-artifacts/
To verify downloaded data, run:
gpg --verify tails-amd64-${VERSION:?}.img.sig tails-amd64-${VERSION:?}.img
gpg --verify tails-amd64-${VERSION:?}.iso.sig tails-amd64-${VERSION:?}.iso
Please note that the attached Torrents don't work yet.
EOF
Send a message on XMPP, too:
Tails Team, please check your inbox and get the QA started!
Prepare incremental upgrades
Reference: design documentation
Sanity checks
Check that you have the correct version of `squashfs-tools-ng` and `libsquashfs1` installed:
[ "$(dpkg-query --showformat '${Version}\n' --show squashfs-tools-ng)" \
= '1.2.0-1' \
] || echo 'ERROR! Your squashfs-tools-ng is not the required version, so any generated IUKs will *not* be reproducible!'
[ "$(dpkg-query --showformat '${Version}\n' --show libsquashfs1)" \
= '1.2.0-1' \
] || echo 'ERROR! Your libsquashfs1 is not the required version, so any generated IUKs will *not* be reproducible!'
Build the Incremental Upgrade Kits locally
Verify there's enough free disk space in `$IUKS_DIR`:
N_IUKS="$(echo ${IUK_SOURCE_VERSIONS?:} | wc -w)"
MIN_DISK_SPACE=$((N_IUKS * 700))
FREE_DISK_SPACE=$(/bin/df --block-size=M --output=avail "${IUKS_DIR:?}" \
| tail -n1 | sed --regexp-extended 's,M$,,')
if [ "$FREE_DISK_SPACE" -lt "$MIN_DISK_SPACE" ]; then
echo "ERROR! Not enough free space in ${IUKS_DIR:?}"
fi
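The sizing rule used above, factored out as a sketch (the min_space_mb name is ours): each IUK needs roughly 700 MB in $IUKS_DIR, and the same rule is applied further below with 3000 MB per IUK for the scratch space:

```shell
# Compute the minimum free space (in MB) for a space-separated list of
# source versions, at a given per-IUK cost in MB.
min_space_mb () {
    n="$(echo "$1" | wc -w)"
    echo "$((n * $2))"
}

min_space_mb "5.22 6.0 6.1" 700    # 3 IUKs at 700 MB each: prints 2100
```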
To build the IUKS, run this command:
(
IUK_CREATION_LOG=$(mktemp --tmpdir=/var/tmp wrap_tails_create_iuks.XXXXXXXX.log)
(
set -eu
set -o pipefail
WORK_DIR=$(mktemp -d)
TAILS_REMOTE="$(git -C "${RELEASE_CHECKOUT:?}" remote get-url origin)"
PUPPET_TAILS_REMOTE=$(echo -n "${TAILS_REMOTE:?}" | perl -p -E 's,:tails/tails(?:[.]git)?\z,:tails/puppet-tails,')
cd "${WORK_DIR:?}"
git clone "$PUPPET_TAILS_REMOTE"
TMPFILES_CONF=/etc/tmpfiles.d/tails-create-iuks.conf
OUR_TMPDIR=/var/tmp/tails-create-iuks
echo "D ${OUR_TMPDIR} 0755 root root -" | sudo tee "$TMPFILES_CONF"
sudo systemd-tmpfiles --remove --create "$TMPFILES_CONF"
N_IUKS="$(echo ${IUK_SOURCE_VERSIONS:?} | wc -w)"
MIN_DISK_SPACE=$((N_IUKS * 3000))
FREE_DISK_SPACE=$(/bin/df --block-size=M --output=avail "${OUR_TMPDIR:?}" \
| tail -n1 | sed --regexp-extended 's,M$,,')
if [ "$FREE_DISK_SPACE" -lt "$MIN_DISK_SPACE" ]; then
echo "ERROR! Not enough free space in ${OUR_TMPDIR:?}"
exit 2
fi
time \
sudo \
nice \
./puppet-tails/files/jenkins/slaves/isobuilders/wrap_tails_create_iuks \
--tails-git-remote "file://${RELEASE_CHECKOUT:?}/.git" \
--tails-git-commit "${TAG:?}" \
--source-date-epoch "${SOURCE_DATE_EPOCH:?}" \
--local-isos-dir "${ISOS:?}" \
--tmp-dir "${TMPDIR:-${OUR_TMPDIR}}" \
--output-dir "${IUKS_DIR:?}" \
--source-versions "${IUK_SOURCE_VERSIONS:?}" \
--new-version "${VERSION:?}" \
--verbose \
--jobs "$(grep '^core id' /proc/cpuinfo | sort -u | wc -l)" \
2>&1 | tee "$IUK_CREATION_LOG"
) && \
cd "${IUKS_DIR:?}" && \
sha256sum Tails_amd64_*_to_${VERSION:?}.iuk > "${IUKS_HASHES:?}"
)
This command takes a long time. While it is running, you can follow the next steps in parallel:
- ISO history
- Build the Incremental Upgrade Kits on Jenkins (you can start the build, but stop at the rm-config --stage built-iuks step, which has to wait for your local IUK build to finish)
- If you expect you will have to do some manual testing yourself and you have a spare computer, you can prepare a USB stick and start testing.
ISO history
In this section, you will push the released ISO and USB images and their artifacts (the .buildlog, .build-manifest, and .packages files) to our Tails ISO history git-annex repo, so that:
- The Jenkins build_IUKs job can fetch them.
- Our isotesters can fetch them from there for their testing.
Make sure that Jenkins has finished building the images so you have set
MATCHING_JENKINS_IMAGES_BUILD_ID
(see above), then run:
cd "${RELEASE_CHECKOUT:?}" && \
./bin/add-release-to-iso-history \
--version "${VERSION:?}" \
--isos "${ISOS:?}" \
--release-branch "${RELEASE_BRANCH:?}" \
--matching-jenkins-images-build-id "${MATCHING_JENKINS_IMAGES_BUILD_ID:?}"
Then, wait (a few minutes, */15
crontab) until the images appear
on https://iso-history.tails.boum.org/?C=M&O=D:
(
dirurl="https://iso-history.tails.boum.org/tails-amd64-${VERSION:?}/"
isourl="${dirurl}tails-amd64-${VERSION:?}.iso"
isopath="$ISOS/tails-amd64-${VERSION:?}/tails-amd64-${VERSION:?}.iso"
while ! curl --silent --fail --head "$isourl" > /dev/null; do
sleep 10
done
echo "Dir created: ${dirurl}"
expected_size="$(stat -L -c %s "${isopath}")"
while true
do
size=$(curl --silent --head "${isourl}" | grep -iPo '^content-length:\s*\K\d+')
echo "FOUND: ${size} bytes"
if [ "$size" -eq "${expected_size}" ]; then
echo 'FOUND: ready'
notify-send "Image published" 'found in ISO history'
exit 0
fi
sleep 10
done
)
Build the Incremental Upgrade Kits on Jenkins
Make sure the push to ISO history (started in the previous section) has finished, and that images have appeared on the web server: https://iso-history.tails.boum.org/?C=M&O=D
On https://jenkins.tails.boum.org/job/build_IUKs/configure, adjust the SOURCE_VERSION axis to list all versions in $IUK_SOURCE_VERSIONS, and save the updated configuration.
On https://jenkins.tails.boum.org/job/build_IUKs/build?delay=0sec, fill the form with these values:
- TAILS_GIT_COMMIT: the value of $TAG in your release environment
- SOURCE_DATE_EPOCH: the value of $SOURCE_DATE_EPOCH in your release environment, i.e. the same value you are using to "Build the Incremental Upgrade Kits locally"; you can find it in ~/.config/tails/release_management/current.yml
- NEW_VERSION: the value of $VERSION in your release environment
- EXTRA_ARGS: leave it blank
Click the Build button.
After a few seconds, a new build appears on top of the Build History sidebar. Click on the progress bar of this new build.
Record the ID of the Jenkins build_IUKs build:
To update the ~/.config/tails/release_management/current.yml template with newly required variables, run:
./bin/rm-config generate-boilerplate --stage built-iuks
Edit ~/.config/tails/release_management/current.yml and set the candidate_jenkins_iuks_build_id value to the ID of that build_IUKs build (an integer):
"${EDITOR:?}" ~/.config/tails/release_management/current.yml
Generate the resulting environment variables and export them into your environment:
. $(./bin/rm-config generate-environment --stage built-iuks)
Wait until the build_IUKs build completes successfully. It should take about 15 minutes for each member of the $IUK_SOURCE_VERSIONS list, distributed across isoworkerN workers.
If you have a fast computer, you'll be waiting for Jenkins to finish while your CPU sits idle. Consider starting the automated test suite in the meantime: running it does not depend on any of the IUK-related steps.
Verify that Jenkins reproduced your IUKs
Compare the IUKs
Run this command to verify that Jenkins reproduced your IUKs:
"${RELEASE_CHECKOUT:?}"/bin/copy-iuks-to-rsync-server-and-verify \
--hashes-file "${IUKS_HASHES:?}" \
--work-dir /srv/tmp \
--jenkins-build-id "${CANDIDATE_JENKINS_IUKS_BUILD_ID:?}"
The download step takes a few minutes.
If this verification succeeds, good!
Otherwise, see the next sections.
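The comparison performed by this script boils down to checking Jenkins' IUKs against the SHA-256 hashes recorded at the end of the local build. A self-contained demonstration of that round trip, with a throwaway file standing in for an IUK (the 6.0/6.1 version numbers are illustrative):

```shell
# Create a fake IUK, record its hash the way the local build does,
# then verify against the recorded hashes file.
iukdemo_dir="$(mktemp -d)"
cd "$iukdemo_dir"
echo 'IUK payload' > Tails_amd64_6.0_to_6.1.iuk
sha256sum Tails_amd64_*_to_6.1.iuk > hashes
sha256sum --check hashes    # prints: Tails_amd64_6.0_to_6.1.iuk: OK
```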
If it fails with urllib.error.HTTPError: HTTP Error 404: Not Found
This problem can be expected when we are maintaining Tails N.x
together with Tails (N+1).y
.
If this is the case:
- Run the same command, adding the --ignore-404 option.
- In the output of this second attempt, verify that the only missing IUKs are not relevant to this release.
If IUK verification fails
If the last command failed:
Visit this page: https://jenkins.tails.boum.org/job/build_IUKs/lastBuild/parameters/
It tells you which parameters you've passed to the upstream Jenkins job, which spawned the many workers. Make sure to use the build history browser to find the right build_IUKs job in case there were several attempts; the URL above assumes the last build is the one you're interested in (see the lastBuild part).
Double-check that you've passed the correct parameters.
If you notice you made a mistake, build the IUKs on Jenkins again, and do the verification again.
Else, if the parameters were correct, follow the next steps:
- Leave the IUKs built by Jenkins where they already are (in rsync.lizard:/srv/tmp): at least in some cases, this will speed up uploading your own IUKs later by a factor of 500+, thanks to rsync --partial --inplace.
- File an issue about this problem.
Specify:
Which set of parameters you've passed to the build_IUKs
job, so that the person who'll investigate the problem can reproduce it.
The ID of the build that failed to reproduce your locally-built IUKs.
Set:
/label ~"To Do" ~"Core Work:Foundations Team" ~"C:Continuous Integration" ~"C:Upgrader" ~"C:Release management"
- milestone: the next release
Later on, after you're done with OpenPGP signing, you will upload the IUKs you've built locally.
Prepare upgrade-description files
Prepare upgrade-description files (see the upgrade-description files specification for details).
In most cases, the example command below is exactly the one you should run. But in order to tell whether you're in one of the exceptional cases where you have to adjust that command, it's important that you understand what follows.
At this step, we use the tails-iuk-generate-upgrade-description-files tool in order to:
- Create a new upgrade-description for the version being released and for the next one, expressing that no upgrade is available for these yet. This is what the --version and --next_version arguments in the example command below do. You do not need to modify them.
- For every recent previous release that's not listed in $IUK_SOURCE_VERSIONS: tell the users of that old version that they need to manually upgrade to the version being released. The example command below only does this automatically to notify users of N.x that they should upgrade to (N+1).y. In other cases, you need to do this manually: pass such a previous version to --previous_version.
Run this command, after adjusting it if needed as explained above:
${RELEASE_CHECKOUT:?}/config/chroot_local-includes/usr/src/iuk/bin/tails-iuk-generate-upgrade-description-files \
   --version "${VERSION:?}" \
   $( \
      if [ "${DIST:?}" = stable ]; then echo \
         --next_version "${NEXT_PLANNED_MAJOR_VERSION:?}" \
         --next_version "${NEXT_PLANNED_MAJOR_VERSION:?}~rc1" \
         --next_version "${NEXT_PLANNED_BUGFIX_VERSION:?}" \
         --next_version "${NEXT_POTENTIAL_EMERGENCY_VERSION:?}"
      else echo \
         --next_version "${NEXT_PLANNED_MAJOR_VERSION:?}" \
         --next_version "${SECOND_NEXT_PLANNED_MAJOR_VERSION:?}"
      fi \
   ) \
   $( \
      for version in $(echo ${IUK_SOURCE_VERSIONS:?}); do
         echo "--previous_version ${version:?}"
      done \
   ) \
   $( \
      find ${RELEASE_CHECKOUT:?}/wiki/src/upgrade/v2/Tails \
           -type d \
           -name "$((${VERSION%%.*}-1)).*" \
           -printf '--previous_version %f ' \
   ) \
   --iso "${ISO_PATH:?}" \
   --iuks "${IUKS_DIR:?}" \
   --release_checkout "${RELEASE_CHECKOUT:?}" \
   --major_release "${MAJOR_RELEASE:?}" \
   --channel "${DIST:?}"
Create an armoured detached signature for each created or modified upgrade-description file.
./bin/sign-updated-udfs
Verify that all UDFs are correctly signed:
./bin/check-udfs-signature
If this isn't the case, check what went wrong in the signing step!
Add and commit the upgrade-description files and their detached signatures to the Git branch used to prepare the release ($WEBSITE_RELEASE_BRANCH):
cd "${RELEASE_CHECKOUT:?}" && \
   git add wiki/src/upgrade && \
   git commit -m "Update upgrade-description files." && \
   git push origin ${WEBSITE_RELEASE_BRANCH:?}
Copy the generated UDFs for the previous stable release to the test channel in $MASTER_CHECKOUT, modify their content accordingly, sign them, commit, and push:
./bin/publish-test-udfs
Prepare the image description file for the download verification
If preparing a RC, skip this part:
[ "${DIST:?}" = stable ]
Update the image description file (IDF) used by the verification JavaScript:
cd "${RELEASE_CHECKOUT:?}" && \
./bin/idf-content \
--version "${VERSION:?}" \
--iso "${ISO_PATH:?}" \
--img "${IMG_PATH:?}" \
> wiki/src/install/v2/Tails/amd64/stable/latest.json && \
git add wiki/src/install/v2/Tails/amd64/stable/latest.json && \
git commit -m "Update IDF for download verification" && \
git push origin "${WEBSITE_RELEASE_BRANCH:?}"
Check that incremental upgrades are working
Check if the upgrades file looks good:
(
set -e
cd "${MASTER_CHECKOUT:?}"
for version in ${TEST_IUK_SOURCE_VERSIONS:?}; do
echo "Upgrade from ${version}:"
udf="wiki/src/upgrade/v2/Tails/${version}/amd64/test/upgrades.yml"
echo -n "- IUK: "
grep --quiet --fixed-strings "to_${VERSION}.iuk" "$udf" && echo ok || echo missing
echo -n "- ISO: "
grep --quiet --fixed-strings -- "-${VERSION}.iso" "$udf" && echo ok || echo missing
done
)
The above output should not mention any missing IUKs/ISOs, except when releasing a new major version (N+1).0 (i.e. when Tails is upgraded to a new Debian major release): in that case it is expected that the previous stable N.x release has no IUK (but it should still have an ISO!).
Done with OpenPGP signing
Unplug your OpenPGP smartcard and store it away, so you don't plug it back semi-mechanically later on.
Beware! If you have to plug your OpenPGP smart card or reassemble the key again after this point, it invalidates everything done for the reproduction of this release, so the process has to be restarted from the beginning:
- the original text is restored on the pad, and
- some tester follows it from scratch, and
- the Trusted Reproducer awaits the new input from said tester and then starts from scratch.
So please try to avoid this!
If you don't know, or don't remember, why plugging the smart card again would invalidate all that work: you're not alone. We've all been there. We will document this: #20617.
Upload images
Sanity check
Verify once more that the Tor Browser we ship is still the most recent (see above).
Delete obsolete IUKs
Skip this section if you're preparing a non-final release (beta, RC):
[ "${DIST:?}" = stable ]
Delete the IUKs that upgrade to any beta or RC corresponding to the version you are preparing:
first check that it's not going to remove anything we want to keep:
ssh rsync.lizard /bin/sh -c \
   \"find /srv/rsync/tails/tails/alpha/iuk/v2 \
         /srv/rsync/tails/tails/stable/iuk/v2 \
         -type f \
         -name "*_to_${VERSION:?}~*.iuk" \
         -not -name '*~test_*~test.iuk' \
         -not -name '*~testoverlayfs_*~testoverlayfs.iuk' \
         -not -name '*~testoverlayfsng_*~testoverlayfsng.iuk' \
         -ls \
   \"
then actually delete the files:
ssh rsync.lizard /bin/sh -c \
   \"find /srv/rsync/tails/tails/alpha/iuk/v2 \
         /srv/rsync/tails/tails/stable/iuk/v2 \
         -type f \
         -name "*_to_${VERSION:?}~*.iuk" \
         -not -name '*~test_*~test.iuk' \
         -not -name '*~testoverlayfs_*~testoverlayfs.iuk' \
         -not -name '*~testoverlayfsng_*~testoverlayfsng.iuk' \
         -print -delete \
   \"
Rationale: avoid the need for mirrors to store 3 concurrent sets of IUKs (#17944). It's OK to delete them now, because we're past the deadline we've announced in the call for testing.
Publish the ISO, IMG, and IUKs over HTTP
If the IUKs reproducibility check you did earlier has failed, then upload the IUKs you've built to our rsync server (we trust your machine more than our Jenkins):
for source_version in $(echo ${IUK_SOURCE_VERSIONS:?}); do
rsync --partial --inplace --progress -v \
"${IUKS_DIR:?}/Tails_amd64_${source_version:?}_to_${VERSION:?}.iuk" \
rsync.lizard:/srv/tmp/
done
While waiting for the IUKs to be uploaded, you can proceed with the next steps.
Upload the ISO and USB image signatures to our rsync server:
scp "${ISO_PATH:?}.sig" "${IMG_PATH:?}.sig" rsync.lizard:
Copy the ISO and USB images to our rsync server, verify their signature, move them in place with proper ownership and permissions:
cd "${RELEASE_CHECKOUT:?}" && \
./bin/copy-images-to-rsync-server-and-verify \
--version "${VERSION:?}" \
--dist "${DIST:?}" \
--release-branch "${RELEASE_BRANCH:?}" \
--matching-jenkins-images-build-id "${MATCHING_JENKINS_IMAGES_BUILD_ID:?}"
Update the time in project/trace
file on our rsync server
and on the live website (even for a release candidate):
./bin/update-trace-time \
"Updating trace file after uploading the ISO and USB images for ${VERSION:?}."
At this stage, either IUKs were reproduced by Jenkins and left on the rsync server in a temporary location, or you must wait until the upload of the locally built IUKs (started in the first part of this section) is completed.
Finally, move the IUKs in place with proper
ownership and permissions and update the time in project/trace
file
on our rsync server and on the live website (even for a release
candidate):
./bin/publish-iuks
As tracked in #19396, Mirrorbits may well miss this update. Make sure it picks it up by running:
ssh rsync.lizard "sudo cat /etc/mirrorbits-rpc-password && echo" \
| ssh rsync.lizard mirrorbits -a refresh -rehash
If, at a later stage of the process, you realize Mirrorbits is still not "noticing" that mirrors are up-to-date, don't be afraid of refreshing/rehashing again, or even restart the whole service (which also triggers a rehash):
ssh rsync.lizard sudo systemctl restart mirrorbits.service
Announce, seed and test the Torrents
Check if there's enough space on our Bittorrent seed to import the new ISO and USB images:
ssh bittorrent.lizard df -h /var/lib/transmission-daemon/downloads
If not, list already running Torrents:
ssh bittorrent.lizard transmission-remote --list
… set $ID
to the oldest one and delete it (do this both for the ISO and USB images):
ssh bittorrent.lizard transmission-remote -t "${ID:?}" --remove-and-delete
… and finally check disk space again:
ssh bittorrent.lizard df -h /var/lib/transmission-daemon/downloads
Now you can announce and seed the Torrents for the release you're preparing:
./bin/announce-and-seed-torrents
Test that you can start downloading the ISO and USB images with a BitTorrent client. It can take a few dozen minutes before this works.
Manual testers can now do the "Torrents" part of the manual test suite, so delete "⚠ Wait until the Release Manager […]" in that section on the testing pad.
Run the automated test suite
This is not a required step: check the manual testers' pad to see if anyone volunteered for it. But if no one did, start the automated test suite. The sooner, the better.
You probably already know how to do that, but keep this in mind:
- you need to use the --old-iso option.
- you probably want to specify --tmpdir to point to a non-volatile directory. The default lives in /tmp/, which might lead to unpleasant surprises if, for whatever reason, you need to turn off your computer.
- if you are running this and going AFK for a long time (e.g. overnight), you might want to keep your CPU busy by rerunning failed scenarios. See the "Automatically re-run failures" section.
- if you are putting out an emergency release (or otherwise feel the need to make the release as soon as possible for security reasons), and you are not running the tests overnight, so that skipping tests will reduce the lead time, then feel free to skip the tests tagged @not_release_blocker by setting what=(--tag ~@not_release_blocker) in the following example.
Automatically re-run failures
The file test-rerun.sh will help you rerun the test suite automatically, which avoids wasting CPU.
cp bin/test-rerun.sh ../
Tweak it: the tmpdir
might especially need some customization based on your directory structure.
export VERSION ISOS PREVIOUS_STABLE_VERSION
../test-rerun.sh
Public testing
If you're publishing a RC
Using check-mirrors, choose a fast mirror that already has the tentative ISO and USB images. E.g. https://mirrors.edge.kernel.org/tails/ or https://mirrors.wikimedia.org/tails/ are reliable and have plenty of bandwidth:
(
   success=false
   while true; do
      for mirror in mirrors.wikimedia.org mirrors.edge.kernel.org; do
         url="https://$mirror/tails/"
         if [[ -z "$(./check-mirrors.rb --allow-multiple --fast --channel ${DIST:?} \
                        --url-prefix "$url" \
                        tails-amd64-${VERSION:?})" ]]
         then
            echo "WORKING: $url"
            success=true
         fi
      done
      $success && exit 0
      sleep 1m
   done
)
Email tails-testers@boum.org to ask them to test the tentative images:
- Point them to the up-to-date mirror you've found previously.
- This is a public list, don't point to the pad.
- Make a deadline clear: until when is feedback useful?
[ "${DIST:?}" = alpha ] && ./bin/automailer.py <<EOF
To: tails-testers@boum.org
Subject: Call for testing: ${VERSION?}
Hi,
The *candidate* images for Tails ${VERSION} are ready for testing:
https://download.tails.net/tails/stable/tails-amd64-${VERSION}/tails-amd64-${VERSION}.img
They did not go through our entire QA yet.
Please test and report back by Tuesday, 09:00 UTC.
Thanks!
EOF
Lead the internal manual testing to conclusion
Make sure that enough people are here to run the tests, that they report their results in due time, and that they make it clear when they're leaving for good.
Ensure the "Incremental upgrades" part of the manual test suite is done:
Wait until the IUKs are available:
for old in ${TEST_IUK_SOURCE_VERSIONS:?}; do
   while ! curl --silent --head --location --fail \
         "https://download.tails.net/tails/stable/iuk/v2/Tails_amd64_${old}_to_${VERSION:?}.iuk"
   do
      sleep 30
   done
done
Delete "⚠ Wait until the Release Manager […]" in the "Incremental upgrades" section on the pad. Replace "⚠ PLACEHOLDER" in that section with the versions to run the test from (i.e. the versions that UDFs were added for when running publish-test-udfs earlier):
echo ${TEST_IUK_SOURCE_VERSIONS:?}
Fill the holes and make sure that the automated and manual test suites are done in due time.
If you were the one running the automated test suite, check its results.
Triage test results (except for the reproducibility test), reproduce bugs as needed, decide what the next step is and make sure it happens: add to known issues? file an issue? release blocker? improve the test description (steps, expected outcome)?
Notify the Trusted Reproducer
Send an email to the TR telling them they can now start the Reproducibility test.
Note that the Trusted Reproducer has 72 hours to complete this check, you do not have to wait for this to be completed, just make sure they are informed that the Reproducibility check can be started.
./bin/automailer.py <<EOF
Subject: Trusted Reproducer for ${VERSION:?}
You can now start the reproducibility test.
EOF
Prepare announcements and blog posts
What follows in this section happens on the $WEBSITE_RELEASE_BRANCH
branch in ${RELEASE_CHECKOUT:?}
, making sure we catch up with the
technical writers who have likely prepared release notes:
cd "${RELEASE_CHECKOUT:?}" && \
git checkout "${WEBSITE_RELEASE_BRANCH:?}" && \
git fetch && \
git merge "origin/${WEBSITE_RELEASE_BRANCH:?}"
If no release notes were prepared, create them with placeholders:
cp wiki/src/news/version_${PREVIOUS_STABLE_VERSION:?}.mdwn \
wiki/src/news/version_${VERSION:?}.mdwn &&
${EDITOR:?} wiki/src/news/version_${VERSION:?}.mdwn
Make sure to:
- adjust the !meta date at the top.
- adjust all version numbers everywhere appropriately.
- add placeholders to the "Changes and updates" and "Fixed problems" sections, e.g. "This section will be updated soon. Sorry for the inconvenience!".
- add any known issues to the "Known issues" section.
The technical writers will update the release notes after the release.
If preparing a final release
Skip this part if preparing a RC.
[ "${DIST:?}" = stable ]
Rename, copy, garbage collect and update various files:
./bin/add-release-files-to-website
Then, build the website and commit this last set of changes:
if [ "$(./bin/po4a-version)" = 0.62 ]; then
./build-website && \
git add wiki/src/{inc,torrents} && \
git commit -m "Update various website source files for ${VERSION:?}"
else
echo "Wrong po4a version"
fi
Ensure our technical writer has
written the
announcement for the release in wiki/src/news/version_${TAG:?}.mdwn
.
Generate an announcement mentioning the security vulnerabilities affecting the previous version
in wiki/src/security/known_security_vulnerabilities_in_${PREVIOUS_STABLE_VERSION:?}.mdwn
, in
order to let the users of the old versions know that they have to upgrade:
./bin/generate-security-advisory \
--previous-version "${PREVIOUS_STABLE_VERSION:?}" \
--version "${VERSION:?}" \
--tag "${TAG:?}" \
> "wiki/src/security/known_security_vulnerabilities_in_${PREVIOUS_STABLE_VERSION:?}.mdwn"
Remove obsolete bits from wiki/src/home/testing.html
. For example,
remove any text that's specific to a release candidate we may have
published for the version you are preparing.
If preparing a RC
Skip this part if preparing a final release:
[ "${DIST:?}" = alpha ]
Copy the signatures and the Torrents into the website repository:
cp "${ISO_PATH:?}.sig" \
"${IMG_PATH:?}.sig" \
"${ISOS:?}/tails-amd64-${VERSION:?}".{iso,img}.torrent \
"${RELEASE_CHECKOUT:?}/wiki/src/torrents/files/"
If our technical writer did not prepare the announcement for the release, write it yourself:
Set this environment variable, in the format YYYY-MM-DD:
- FINAL_RELEASE_DATE: the date when we plan to publish the final release you're preparing a RC for
Generate the boilerplate contents from the template:
./bin/generate-call-for-testing \
   --version "${VERSION:?}" \
   --tag "${TAG:?}" \
   --date "${RELEASE_DATE:?}" \
   --final-version "${NEXT_PLANNED_MAJOR_VERSION:?}" \
   --final-date "${FINAL_RELEASE_DATE:?}" \
   > ${RELEASE_CHECKOUT:?}/wiki/src/news/test_${TAG:?}.mdwn
Edit ${RELEASE_CHECKOUT:?}/wiki/src/news/test_${TAG:?}.mdwn:
- Mention major user-visible improvements.
- Document important config changes that users of the Persistent Storage have to do themselves.
- Document known issues.
- Look for FIXME.
In wiki/src/home/testing.html
, in the "Current calls for testing" section,
add a bullet point with a pointer to
the call for testing for this release candidate
(https://tails.net/news/test_$TAG/
, that you just wrote).
In any case
Generate PO files for the pages added or updated earlier in this
section, and publish them on the $WEBSITE_RELEASE_BRANCH
branch:
if [ "$(./bin/po4a-version)" = 0.62 ]; then
./build-website && \
git add wiki/src && \
git commit -m "Releasing version ${VERSION:?}" && \
git push origin "${WEBSITE_RELEASE_BRANCH:?}"
else
echo "Wrong po4a version"
fi
Draft the Tor blog post
Skip this if you are not preparing a stable release.
Generate the draft blog post:
./bin/generate-Tor-blog-post "${VERSION:?}" "${TAG:?}" "${DIST:?}" "${RELEASE_DATE:?}"
Copy the introduction sentences of the body into the summary section of the draft blog post. If there is no such sentence, this can be enough:
Tails ${VERSION:?} is out.
Proof-read the draft blog post
Prepare a draft merge request against the Tor blog
- Use your personal account on their GitLab
- Follow the Tor documentation. It's OK to create the MR on the command line instead of in the GitLab Web IDE.
- Take inspiration from previous MR
- Reuse our existing lead image:
- cd into the directory you created for the new blog post
- Create a symlink to the existing lead image:
ln -s ../../../assets/static/images/blog/tails.jpeg lead.jpg && \
   git add lead.jpg
Go wild!
Wait for the HTTP mirrors to catch up
Look at the mirror status.
(
base='https://download.tails.net/tails/stable'
echo \
"${base}/iuk/v2/Tails_amd64_${PREVIOUS_STABLE_VERSION:?}_to_${VERSION:?}.iuk?mirrorlist" \
"${base}/tails-amd64-${VERSION:?}/tails-amd64-${VERSION:?}.iso?mirrorlist" \
"${base}/tails-amd64-${VERSION:?}/tails-amd64-${VERSION:?}.img?mirrorlist" \
)
Wait for each of these pages to include:
- at least 6 working mirrors
- at least one of the fallback mirrors: https://download.tails.net/tails/?mirrorlist
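A hedged sketch of how you could script the first condition instead of counting by eye: extract the distinct mirror URLs from a ?mirrorlist page and count them. The HTML pattern assumed here (one https:// link per mirror row) is an assumption; adjust it to the real page markup:

```shell
# Count distinct https:// links on standard input.
count_mirrors () {
    grep --only-matching --extended-regexp 'https://[^"<[:space:]]+' \
        | sort --unique | wc --lines
}

# Demonstration on two fake mirror rows (in real use, pipe in
# `curl --silent "$url"` for each ?mirrorlist URL above and check
# that the count is at least 6):
printf '%s\n' \
    '<a href="https://mirror-a.example/tails/x.iso">' \
    '<a href="https://mirror-b.example/tails/x.iso">' \
    | count_mirrors    # prints: 2
```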
To disable mirrors that are lagging behind:
- Set their weight to 0 in the mirrors pool.
- Push to the tails/mirror-pool master branch.
- Verify that the CI pipeline passed on https://gitlab.tails.boum.org/tails/mirror-pool/-/pipelines.
- Notify sysadmins@tails.net, so they communicate with the operators of the outdated mirrors.
Sanity checks
Check for any feedback in the Internal testing pad.
Check for any feedback on tails-testers@boum.org
Verify once more that the Tor Browser we ship is still the most recent (see above).
Push
Push the last commits to our Git repository and put master
in the
following state:
( cd "${RELEASE_CHECKOUT:?}" && \
git push origin \
"${WEBSITE_RELEASE_BRANCH:?}:${WEBSITE_RELEASE_BRANCH:?}" \
devel:devel \
) && \
(
set -eu
cd "${MASTER_CHECKOUT:?}"
git fetch
git merge origin/master
git merge --edit "origin/${WEBSITE_RELEASE_BRANCH:?}"
git submodule update --init
echo "stable" > config/base_branch
git diff-index --quiet HEAD config/base_branch || \
git commit config/base_branch \
-m "Restore master's base branch." \
)
Close the "Write release notes" ticket manually since
Closes:
statements are not sufficient when updating master
.
Finally, push the master
branch to make the changes go live on our
website:
( cd "${MASTER_CHECKOUT:?}" && \
git push origin master:master \
)
Wait for the release to be fully published, that is, for the website to have been fully refreshed. To do so, first wait for the [[!tails_gitlab desc="GitLab CI pipeline" tails/tails/-/pipelines/]] triggered by your last push to build and deploy the website. And then:
If you're publishing a final release
Check:
[ "${DIST:?}" = stable ]
Wait until all these conditions are met:
- https://tails.net/news/ includes the release notes
- https://tails.net/news/index.de.html includes the release notes
- https://tails.net/torrents/files/ includes the files for the new release
If you're publishing a RC
Check:
[ "${DIST:?}" = alpha ]
wait until https://tails.net/news/test_$TAG/
is live.
Bug tracker [if preparing a final release]
Skip this part if preparing a release candidate:
[ "${DIST:?}" = stable ]
Ensure you have an up-to-date tails:gitlab-triage-stable container image:
cd "${MASTER_CHECKOUT:?}" && \
   git checkout master && \
   ./config/gitlab-triage/bin/ensure-up-to-date-container-image
Postpone to the next scheduled release any remaining open issue and merge request whose milestone is the version you've just released:
cd "${MASTER_CHECKOUT:?}" && \
   git checkout master && \
   ./bin/gitlab-triage-post-release \
      --host-url "$(bin/gitlab-url TailsRM)" \
      --token "$(bin/gitlab-api-token TailsRM)"
Finally, submit an MR against tails/gitlab-config to close the just-released milestone. See tails/gitlab-config!18 as an example.
A cheap way to do this is to use the GitLab web interface to create the MR.
Assign the MR to the on-duty sysadmin (see the link to "Semi-public calendar" in https://gitlab.tails.boum.org/tails/summit/-/wikis/Teams#sysadmins).
Issues linked from the website
Go through the issues linked from the documentation and support sections of the website to identify the issues that might be resolved in this release.
cd "${MASTER_CHECKOUT:?}" && \
find wiki/src/{doc,support} ! -path wiki/src/support/known_issues/graphics.mdwn \
     -name "*.mdwn" -o -name "*.html" \
   | xargs cat \
   | ruby -e 'puts STDIN.read.scan(/\[\[!tails_ticket\s+(\d+)[^\]]*\]\]/)' \
   | while read ticket; do
        ticket_status="$(python-gitlab --gitlab TailsRM -o json --fields state \
                            project-issue get --project-id 2 --iid "$ticket" \
                         | jq --raw-output .state)"
        if [ -z "${ticket_status:-}" ]; then
           echo "Failed to find the status of #${ticket:?}" >&2
           continue
        fi
        if [ "${ticket_status:?}" = "closed" ]; then
           echo "It seems issue #${ticket:?} has been fixed. Please find all instances on the website and adjust them as needed. Ticket URL: https://gitlab.tails.boum.org/tails/tails/-/issues/${ticket:?}"
        fi
     done
Create a GitLab issue to list them:
- Title: Issues linked from the website for Tails $VERSION
- Labels: Core Work:Technical writing and To Do
Check the comments of the release notes issue to see if the technical writers have prepared a tweet. Otherwise, tweet a simple link to the release notes:
echo "Tails ${VERSION:?} is out: https://tails.net/news/version_${TAG:?}/"
To log in:
Get the username and password:
keyringer tails-rm decrypt credentials.asc | grep twitter.com
Get the OTP code:
oathtool -b --totp \
   "$(keyringer tails decrypt credentials.asc \
      | grep 'twitter OTP code' \
      | grep -Po 'otpauth://totp/.*[?]secret=\K[A-Z0-9]+')"
Request the publication of the Tor blog post
Skip this section unless you have prepared a draft Tor blog post earlier.
Check if you have developer status on https://gitlab.torproject.org/tpo/web/blog/-/project_members
If you have developer status
To inspect the result, see the pipeline of the MR, in particular the deploy-review job. At the top of the job page, a message says "This job is deployed to ". Click on the link to the environment, then click on "View deployment" and check that everything looks good.
Merge!
Once merged, a new pipeline starts whose last stage (Deploy) requires manual action: click on it to open a menu where you will see "deploy-prod ▶️", then click on the ▶️ to deploy.
Else
Mark the merge request as ready by removing the Draft: prefix, and request a review from one of your fellow RMs who has Developer status.
Amnesia news
Log in and accept the release announcement (it's been automatically sent to amnesia-news@ every half hour and is stuck in the moderation queue). The password is in keyringer, in rm.git.
Prepare for the next development cycle
If you just released a final release
Check that you are indeed preparing a final release:
[ "${DIST:?}" = stable ]
If you just released a new stable release, remove the previous stable release from:
- our rsync server:
ssh rsync.lizard sudo rm -rf /srv/rsync/tails/tails/stable/tails-amd64-${PREVIOUS_STABLE_VERSION:?}/
- our BitTorrent seed: get the previous release's Transmission IDs (ISO and USB image, use comma-separated values) with:
ssh bittorrent.lizard transmission-remote --list | grep --fixed-strings "tails-amd64-${PREVIOUS_VERSION:?}-i"
Then delete them (PREVIOUS_VERSION_TRANSMISSION_IDS must be a comma-separated list of IDs, based on the previous command):
ssh bittorrent.lizard transmission-remote \
    -t "${PREVIOUS_VERSION_TRANSMISSION_IDS:?}" --remove-and-delete
Finally, check that everything looks good:
ssh bittorrent.lizard transmission-remote --list
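If you prefer not to copy the IDs by hand, the extraction can be scripted. Here is a hypothetical sketch; the column layout of `transmission-remote --list` below is an assumption and may differ across Transmission versions, so double-check the resulting IDs before deleting anything:

```shell
# Made-up sample of `transmission-remote --list` output; the real column
# layout may differ, so verify before trusting the extracted IDs.
sample_list='ID   Done  Have    ETA   Up   Down  Ratio  Status  Name
   3  100%  1.2 GB  Done  0.0  0.0   4.2    Idle    tails-amd64-6.0.img
   7  100%  1.1 GB  Done  0.0  0.0   3.9    Idle    tails-amd64-6.0.iso
   9  100%  1.3 GB  Done  0.0  0.0   5.0    Idle    tails-amd64-6.1.img'
previous_version=6.0

# Keep the lines for the previous release, take the first column (the
# Transmission ID), and join with commas.
PREVIOUS_VERSION_TRANSMISSION_IDS=$(printf '%s\n' "$sample_list" \
    | grep --fixed-strings "tails-amd64-${previous_version:?}" \
    | awk '{print $1}' \
    | paste -sd, -)
echo "$PREVIOUS_VERSION_TRANSMISSION_IDS"   # → 3,7
```

The resulting comma-separated list is exactly the format `transmission-remote -t` expects.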
- our rsync server: remove any remaining RC for the just-published release from rsync.lizard:/srv/rsync/tails/tails/alpha/. Check if there is anything:
ssh rsync.lizard find /srv/rsync/tails/tails/alpha/ -type f
Decide what to do!
If you've published a final release, remove IUKs that upgrade to an older version as they were superseded by this release:
first check that it's not going to remove anything we want to keep:
ssh rsync.lizard /bin/sh -c \
    \"find /srv/rsync/tails/tails/alpha/iuk/v2 \
           /srv/rsync/tails/tails/stable/iuk/v2 \
           -type f -name '*.iuk' \
           -not -name "*_to_${VERSION:?}.iuk" \
           -not -name '*~test_*~test.iuk' \
           -not -name '*~testoverlayfs_*~testoverlayfs.iuk' \
           -not -name '*~testoverlayfsng_*~testoverlayfsng.iuk' \
           -ls \
    \"
then actually delete the files:
ssh rsync.lizard /bin/sh -c \
    \"find /srv/rsync/tails/tails/alpha/iuk/v2 \
           /srv/rsync/tails/tails/stable/iuk/v2 \
           -type f -name '*.iuk' \
           -not -name "*_to_${VERSION:?}.iuk" \
           -not -name '*~test_*~test.iuk' \
           -not -name '*~testoverlayfs_*~testoverlayfs.iuk' \
           -not -name '*~testoverlayfsng_*~testoverlayfsng.iuk' \
           -print -delete \
    \"
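To convince yourself that the `-not -name` filters select exactly what you expect, you can dry-run them against dummy files in a scratch directory (the file names below are made up for illustration):

```shell
# Scratch directory with made-up IUK names: one upgrading to the new
# version, one upgrading to the superseded version, one test artifact.
tmp=$(mktemp -d)
new_version=6.1
touch "${tmp}/Tails_amd64_6.0_to_${new_version}.iuk" \
      "${tmp}/Tails_amd64_5.22_to_6.0.iuk" \
      "${tmp}/Tails_amd64_0~test_to_0~test.iuk"

# Same kind of filters as above: keep IUKs upgrading to the new version
# and the test artifacts, select everything else for deletion.
deleted=$(find "${tmp}" -type f -name '*.iuk' \
    -not -name "*_to_${new_version:?}.iuk" \
    -not -name '*~test_*~test.iuk')
echo "$deleted"
rm -rf "${tmp}"
```

Only `Tails_amd64_5.22_to_6.0.iuk` is selected: it upgrades to the superseded release, not to the new one.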
Check how much space our mirrors need:
ssh rsync.lizard du -sh /srv/rsync/tails
Compare it to the minimum disk space we ask of our mirror operators (80 GiB) and determine whether any further action is needed to either reduce our usage by deleting stuff, or ask them to give us more space.
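This comparison can be scripted. A minimal sketch, assuming the 80 GiB quota and an arbitrary 5 GiB warning margin (the margin is illustrative, not project policy):

```shell
# Hypothetical helper: warn when usage gets within 5 GiB of the 80 GiB
# we ask mirror operators for. Thresholds are illustrative.
check_mirror_headroom() {
    usage_gib=${1:?}
    quota_gib=80
    if [ "$usage_gib" -ge $((quota_gib - 5)) ]; then
        echo "low headroom: ${usage_gib}/${quota_gib} GiB used"
        return 1
    fi
    echo "ok: ${usage_gib}/${quota_gib} GiB used"
}

# Feed it the number reported by du, e.g.:
#   check_mirror_headroom "$(ssh rsync.lizard du -s --block-size=1GiB /srv/rsync/tails | cut -f1)"
check_mirror_headroom 60   # → ok: 60/80 GiB used
```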
Delete Git branches that were merged:
cd "${MASTER_CHECKOUT:?}" && \
git checkout master && \
git fetch && \
git merge origin/master && \
git submodule update --init && \
bare_repo=$(mktemp -d) && \
git clone --bare --reference "${MASTER_CHECKOUT:?}" \
    git@gitlab-ssh.tails.boum.org:tails/tails \
    "${bare_repo:?}" && \
PYTHONPATH=lib/python3 ./bin/delete-merged-git-branches \
    --repo "${bare_repo:?}" && \
rm -rf "${bare_repo:?}"
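`delete-merged-git-branches` is a Tails-specific helper; the underlying Git notion of a "merged" branch can be illustrated with plain Git in a throwaway repository:

```shell
# Throwaway repository: two feature branches, one fully merged into the
# default branch, one with an extra commit.
repo=$(mktemp -d)
git -C "$repo" init --quiet
default=$(git -C "$repo" symbolic-ref --short HEAD)
git -C "$repo" -c user.email=rm@example.org -c user.name=RM \
    commit --quiet --allow-empty -m 'initial commit'
git -C "$repo" branch feature/done            # points at the default tip: merged
git -C "$repo" branch feature/in-progress
git -C "$repo" checkout --quiet feature/in-progress
git -C "$repo" -c user.email=rm@example.org -c user.name=RM \
    commit --quiet --allow-empty -m 'work in progress'

# List feature branches whose commits are all reachable from the default
# branch, i.e. the ones a cleanup script could safely delete.
merged=$(git -C "$repo" for-each-ref --format='%(refname:short)' \
    --merged "$default" refs/heads/feature/)
echo "$merged"   # → feature/done
rm -rf "$repo"
```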
On the stable branch, remove all old versions that were never released from wiki/src/upgrade/v2/Tails:
cd "${RELEASE_CHECKOUT:?}" && \
git checkout stable && \
./bin/remove-unused-udfs --before-version "${VERSION:?}"
Explanation: the post-release APT repository steps from the previous stable release will usually have had us prepare for an emergency release that was never made.
Pull master back and merge it into stable:
cd "${RELEASE_CHECKOUT:?}" && \
git fetch origin && \
git checkout stable && \
git merge origin/master
Note: After a final major release is published, it's normal to leave the testing branch alone: it will only become relevant again when the next freeze starts.
Put the stable Git branch and the corresponding APT suite in the desired shape:
if [ "${MAJOR_RELEASE:?}" -eq 1 ]; then
    ./bin/reset-custom-APT-suite stable testing && \
    git checkout stable && \
    if [ $(find config/APT_overlays.d -maxdepth 1 -type f | wc -l) -gt 1 ]; then
        git rm config/APT_overlays.d/* && \
        git commit config/APT_overlays.d/ \
            -m "Empty the list of APT overlays: they were merged"
    fi
fi
Increment the version number in stable's debian/changelog, so that the next builds from the stable branch do not use the APT suite meant for the last release:
cd "${RELEASE_CHECKOUT}" && \
git checkout stable && \
dch --newversion "${NEXT_STABLE_CHANGELOG_VERSION:?}" \
    "Dummy entry for next release." && \
git commit debian/changelog \
    -m "Add dummy changelog entry for ${NEXT_STABLE_CHANGELOG_VERSION:?}."
Merge the release branch and its APT suite into devel:
./bin/merge-main-branch "${RELEASE_BRANCH:?}" devel
When there's a merge conflict, the script aborts to let you resolve the conflict. Then:
- Resolve the conflict, git add, and git merge --continue.
- Re-run the exact same merge-main-branch command.
Check if the version number of the last entry in devel's debian/changelog matches the next major release, so that the next builds from the devel branch do not use the APT suite meant for the last release; if this is not the case yet, run:
cd "${RELEASE_CHECKOUT}" && \
git checkout devel && \
dch --newversion "${NEXT_PLANNED_MAJOR_VERSION:?}" \
    "Dummy entry for next release." && \
git commit debian/changelog \
    -m "Add dummy changelog entry for ${NEXT_PLANNED_MAJOR_VERSION:?}."
If this command outputs anything:
git show devel:config/chroot_apt/preferences \
    | grep -E '^Explanation: freeze exception'
Then check out the devel branch, and there thaw the packages that were granted freeze exceptions.
Verify that the snapshots used in the release branch are OK, e.g. that they use the correct snapshots and were bumped appropriately (they should expire after the next planned major release date). Look carefully at the output of this command:
cd "${RELEASE_CHECKOUT:?}" && \
git checkout "${RELEASE_BRANCH:?}" && \
./bin/apt-snapshots-expiry
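The core property to verify is a date comparison: each snapshot's expiry must fall after the next planned major release date. A hypothetical standalone sketch with GNU date (the dates and variable names are made up):

```shell
# Made-up dates: the snapshot must remain valid past the next planned
# major release date. Uses GNU date.
snapshot_expiry='2025-03-20'
next_release='2025-03-11'
if [ "$(date -d "$snapshot_expiry" +%s)" -gt "$(date -d "$next_release" +%s)" ]; then
    verdict='OK: snapshot outlives the next planned release'
else
    verdict='PROBLEM: snapshot expires before the next planned release, bump it'
fi
echo "$verdict"   # → OK: snapshot outlives the next planned release
```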
Push the resulting branches:
git push origin stable testing devel
Check the Mozilla and Tor Browser release calendars:
- What we're looking for here are the release events, not the rebase ones.
- If it looks surprising, check their release preparation issues. If necessary, ask the Tor Browser developers.
- If the upcoming release dates in the Mozilla calendar do not match what we have in our release calendar, alert tails-dev@boum.org and rm@tails.net, and explicitly Cc the designated RM for that upcoming release.
Announce the date of the next release:
./bin/automailer.py <<EOF
To: tails-dev@boum.org,tails-l10n@boum.org
Subject: Release schedule for Tails ${NEXT_PLANNED_VERSION:?}

Hi,

XXX will be the RM for Tails ${NEXT_PLANNED_VERSION:?}

The current plan is:

- Monday, XXX: build images, start testing
- Tuesday, XXX: release

Developers, please:

- Book time for automated and manual QA during office hours,
  from Monday 15:00 to Tuesday 11:00 (Europe/Berlin).
  If you can't, please let the RM know.
- During these 2 days, don't push any change to the branch
  used for the release.
EOF
- If a release candidate is planned, include it in the release schedule and the call for manual testers.
Check the entries in our release calendar for the next two (2) releases.
Each of them must:
- exist
- say who is the release manager
- say who is the trusted reproducer
If any of this information is missing, open an issue for each release that lacks an RM or TR:
/label ~"C:Release management" ~"Core Work:Foundations Team" ~"To Do" ~"P:Elevated" /milestone %"Tails_${NEXT_PLANNED_VERSION:?}"
Include this information:
- For the Trusted Reproducer
- Your job will be to reproduce the ISO and USB images for the RC (if applicable) and final release within 72 hours after the RM has unplugged their smartcard.
- If you volunteer, you must immediately read the "Preparation" section of the instructions: https://tails.net/contribute/release_process/reproducibility/#preparation
If you are the release manager for the next release too, look at the tasks that must be done at the beginning of your shift in the release manager role page.
Otherwise, kindly remind the next release manager about this :)
Send an email to the next RM with:
- subject: "RM for Tails NEXT_PLANNED_VERSION"
- body: "This is a reminder that you will be RM for Tails NEXT_PLANNED_VERSION. Please see https://tails.net/contribute/working_together/roles/release_manager/#shift"
./bin/automailer.py <<EOF
Subject: RM for Tails ${NEXT_PLANNED_VERSION}

This is a reminder that you will be RM for Tails ${NEXT_PLANNED_VERSION}.
Please see https://tails.net/contribute/working_together/roles/release_manager/#shift
EOF
If there's a release candidate scheduled before our next release, do the same for the release manager of that release candidate.
Put aside the configuration file used for this release:
mkdir -p ~/.config/tails/release_management/old && \
mv ~/.config/tails/release_management/current.yml \
    ~/.config/tails/release_management/old/${VERSION:?}.yml.disabled
You're done for today! Suggestion: enjoy some time AFK :)
Tomorrow, make sure Jenkins manages to build all updated major branches: https://jenkins.tails.boum.org/view/RM/.
If you just released an RC
Check that you are indeed preparing an RC:
[ "${DIST:?}" = alpha ]
If this command outputs anything:
git show devel:config/chroot_apt/preferences \
    | grep -E '^Explanation: freeze exception'
Then check out the devel branch, and there thaw all packages (if any) that were granted freeze exceptions.
On the devel branch, thaw the time-based APT repository snapshots:
git checkout devel && \
./auto/scripts/apt-snapshots-serials thaw && \
git commit \
    -m "Thaw APT snapshots." \
    config/APT_snapshots.d/*/serial || :
This should generally be a no-op, but if there was some hiccup earlier it could be needed.
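Mechanically, "thawing" amounts to rewriting the serial files; this is an assumption about the serial file semantics, so verify against the actual script, but the idea is that a frozen branch pins each snapshot to a dated serial while a thawed branch tracks "latest". In a scratch directory:

```shell
# Scratch-directory illustration (semantics assumed, verify against the
# real apt-snapshots-serials script).
tmp=$(mktemp -d)
mkdir -p "${tmp}/config/APT_snapshots.d/debian"
serial="${tmp}/config/APT_snapshots.d/debian/serial"
echo 2025031102 > "$serial"   # frozen: pinned to a dated snapshot
echo latest > "$serial"       # thawed: follow the most recent snapshot
content=$(cat "$serial")
echo "$content"   # → latest
rm -rf "${tmp}"
```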
Increment the version number in debian/changelog on the branch used for the release, to match the upcoming non-RC release, so that the next builds from it do not use the APT suite meant for the RC:
cd "${RELEASE_CHECKOUT}" && \
git checkout "${RELEASE_BRANCH:?}" && \
dch --newversion "${NEXT_PLANNED_MAJOR_VERSION:?}" \
    "Dummy entry for next release." && \
git commit debian/changelog \
    -m "Add dummy changelog entry for ${NEXT_PLANNED_MAJOR_VERSION:?}."
Increment the version number in devel's debian/changelog to match the second next major release, so that images built from there have the right version number:
cd "${RELEASE_CHECKOUT}" && \
git checkout devel && \
dch --newversion "${SECOND_NEXT_PLANNED_MAJOR_VERSION:?}" \
    "Dummy entry for next release." && \
git commit debian/changelog \
    -m "Add dummy changelog entry for ${SECOND_NEXT_PLANNED_MAJOR_VERSION:?}." || :
Merge testing into devel (keeping a single changelog entry, for the second next major release), and push them:
cd "${RELEASE_CHECKOUT:?}" && \
git checkout devel && \
git merge testing && \
git push origin testing devel
Follow the "Verify that the snapshots used in the release branch are ok" step for final releases, above.
In our release calendar, mark the entry about the version that you've just released as released by adding " - Released" to the title.
Put aside the configuration file used for this release:
mkdir -p ~/.config/tails/release_management/old && \
mv ~/.config/tails/release_management/current.yml \
    ~/.config/tails/release_management/old/${VERSION:?}.yml.disabled
Delete your local copy of the IUKs [XXX:automate].
Delete the logs:
rm -f /var/tmp/wrap_tails_create_iuks*.log
You're done for today! Suggestion: enjoy some time AFK :)
Tomorrow, on https://jenkins.tails.boum.org/view/RM/, make sure Jenkins successfully:
- builds devel and ${RELEASE_BRANCH}
- builds the website
- checks PO files