Compare commits


443 Commits

Author SHA1 Message Date
Befator
50276fb9a2 More fixes
2025-10-18 19:19:29 +02:00
Befator
84c03f504e Giteafication
2025-10-17 19:45:13 +02:00
Christoph Reiter
553846537b add some debug logs for JOB_CHECK_RUN_ID
seems like it's not there in some cases (?)
2025-10-05 09:49:19 +02:00
Christoph Reiter
c6213b4d1a Partly remove hack to fetch the current job ID
The GH runner now exposes "job.check_run_id" in the template
language, which we can use directly with the API to fetch
information about the job currently in progress.

Previously we looked through all active jobs and matched them
by name.

There is no env var for job.check_run_id, so we still have to set
one ourselves in the YAML file.
2025-10-03 22:16:09 +02:00
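The handoff described in this commit could be sketched as follows; the function name, the fallback values, and the env-var plumbing are illustrative, not the repository's actual code (only the `${{ job.check_run_id }}` template value and the need for a YAML-set env var come from the commit message):

```python
import os

def job_check_run_url(repo: str, check_run_id: str) -> str:
    """Build the GitHub REST URL for the check run of the current job.

    Assumes the workflow YAML exported the template value, e.g.:
        env:
          JOB_CHECK_RUN_ID: ${{ job.check_run_id }}
    """
    return f"https://api.github.com/repos/{repo}/check-runs/{check_run_id}"

# In CI both values come from the environment; the defaults are dummies.
repo = os.environ.get("GITHUB_REPOSITORY", "msys2/msys2-autobuild")
run_id = os.environ.get("JOB_CHECK_RUN_ID", "0")
url = job_check_run_url(repo, run_id)
```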
Christoph Reiter
ecd1d51f4d Use native arm64 Python 2025-09-17 11:11:28 +02:00
Christoph Reiter
fd1952d205 Update mypy 2025-09-17 10:59:55 +02:00
Christoph Reiter
19926ce9c5 Update requirements.txt
missed it
2025-09-17 10:29:34 +02:00
Christoph Reiter
33a052a413 Update deps 2025-09-17 10:18:31 +02:00
Christoph Reiter
59740a3f2e Port to PEP 735
And depend on poetry 2.2.
This allows one to use uv instead of poetry if wanted.
Add .venv to flake8 ignore since that's the uv default.

Also update deps while at it, and specify a license.
2025-09-14 21:43:15 +02:00
Christoph Reiter
4704486d49 Update deps; require PyGithub 2.8.1
move to the new digest property
2025-09-09 09:44:28 +02:00
Christoph Reiter
dc632d9934 build: custom makepkg config for building
During the build create a temporary config file in makepkg.conf.d
which changes some defaults.

For starters this sets the zstd compression, and bumps it for source
builds.

This allows us to make the default zstd config faster, while compressing
with a higher level in autobuild.
2025-08-28 19:00:00 +02:00
Christoph Reiter
3687fa3a0b Fix condition for selecting msys build
This happened to work by accident, via a substring match and by
being the last condition.
2025-08-26 22:21:19 +02:00
Christoph Reiter
42b02362e1 Use ruff to upgrade code 2025-08-26 22:05:51 +02:00
Christoph Reiter
05abf4e953 Assume os.path.isjunction is available
now that we depend on Python 3.12+
2025-08-26 22:05:51 +02:00
Christoph Reiter
a3bae5a40c Drop support for Python 3.11
We kinda depend on os.path.isjunction, so just drop it
2025-08-26 22:05:43 +02:00
Christoph Reiter
456089ba22 Remove old compat code 2025-08-26 21:32:32 +02:00
Christoph Reiter
d15bda6f83 CI: update actions/checkout 2025-08-25 09:49:26 +02:00
Christoph Reiter
de38d16edd Update deps 2025-08-25 09:48:02 +02:00
Christoph Reiter
fd77359a5a Drop support for Python 3.10 2025-08-01 08:26:25 +02:00
Christoph Reiter
3581de3619 fix Generator usage with older Python
it doesn't have defaults there, so pass None
2025-08-01 08:19:56 +02:00
Christoph Reiter
84d3306857 Update mypy 2025-08-01 08:16:45 +02:00
Christoph Reiter
5bbfb7bb18 Verify checksums when downloading assets
In the last few weeks (I think) GH added checksums to the API responses
for release assets. Use them to verify the downloaded files.

Also bump the chunk size a bit while at it; it was quite small.
2025-08-01 08:13:06 +02:00
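The check this commit describes might look roughly like the sketch below. The `"sha256:<hex>"` digest format, the chunk size, and the function names are assumptions of this sketch, not the project's actual code:

```python
import hashlib
from typing import Iterable

CHUNK_SIZE = 256 * 1024  # bumped a bit, as the commit message suggests

def sha256_of_chunks(chunks: Iterable[bytes]) -> str:
    """Hash a download incrementally instead of buffering it whole."""
    h = hashlib.sha256()
    for chunk in chunks:
        h.update(chunk)
    return h.hexdigest()

def verify_download(chunks: Iterable[bytes], expected: str) -> bool:
    """expected is assumed to look like 'sha256:<hex>' (format is a guess)."""
    algo, _, hexdigest = expected.partition(":")
    assert algo == "sha256", "only sha256 handled in this sketch"
    return sha256_of_chunks(chunks) == hexdigest
```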
Christoph Reiter
69ce064955 Update pygithub
there are API changes, so bump the minimum
2025-08-01 07:56:41 +02:00
Christoph Reiter
ab3c2437e8 Update deps 2025-07-22 06:38:31 +02:00
Christoph Reiter
70dec0bd33 CI: revert to windows-2022 for now
see https://github.com/msys2/MINGW-packages/pull/24677#issuecomment-3017919467
2025-06-30 08:22:56 +02:00
Christoph Reiter
54197e6af4 Update deps 2025-06-20 20:16:36 +02:00
Christoph Reiter
c237bc163a Update deps 2025-06-13 10:26:44 +02:00
Christoph Reiter
5f5d7aafa2 update default cycle breakers for winpthreads rename 2025-06-08 10:12:46 +02:00
Christoph Reiter
5c2504702e Update deps 2025-06-02 09:22:39 +02:00
Christoph Reiter
776a26e021 Update deps 2025-05-25 15:46:04 +02:00
Christoph Reiter
999e4e9327 Try to match the install paths of the packages CI more closely
Things are failing and the only difference in the logs are paths, so
try to get rid of that difference at least.
2025-05-19 09:32:07 +02:00
Christoph Reiter
3a5fc4c416 Update deps 2025-05-04 19:46:18 +02:00
Christoph Reiter
663b7acdc1 zizmor: allow unpinned setup-msys2
we trust our own code
2025-04-28 06:31:28 +02:00
Christoph Reiter
e8d10d7e9e config: switch to windows-11-arm for clangarm64 2025-04-16 19:59:15 +02:00
Christoph Reiter
caa6a73b53 CI: remove useless condition
as pointed out in https://github.com/msys2/msys2-autobuild/pull/112/files#r2047370653
if release==false then the location is not used anyway
2025-04-16 19:54:04 +02:00
Christoph Reiter
839b8befc3 config: fold MAXIMUM_BUILD_TYPE_JOB_COUNT into RUNNER_CONFIG as well 2025-04-16 07:24:19 +02:00
Christoph Reiter
a2fb8db0e7 config: add more runner specific config
instead of hardcoding them in multiple places
2025-04-16 06:34:39 +02:00
Christoph Reiter
311b4cd295 CI: run tests on windows-11-arm
Still force x64 Python, since installing our deps fails
for arm64 there.
2025-04-16 06:34:39 +02:00
Christoph Reiter
0d471ea5b7 build: try removing junctions before calling git clean
See https://github.com/msys2/msys2-autobuild/issues/108#issuecomment-2776420879

It looks like git can under some circumstances hang forever when trying
to clean the checkout when there are junction loops. So try to remove
them manually before calling git clean.

Fixes #108
2025-04-11 14:11:50 +02:00
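The junction cleanup this commit describes could be sketched like this. It is illustrative only (names and the `git clean` flags are assumptions), and it prunes junctions from the walk so it never descends into possibly looping targets:

```python
import os
import subprocess

# os.path.isjunction exists only on Python 3.12+; fall back to "never a
# junction" elsewhere, mirroring the optional handling in this history.
_isjunction = getattr(os.path, "isjunction", lambda p: False)

def remove_junctions(root: str) -> int:
    """Best-effort removal of NTFS junctions under root."""
    removed = 0
    for dirpath, dirnames, _ in os.walk(root):
        keep = []
        for name in dirnames:
            path = os.path.join(dirpath, name)
            if _isjunction(path):
                try:
                    os.rmdir(path)  # a junction is removed like an empty dir
                    removed += 1
                except OSError:
                    pass
            else:
                keep.append(name)
        dirnames[:] = keep  # in-place edit controls what os.walk descends into
    return removed

def clean_checkout(repo_dir: str) -> None:
    """Remove junctions first, then let git clean the checkout."""
    remove_junctions(repo_dir)
    subprocess.run(["git", "clean", "-xfdf"], cwd=repo_dir, check=True)
```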
Christoph Reiter
8d9cbcb54c Update deps 2025-04-11 10:52:11 +02:00
Christoph Reiter
23845c53e0 Update deps 2025-03-29 12:37:12 +01:00
Christoph Reiter
e9e823c2e7 build: try to use actions/cache@v4 for pip caching
To work around https://github.com/actions/setup-python/issues/1050
2025-03-12 06:48:14 +01:00
Christoph Reiter
fe4bcd08a9 CI: disable "update-environment" for "setup-python"
setup-python, by default, sets various cmake and pkg-config env
vars so that packages using cmake can be built. Since this might
interfere with our package builds, disable it.

We only care about the Python executable itself, so use the action
output to create the venv.
2025-03-10 08:38:48 +01:00
Christoph Reiter
47cc05c39f CI: also use a venv for the Windows build job
To be more isolated from the host system
2025-03-10 08:30:20 +01:00
Christoph Reiter
a2ebb72da0 CI: use a venv for the linux jobs
To gain more isolation from the host
2025-03-09 19:26:37 +01:00
Christoph Reiter
4413e41389 Port to PEP 621
Only dev deps are left; for those we need PEP 735, which isn't in poetry yet.
2025-03-07 16:41:28 +01:00
Christoph Reiter
d45f6720f4 CI: move to Python 3.13 2025-03-07 12:06:20 +01:00
Christoph Reiter
e2042058f1 gh: improve repo caching
We were caching based on the build type, but for most build types the repo
is the same, so cache one level below instead.
2025-03-07 12:04:27 +01:00
Christoph Reiter
bb54adc298 Add verbosity option and write logs to stderr by default
default is warning, -v means info, -vv means debug
2025-03-07 12:04:27 +01:00
Christoph Reiter
1ef3f8f5f5 Use new pygithub global lazy feature
Requires 2.6.0. Means data is only fetched if it is accessed,
so fewer API calls for us (hopefully).
2025-03-07 12:04:21 +01:00
Christoph Reiter
ca6dd299ee Update dependencies 2025-03-07 11:02:48 +01:00
Christoph Reiter
5f9bed8409 CI: remove VCPKG_ROOT workaround
This was to avoid breakage from https://github.com/actions/runner-images/pull/6192
But it was reverted in the image long ago: https://github.com/actions/runner-images/issues/6376
2025-03-07 10:07:04 +01:00
Christoph Reiter
625631832e CI: derive the build root from GITHUB_WORKSPACE
On hosted runners this means D:, on self-hosted it
will point to C: if there is only one drive.
2025-03-06 00:27:11 +01:00
Christoph Reiter
7ec5a79b46 CI: build on D: instead of C:
Related to 796ec1c1ba

D: is both faster and has more free space compared to C: with the current runner setup.
2025-03-05 23:19:23 +01:00
Christoph Reiter
a187346d08 Update deps 2025-02-15 15:11:06 +01:00
Christoph Reiter
b442168127 build: delete all junctions before calling "git clean"
git clean can't deal with junctions and in case there is a loop
it follows them forever (or until stack overflow).
https://github.com/git-for-windows/git/issues/5320

To work around this try to delete all junctions in the clean
re-try code path.

Fixes #108
2025-01-31 16:01:13 +01:00
Christoph Reiter
bdd38ec73c Update deps 2025-01-25 07:32:02 +01:00
Christoph Reiter
98f6ea2875 CI: set default permissions to make newer zizmor happy 2025-01-19 09:27:23 +01:00
Christoph Reiter
a977f9deb9 remove leftover debug print 2025-01-11 08:58:12 +01:00
Christoph Reiter
4f60392b3e make_tree_writable: handle junctions and add tests
As found out here, os.walk() by default follows junctions, which we don't
want and can even lead to loops:
https://github.com/msys2/msys2-autobuild/issues/101#issuecomment-2583121845

Integrate the workaround mentioned in the CPython bug report:
https://github.com/python/cpython/issues/67596#issuecomment-1918112817
Since this is Python 3.12+ only and we still support 3.10 make
it optional though.

This also adds tests, which uncovered some other minor issues:
It was not chmoding top-down, which meant that os.walk would
skip things if there were no read permissions. So chmod before
os.walk() lists the dir.
2025-01-10 21:32:14 +01:00
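The two fixes this commit describes (chmod before os.walk() lists a directory, and don't follow junctions) could be sketched together like this; names are illustrative and `os.path.isjunction` is guarded for pre-3.12 Pythons, as the commit itself did:

```python
import os
import stat

_isjunction = getattr(os.path, "isjunction", lambda p: False)  # 3.12+ only

def make_tree_writable(root: str) -> None:
    """Make everything under root readable+writable, top-down.

    chmod each directory before os.walk() lists its children, so
    unreadable dirs are not silently skipped, and prune junctions
    from the walk so junction loops cannot occur.
    """
    def chmod_plus(path: str) -> None:
        try:
            os.chmod(path, os.stat(path).st_mode | stat.S_IREAD | stat.S_IWRITE)
        except OSError:
            pass

    chmod_plus(root)
    for dirpath, dirnames, filenames in os.walk(root):
        # drop junctions in-place so os.walk never descends into them
        dirnames[:] = [d for d in dirnames
                       if not _isjunction(os.path.join(dirpath, d))]
        for name in dirnames + filenames:
            chmod_plus(os.path.join(dirpath, name))
```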
Christoph Reiter
35ff0b71b6 Update deps 2024-12-20 11:24:43 +01:00
Christoph Reiter
1575848e81 Move to windows-2025 2024-12-20 11:23:33 +01:00
مهدي شينون (Mehdi Chinoune)
657fd89531 remove clang32 2024-12-19 08:19:08 +01:00
Christoph Reiter
0f20d6bfa8 Reapply "CI: remove code scanning again"
This reverts commit c553f33cf05394be3733705adc4c3ad86e1a044d.

I still can't get it to work and I give up
2024-12-13 22:16:39 +01:00
Christoph Reiter
c553f33cf0 Revert "CI: remove code scanning again"
This reverts commit c5b593a34c9d51fac31f3c3e158db7b15a004804.

try the suggestion from
https://github.com/woodruffw/zizmor/discussions/291
2024-12-13 21:51:40 +01:00
Christoph Reiter
c5b593a34c CI: remove code scanning again
And just fail normally in the job if anything is found.
I can't get the code scanning to fail a check somehow.
2024-12-13 21:34:31 +01:00
Christoph Reiter
1bc0a28e35 CI: run zizmor 2024-12-13 20:55:43 +01:00
Christoph Reiter
0f71ee73cf Update deps 2024-12-06 17:43:59 +01:00
Christoph Reiter
4deb3111d3 CI: move to ubuntu-24.04
from ubuntu-22.04
2024-12-06 14:41:07 +01:00
Christoph Reiter
5bf958fd1b CI-hardening: move permissions to the job level
Instead of giving all jobs write permissions, default to no permissions
and enable them on a per-job basis.

This does not change anything for us, but avoids accidental write
permissions if a new job gets added without considering that it inherits
the top level permissions, even if it doesn't need them.

See https://woodruffw.github.io/zizmor/audits/#excessive-permissions
2024-12-06 14:17:15 +01:00
Christoph Reiter
7eed3d8bc1 CI-hardening: escape the msys2-location output
While that comes from our own action, so we can in theory trust it,
escape it for good measure. Can't hurt and silences a warning.
2024-12-06 13:59:38 +01:00
Christoph Reiter
7c78444174 CI-hardening: set persist-credentials=false for all actions/checkout
To avoid writing the token to disk. It still gets exposed via env vars
to various steps, but this removes the access from any steps before that.

As recommended by the zizmor scanner
2024-12-06 13:59:25 +01:00
Christoph Reiter
19c8f00aba CI: update to Python 3.12; also test with 3.13 2024-12-06 13:59:06 +01:00
jeremyd2019
a6b3079ae3 update package name in OPTIONAL_DEPS
The pkgbase was renamed from mingw-w64-clang to mingw-w64-llvm, but this still had the old name, requiring the cycle with libc++ to be broken manually
2024-11-12 23:28:24 +01:00
Christoph Reiter
acafab9b5f queue: fix missing cycles with build-only deps of deps
We only looked at the dependencies of a package that are needed for building,
but for detecting build cycles we also have to look at all transitive deps.

Unless the dependency is already finished, then we can ignore its build deps,
even if they are not finished yet.

The test shows such a case where things indirectly create a cycle via cmake.

Fixes #91
2024-10-26 20:06:33 +02:00
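The rule this commit states (follow *transitive* build deps, but stop at finished packages) can be shown with a toy cycle finder. Everything here is illustrative; it is not the project's real queue code:

```python
from typing import Dict, List, Set

def find_cycles(build_deps: Dict[str, List[str]],
                finished: Set[str]) -> List[List[str]]:
    """DFS over transitive build deps, recording dependency cycles.

    A finished package terminates the search: its own build deps can
    be ignored even if they are not finished yet.
    """
    cycles: List[List[str]] = []

    def visit(node: str, stack: List[str]) -> None:
        if node in finished:
            return
        if node in stack:
            cycles.append(stack[stack.index(node):] + [node])
            return
        for dep in build_deps.get(node, []):
            visit(dep, stack + [node])

    for pkg in build_deps:
        visit(pkg, [])
    return cycles
```

For example, an indirect cycle via a tool like cmake (`x -> y -> cmake -> x`) is found, but disappears once any package on the cycle is finished.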
Christoph Reiter
ef67d84096 Update deps 2024-10-26 14:27:05 +02:00
Christoph Reiter
7c56a1d764 Update deps 2024-10-07 07:44:41 +02:00
Christoph Reiter
cfdccd0a03 Update deps 2024-09-21 11:15:17 +02:00
Jeremy Drake
22f1e5ad0b Revert "Partially revert "CI: Update actions/setup-python""
The upstream issue has (finally) been fixed.

This reverts commit 3e617554bbe6fc206a4032e86b0cc79aedad42e6.
2024-08-29 20:52:12 +02:00
Christoph Reiter
05a051162d Update deps 2024-08-28 08:30:18 +02:00
Christoph Reiter
f968d2f0ca Update deps 2024-08-09 11:49:18 +02:00
Christoph Reiter
67d510ec4b CI: use the new setup-msys2 output for finding the install location 2024-08-03 13:50:05 +02:00
Christoph Reiter
f44d95e7c2 Update deps 2024-08-02 09:40:41 +02:00
Christoph Reiter
00495cb263 Update deps 2024-06-23 09:56:16 +02:00
Christoph Reiter
40ab937954 Update deps 2024-06-07 17:56:03 +02:00
Christoph Reiter
59bb7f6f18 fetch-assets: test all downloaded files with zstd
Test them before moving them to the final location.
This makes the download fail if there is some file corruption etc.

This adds a dependency on the zstd executable for the fetch-assets
command.

Motivated by https://github.com/msys2/msys2-main-server/issues/42
2024-05-25 14:03:54 +02:00
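The integrity test this commit adds could be sketched as below; the function name and flags are an assumption of this sketch (`zstd --test` verifies compressed frames and exits non-zero on corruption):

```python
import shutil
import subprocess

def zstd_test(path: str) -> bool:
    """Integrity-test a downloaded .zst file before moving it into place.

    Requires the zstd executable, as the commit notes; returns False if
    zstd reports corruption (non-zero exit).
    """
    zstd = shutil.which("zstd")
    if zstd is None:
        raise RuntimeError("zstd executable not found")
    result = subprocess.run([zstd, "-q", "--test", path],
                            capture_output=True)
    return result.returncode == 0
```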
Christoph Reiter
bf3cf80161 Update deps 2024-05-25 13:10:28 +02:00
Christoph Reiter
63ea6585cd config: clean up manual build and ignore rdep lists
Those packages either no longer exist, or should probably build now
that the CI runners are faster.
2024-05-20 10:20:22 +02:00
Christoph Reiter
ea149103be Update deps 2024-05-20 10:13:34 +02:00
Christoph Reiter
9c7e8d3135 Update deps 2024-04-17 08:13:54 +02:00
Christoph Reiter
8d08599c2e Update deps 2024-04-02 22:20:33 +02:00
Jeremy Drake
3e617554bb Partially revert "CI: Update actions/setup-python"
Due to actions/setup-python#819, it fails to install python on a Windows
11 (or presumably Server 2022) self-hosted runner, when a suitable
version of python was not already installed.

Closes #85

This partially reverts commit d5779cd65dbe2e5dceb418c040b7d0d505372294.
2024-03-18 05:42:33 +01:00
Christoph Reiter
9a0b6a31c9 Update deps 2024-03-17 16:22:03 +01:00
Christoph Reiter
8d7df1587a add missing urllib3 dep 2024-03-17 16:21:14 +01:00
Christoph Reiter
dad6671556 cache: old files didn't contain _ 2024-03-03 07:40:34 +01:00
Christoph Reiter
bf9a4e2862 cache: version the file using the library version instead
so we don't have to care about this in the future.
2024-03-03 07:37:29 +01:00
Christoph Reiter
719254cb89 bump the cache file
looks like it's not compatible anymore
2024-03-03 06:59:02 +01:00
Christoph Reiter
281ad3e16e Update deps 2024-03-02 21:22:33 +01:00
Christoph Reiter
d4515ba2fe CI: Update al-cheb/configure-pagefile-action 2024-02-02 19:13:29 +01:00
Christoph Reiter
b78070c653 Update deps 2024-02-01 21:08:42 +01:00
Christoph Reiter
aa0637d87b CI: Update actions/cache 2024-02-01 20:25:36 +01:00
Christoph Reiter
d5779cd65d CI: Update actions/setup-python 2024-01-30 07:24:44 +01:00
Christoph Reiter
1c45f2ab2e Update deps 2024-01-10 08:22:56 +01:00
Christoph Reiter
0eca067dd7 Update deps 2023-12-07 11:32:22 +01:00
Christoph Reiter
1ed7c15c97 flake8 fixes 2023-10-22 16:19:57 +02:00
Christoph Reiter
dae5e305db CI: Update to actions/checkout@v4 2023-10-22 16:16:45 +02:00
Christoph Reiter
1d8af300c4 Move flake8 config from setup.cfg to .flake8
We don't use setuptools, so this makes things clearer
2023-10-22 16:07:49 +02:00
Christoph Reiter
1f4971c293 Drop support for Python 3.8/9 2023-10-22 16:03:57 +02:00
Christoph Reiter
fd1d5cc9ef Update deps 2023-10-22 15:54:52 +02:00
Christoph Reiter
e6700d2089 Disable the new pygithub read throttling
It seems a bit excessive and doesn't take into account that
lots of our requests hit the cache via etags.
2023-10-16 20:28:22 +02:00
Christoph Reiter
d1048413f8 Update pygithub to v2
It now has its own default retry logic that fits the GH API,
so no longer pass our own and assume it handles things better.

The datetimes are now timezone aware, so we no longer have to fix
them.
2023-10-16 20:21:47 +02:00
Christoph Reiter
3e0391eb26 Update mypy 2023-10-16 19:44:28 +02:00
Christoph Reiter
049635cd1a Handle disabling certain build types
Even if the API returns them, if they are not in the active list they
will be ignored.
2023-10-16 18:58:08 +02:00
Christoph Reiter
ca30448b74 Update deps 2023-10-16 18:28:03 +02:00
Christoph Reiter
a79a8c4c7a Update deps 2023-09-25 21:39:59 +02:00
Christoph Reiter
3f5f60aa62 Remove Alexpux from uploaders
He hasn't required this in a while, so remove.

Feel free to ask to be re-added.
2023-09-16 14:07:08 +02:00
Christoph Reiter
79a45bf6c7 Require a user confirmation for manual uploads
We currently allow some users to manually upload packages (in case
they take too long for CI, or to bootstrap things).

In case of an account takeover this would allow an attacker to upload/replace
files in staging. To reduce the risk a bit ask for confirmation when downloading
the manually uploaded files.

Also add a "--noconfirm" option so we can avoid the questions in the staging
download script.

Ideally we would require users to sign their files, but this helps a bit at least.
2023-09-16 14:07:08 +02:00
Christoph Reiter
0852421d17 Update deps 2023-08-29 07:38:58 +02:00
Christoph Reiter
f368fb4951 update_status: handle github returning 404 for assets returned by the API
Some time after replacing the file, the API randomly returns 200 and 404
a few times until it settles on 200.
2023-08-19 23:11:54 +02:00
Christoph Reiter
0af6deb998 CI: allow the "Configure Pagefile" step to fail
it's only a requirement for some packages (flang), and there is a much higher
chance that it fails for a job that doesn't need it currently.
2023-08-16 20:55:46 +02:00
Christoph Reiter
c9fb5c61ab Update dependencies 2023-08-16 20:53:29 +02:00
Christoph Reiter
a3a5c1da40 update-status: only replace the status file if something has changed
Before uploading the status file we make a cached request for the old status
content and if there is no difference we don't upload anything.

This reduces the amount of write API calls and the amount of useless
packages.msys2.org refreshes a bit.
2023-08-16 20:40:22 +02:00
Christoph Reiter
1f1fabade2 clean-assets: only re-create releases if there are many assets
Re-creating causes notifications for users. While users can disable them,
let's just limit it to larger rebuilds, like the Python rebuilds.

Fixes #77
2023-08-01 08:07:23 +02:00
Christoph Reiter
4db4e22d09 clean-assets: delete release in case all assets need to be deleted
In case a release has hundreds of files that need to be deleted this
requires quite a bit of time and also works against the API rate limiting.

In case we want to delete all assets of a release just delete and
re-create the whole release instead.

Fixes #77
2023-07-30 15:00:23 +02:00
Christoph Reiter
5b61a937a1 Update dependencies 2023-07-29 21:10:41 +02:00
Christoph Reiter
edc9089808 clean-assets: fewer parallel deletes
we are still hitting the secondary rate limit
2023-07-29 21:05:00 +02:00
Christoph Reiter
a1540964f5 Remove winjitdebug again
things should be fixed with Python 3.11
2023-07-24 18:22:19 +02:00
Christoph Reiter
95ab14dfe7 Update requirements.txt 2023-06-30 22:04:57 +02:00
Christoph Reiter
d68ad18de2 Port to new pygithub auth API 2023-06-30 22:00:01 +02:00
Christoph Reiter
305e7b4c68 Update dependencies 2023-06-30 21:57:33 +02:00
Christoph Reiter
79096b753c build: disable setup-msys2 caching for the arm64 runner
it's not really helping in case of self-hosted runners,
so just disable it there.
2023-05-28 21:02:12 +02:00
Christoph Reiter
6d6d83ea3e CI: Python 3.10 -> 3.11
looks like all dependencies have wheels for 3.11 now
2023-05-27 08:54:06 +02:00
Christoph Reiter
f78c47f441 Update dependencies 2023-05-26 22:49:17 +02:00
مهدي شينون (Mehdi Chinoune)
13b6b27fea Enable autobuild for qt5-static 2023-05-20 19:10:29 +02:00
Christoph Reiter
dfc132af9d download_text_asset: don't use the cache
This gets called in a threadpool and something in requests_cache
deadlocks.
2023-05-08 09:15:06 +02:00
Christoph Reiter
1a8a881082 build: try to make all files writable if git clean fails
I'm again not sure if this helps, but let's see..
2023-04-13 18:03:47 +02:00
Christoph Reiter
aa61bfdedd Update dependencies 2023-04-07 19:29:24 +02:00
Christoph Reiter
b51cfd02af Avoid upload_asset()
git fails to delete files we have uploaded, and I'm wondering if
upload_asset() is somehow keeping a handle open. While I can't find
anything suspicious in pygithub let's make the file handling explicit
and open/close ourselves.
2023-04-07 19:21:42 +02:00
Christoph Reiter
76a815c145 build: enable core.longpaths for the git repo
so "git clean" can potentially remove overly long paths created
during build time.
2023-04-07 19:12:47 +02:00
Christoph Reiter
236220ef8e typo 2023-04-07 10:43:40 +02:00
Christoph Reiter
60a287290d build: try to run git clean multiple times before giving up
For example it failed with:

warning: failed to remove B/mingw-w64-clang-i686-seacas-2023.02.03-2-any.pkg.tar.zst: Invalid argument

We now always use the same build directory, so if files can't be removed
we fail. Retry git clean/reset a few times before giving up and also
try before we start so in case it is fixed while the job isn't running on
a self-hosted runner we can continue automatically.
2023-04-07 10:41:49 +02:00
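The retry loop this commit describes can be separated into a generic helper plus the git calls; everything here is an illustrative sketch (the `git clean`/`git reset` flags and the retry counts are assumptions):

```python
import subprocess
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def retry(func: Callable[[], T], attempts: int = 3, delay: float = 0.0) -> T:
    """Call func up to `attempts` times, re-raising the last error."""
    for i in range(attempts):
        try:
            return func()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(delay)
    raise AssertionError("unreachable")

def git_clean(repo_dir: str) -> None:
    """Retry clean/reset, since Windows file removal can fail transiently
    ('Invalid argument', locked files, ...)."""
    def once() -> None:
        subprocess.run(["git", "clean", "-xfdf"], cwd=repo_dir, check=True)
        subprocess.run(["git", "reset", "--hard"], cwd=repo_dir, check=True)
    retry(once, attempts=3, delay=5.0)
```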
Christoph Reiter
cc301e1e62 build: shorter build paths
see https://github.com/msys2/msys2-autobuild/issues/71
2023-04-06 08:53:11 +02:00
Christoph Reiter
f3bf1b80b0 Revert "CI: run every 2 hours instead of 3"
This reverts commit 3116e844bee9bb9515ea892b37238e54cf2fcb98.
2023-04-05 16:58:30 +02:00
Christoph Reiter
3ef72c5eed CI: try per-job concurrency
so that we start jobs even if other jobs from a previous workflow are still running
2023-04-05 16:52:08 +02:00
Christoph Reiter
ccaad93b62 Don't ignore rdeps for mingw-w64-qt6-static
let's try
2023-04-05 07:43:58 +02:00
Christoph Reiter
fb16cedabf config: re-enable automatic builds for mingw-w64-qt6-static
see https://github.com/msys2/MINGW-packages/pull/16637#issuecomment-1496360513
2023-04-05 07:42:33 +02:00
Christoph Reiter
3116e844be CI: run every 2 hours instead of 3 2023-03-24 17:18:21 +01:00
Christoph Reiter
e3bb36afac more type annotations 2023-03-24 14:09:24 +01:00
Christoph Reiter
30fbfffb96 CI: installing wheel shouldn't be needed anymore
pip pulls it in now if needed
2023-03-24 13:44:21 +01:00
Christoph Reiter
8cb3c65f55 turns out matrix in needs is broken
https://github.com/orgs/community/discussions/25364
2023-03-24 13:27:50 +01:00
Christoph Reiter
7417496d9e write_build_plan: rework + build src last
* don't show the cycles when generating the build plan
  (we have other places that show it now)
* interleave the different build types when generating jobs
* make the src jobs depend on the non-src jobs, as src builds
  depend on either msys or ucrt64 build results and otherwise
  will just stop due to missing deps. Could be improved by only
  depending on msys/ucrt64, but this is still an improvement.
2023-03-24 13:19:12 +01:00
Christoph Reiter
c27f9a7c40 we can only clean assets for the current repo 2023-03-23 12:32:59 +01:00
Christoph Reiter
19857e3fa0 looks like fromJson() can't handle newlines 2023-03-23 12:07:49 +01:00
Christoph Reiter
956ac59246 write_build_plan: remove the check for running workflows
This was required to avoid running multiple builds at the same time.
But GHA now has concurrency groups which solves the same problem,
so drop that code
2023-03-23 11:59:37 +01:00
Christoph Reiter
ba632451ef README: document the env vars 2023-03-23 11:59:37 +01:00
Christoph Reiter
606b782bb0 config: add option to limit the job count for specific build types
Limit src builds because they are quite fast anyway, and clangarm64
because the self hosted runner can only do one job at a time.
2023-03-23 11:58:17 +01:00
Christoph Reiter
e2ca121180 Replace readonly with write everywhere
less confusing, at least for me
2023-03-23 11:58:17 +01:00
Christoph Reiter
b453032363 Get rid of MAIN_REPO
in most cases at least. either derive from the current
build type, or via get_current_repo() which reads the
GITHUB_REPOSITORY env var.
2023-03-23 11:58:17 +01:00
Christoph Reiter
98697683a5 main: remove --repo option again
this was meant for the arm runner, but it was never used.
2023-03-23 11:58:17 +01:00
Christoph Reiter
6f93057f83 make the tests a package
to make pytest happy
2023-03-23 11:58:13 +01:00
Christoph Reiter
88871c4cb0 Rename _PathLike to PathLike
it's no longer internal
2023-03-23 11:17:10 +01:00
Christoph Reiter
ad34ca14b6 Move some hard coded IDs to the config 2023-03-23 11:17:10 +01:00
Christoph Reiter
e0e19de2c1 Add some unit tests
just one to get things started
2023-03-22 12:47:27 +01:00
Christoph Reiter
5085f864b3 Missed one command 2023-03-22 11:13:33 +01:00
Christoph Reiter
6f40845ba3 README: add a short description and remove the process info
the process info is now moved to the main MSYS2 documentation
2023-03-22 10:42:42 +01:00
Christoph Reiter
6788467670 README: update the CLI help output 2023-03-22 10:09:36 +01:00
Christoph Reiter
87f0603c87 Split the code up into separate modules
with minimal code changes
2023-03-22 09:59:05 +01:00
Christoph Reiter
0d25d51a04 Convert the script to a Python package
It can now be invoked via `python -m msys2_autobuild` or
by installing it, which adds a "msys2-autobuild" script.

This is a first step towards splitting up the code.

The HTTP cache is now stored in the working directory
instead of the source directory.
2023-03-21 11:34:39 +01:00
Christoph Reiter
d0ddf60737 Update dependencies 2023-03-18 10:40:06 +01:00
Christoph Reiter
91ab34350f cache: clean up at the end and limit to 3 hours
it's unlikely there will be many hits after some hours, so better
keep the upload size low. Also clean at the end to make
the upload smaller.
2023-02-19 17:02:55 +01:00
Christoph Reiter
38e6bc6e47 requests_cache: port to new cache cleanup function
I find the API still confusing, but it's better than before.
2023-02-19 16:43:35 +01:00
Christoph Reiter
6ccea00bba Bump the max number of jobs again
Since the last commit we should need fewer API calls
2023-02-19 16:10:29 +01:00
Christoph Reiter
c152a6dbbf Depend on the new pygithub assets API
This exposes the assets inline from a release, so this
should save us lots of requests. Available since v1.58.0
2023-02-19 16:08:51 +01:00
Christoph Reiter
b7df29ff56 CI: skip installing wheel
This was for packages without wheels to build them initially.
In theory newer pip should handle this automatically, let's see
2023-02-19 16:07:19 +01:00
Christoph Reiter
77c2d02a4d Update dependencies 2023-02-19 16:03:41 +01:00
Jeremy Drake
aea263ec2c CI: Remove enabling of clangarm64 in pacman.conf
It is now enabled by default so this is a no-op.
2023-01-29 00:33:11 +01:00
Jeremy Drake
1666f6d3b0 Allow building qt6-static on clangarm64.
The timeout on a self-hosted runner is much larger (72 hours, though
there seemed to be a different, token-related limit that was hit
before reaching that).
2023-01-18 19:44:42 +01:00
Christoph Reiter
63a1b6020e Extend manual build for mingw-w64-qt5-static to clang32/64
seems the clang build got slower, it now hits the 6h limit always
2023-01-18 18:36:19 +01:00
Christoph Reiter
77a21114a8 CI: set "MSYS" env var later
so the cache action doesn't override it
See https://github.com/actions/toolkit/issues/1312
2023-01-14 12:33:26 +01:00
Christoph Reiter
6e2c5b47d4 Revert "Unset MSYS env everywhere"
This reverts commit e2f4f874a20304bc94047ddf92ca63a9ee9aa5e5.

We are depending on it being set in CI, so this isn't the right approach
2023-01-14 12:32:40 +01:00
Christoph Reiter
e2f4f874a2 Unset MSYS env everywhere
See https://github.com/actions/toolkit/issues/1311
2023-01-14 07:58:13 +01:00
Christoph Reiter
63f65d30bc Delete old assets in a thread pool
To speed things up a bit
2023-01-01 11:34:17 +01:00
Christoph Reiter
307799fd27 Update the status file format and include cycles
This moves it closer to the buildqueue format, and also includes cycles,
and allows future additions.
2022-12-27 16:16:47 +01:00
Christoph Reiter
bf82f9fff2 Don't include broken cycles in the cycle output 2022-12-27 16:16:08 +01:00
Christoph Reiter
a9862b27c1 Missed one left over src build-type 2022-12-24 00:03:29 +01:00
Christoph Reiter
2ae439cd00 Build all source packages in a separate build job
See https://github.com/msys2/msys2-autobuild/issues/69

Building source packages requires git etc to be installed, but
ideally we wouldn't pollute the builder with extra packages that
it doesn't explicitly require.

To avoid this build msys and mingw source packages in a separate job.
2022-12-23 23:53:08 +01:00
Christoph Reiter
21a84297d8 Update deps 2022-12-21 12:09:05 +01:00
Christoph Reiter
e22cc1cc17 Update dependencies 2022-12-10 21:55:46 +01:00
Christoph Reiter
eee25ec33f CI: run on ubuntu-22.04 2022-12-10 21:21:01 +01:00
Christoph Reiter
59e8e1af5d CI: create a larger pagefile
so we can build flang in CI, same as https://github.com/msys2/MINGW-packages/pull/13791
2022-10-29 21:18:54 +02:00
Christoph Reiter
1fd41adbfa CI: test with 3.11 2022-10-27 08:05:20 +02:00
Christoph Reiter
e94b92f73e Update deps 2022-10-27 08:04:11 +02:00
Christoph Reiter
5d06444a57 CI: port away from ::set-output 2022-10-21 13:24:22 +02:00
Christoph Reiter
9d582e19b1 Build src packages in an ucrt64 env
It will be the new default
2022-10-10 18:39:24 +02:00
Christoph Reiter
bf34129d62 Update dependencies 2022-10-09 20:57:44 +02:00
Christoph Reiter
c9dd9afe5e Unset VCPKG_ROOT during build
see https://github.com/msys2/MINGW-packages/pull/13368
2022-10-02 12:35:50 +02:00
Christoph Reiter
b40229daa6 Drop BUILD_TYPES_WIP
This wasn't complete as it would only ignore broken builds
for direct deps and not indirect ones, but kinda worked in ignoring
some arm64 errors.

But it also causes problems if an error is ignored and the other arches
get uploaded. Then it's hard to roll back the update because lots of
packages with the new version are already in the repo.

With the new autobuild controller we can also restart flaky builds instead
of ignoring them and waiting for jeremy to fix them later.

Let's try removing that special case.
2022-09-20 08:02:56 +02:00
Jeremy Drake
253f8b8c4c GHA: accept extra 'context' input
This is meant for the invoker (ie, msys2-autobuild-controller) to
provide additional information to be logged with the job (specifically,
what user requested it).
2022-09-06 21:58:16 +02:00
Jeremy Drake
c03c642719 GHA: log workflow_dispatch inputs in job 2022-09-06 21:58:16 +02:00
Christoph Reiter
f581199930 try running the real pacman with exec
it seems like the pacman wrapper doesn't survive a runtime update.
try exec to avoid returning control to bash
2022-09-06 21:11:16 +02:00
Christoph Reiter
3637fea711 Update dependencies 2022-09-04 10:33:47 +02:00
Christoph Reiter
e23492ee15 also retry on 502
We just got "502 {"message": "Server Error"}" on a DELETE
2022-09-02 21:53:46 +02:00
Christoph Reiter
9f4f288d00 retry HTTP requests which return 500
We are getting "500 null" randomly recently, maybe this helps.
2022-08-27 13:51:32 +02:00
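The retry-on-transient-status behavior described in these two commits (retry on 500, later also 502) can be sketched as a small helper. This is an illustrative stand-in, not the repository's actual code — the real implementation hooks retries into the HTTP session; `request_with_retries` and its parameters are made up for the sketch:

```python
import time

# Status codes treated as transient, per the commits above: 500 and 502.
RETRY_STATUSES = {500, 502}

def request_with_retries(do_request, attempts=3, backoff=1.0, sleep=time.sleep):
    """Call do_request() until it returns a non-retryable status.

    do_request must return an object with a .status_code attribute
    (e.g. a requests.Response). `sleep` is injectable for testing.
    """
    for attempt in range(attempts):
        response = do_request()
        if response.status_code not in RETRY_STATUSES:
            return response
        if attempt < attempts - 1:
            sleep(backoff * (2 ** attempt))  # exponential backoff between tries
    return response  # give up and hand back the last failing response
```

In practice the same effect is usually achieved by mounting a retrying adapter on a `requests.Session`, which is what "retry HTTP requests" typically means for this stack.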
Christoph Reiter
b36a4da1da Use a temporary pacman.conf during building
Up until now we created a backup of pacman.conf and restored it after
the build was done. This can leave the environment in an undefined state
if something crashes in between.

Instead create a temporary pacman.conf and use that during building.
In theory pacman allows setting a custom config via "--config", but
makepkg doesn't expose this, so that's not working. Luckily makepkg
allows overriding the pacman path via the PACMAN env var, so we create
a temporary script which just forwards everything to pacman and always
sets the temporary config.
2022-08-26 15:09:05 +02:00
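The wrapper trick this commit describes — makepkg honours a `PACMAN` environment variable, so a throwaway script can pin `--config` to the temporary pacman.conf — can be sketched roughly like this. Paths, the function name, and the exact script contents are illustrative, not the repository's actual code:

```python
import os
import stat
import tempfile

def make_pacman_wrapper(config_path: str) -> str:
    """Write a throwaway script that forwards every call to pacman
    with a fixed --config, and return its path.

    Pointing makepkg's PACMAN env var at this wrapper makes all pacman
    invocations during the build use the temporary pacman.conf.
    """
    fd, wrapper = tempfile.mkstemp(suffix=".sh")
    with os.fdopen(fd, "w") as handle:
        handle.write("#!/bin/bash\n")
        handle.write(f'exec pacman --config "{config_path}" "$@"\n')
    # make the wrapper executable for the current user
    os.chmod(wrapper, os.stat(wrapper).st_mode | stat.S_IXUSR)
    return wrapper

# Usage sketch: spawn makepkg with PACMAN pointing at the wrapper, e.g.
#   env = dict(os.environ, PACMAN=make_pacman_wrapper("/tmp/pacman.conf"))
```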
Christoph Reiter
5ecdbc97a7 thinko
we are in bash here, not powershell...
2022-08-21 21:52:59 +02:00
Christoph Reiter
0d4680c01f Don't use workflow inputs directly in scripts
They could inject commands that way. Instead assign them
to an env var and then use that env var in the powershell scripts.

We want to open those controls up to more people, so we need to make
sure they can only change the values and not extract tokens etc.

Fixes #60
2022-08-21 20:37:07 +02:00
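The env-var indirection this commit describes looks roughly like the following in a workflow file. The step and input names are illustrative, not taken from the actual workflow:

```yaml
# Unsafe: the input is spliced directly into the script,
# so a crafted value can inject shell commands.
#   run: some-tool --packages "${{ inputs.packages }}"

# Safe: pass the input through an env var; the shell only ever
# sees a variable expansion, never the raw input text.
- name: Clear failed state
  env:
    PACKAGES: ${{ inputs.packages }}
  run: some-tool --packages "$PACKAGES"
```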
Christoph Reiter
9374b1d9b4 main: run update-status at the end 2022-08-15 14:03:50 +02:00
Jeremy Drake
22ea970beb CI: uncomment clangarm64 from pacman.conf.
instead of adding from scratch.  Once the commented-out section was
added, the grep would match that and no longer run the sed to add it.

Also remove line adding clang32 section because that was added to
default pacman.conf (and was thus a no-op).
2022-08-14 22:39:52 +02:00
Christoph Reiter
45c6b89ec7 Update the build status before stopping due to timeout
In case the job stops because it has reached the time limit it would
not update the build status and just quit. Move the timeout check
a bit later to avoid that.
2022-08-08 18:50:27 +02:00
Christoph Reiter
70c6903191 CI: update python to 3.10 and setup-python to v4 2022-08-04 21:45:15 +02:00
Christoph Reiter
f33be41b0f Update dependencies 2022-08-04 21:39:19 +02:00
Christoph Reiter
5f53dab6de Enable winjitdebug to workaround python crashing issues
Why this helps, I don't know..
2022-07-24 12:31:25 +02:00
Christoph Reiter
4dbd2618fb Update deps 2022-07-21 21:54:07 +02:00
Christoph Reiter
a43bdf9479 Add a comment as to why we split up repo-add calls
This was pointed out here: 7d84a7e086 (r75916830)
2022-06-30 21:07:00 +02:00
Christoph Reiter
9360a8eebe Update dependencies 2022-06-30 21:04:54 +02:00
Christoph Reiter
7d84a7e086 Limit the amount of packages added with repo-add in one go
It errors out if there are too many (maybe memory?)
2022-06-12 18:28:48 +02:00
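Capping the batch size as this commit describes can be sketched with a chunking helper. This is a hypothetical illustration, not the repository's code — the batch size is made up, and `run` is injectable so the sketch can be exercised without repo-add installed:

```python
import subprocess

def chunked(items, size):
    """Yield successive fixed-size slices of items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

def repo_add_in_batches(db_path, packages, batch_size=50, run=subprocess.check_call):
    """Invoke repo-add repeatedly so no single call gets too many packages."""
    for batch in chunked(list(packages), batch_size):
        run(["repo-add", db_path, *batch])
```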
Christoph Reiter
ea46306e71 Add config key for limiting the max job count
We are hitting the API limit again, so reduce from 15 to 12.
This also allows self hosted runners to limit to 1 if needed.
2022-06-11 18:32:11 +02:00
Christoph Reiter
97faefb5b3 Update deps
requests-cache now provides an option to always revalidate, so use that.
Before it mostly worked by accident.
2022-05-29 10:29:38 +02:00
Christoph Reiter
745e5e2c40 Fetch pgp keys before upgrading packages
To avoid any ABI bump breaking pgp
2022-05-08 20:08:43 +02:00
Christoph Reiter
84315e8e56 cache: include the job ID in the cache key
we just want to store/replace it every time, so we need it to be as unique as possible
2022-05-04 19:53:03 +02:00
Christoph Reiter
0a302154b0 Cache the cache
This should lead to cache hits on the first calls from the spawned
jobs right after the scheduler runs.
2022-05-04 18:55:48 +02:00
Christoph Reiter
4a5355f5dc Another try at fixing the cache race
Turns out disabling the cache just disables the monkey patching,
so we have to disable when creating the session object and not when
we are using the cache.

Create a session object without cache in the main thread at first,
so that the download code can re-use it as is later on.
2022-04-30 22:19:15 +02:00
Christoph Reiter
88b49f2c6a Avoid disabling the cache in a thread pool
It isn't thread safe, so wrap the outer code instead and just
assert in download_asset() that the caching is disabled so it's
not called with caching by accident.
2022-04-30 22:07:40 +02:00
Christoph Reiter
1684dff8bc Update mypy 2022-04-30 17:28:08 +02:00
Christoph Reiter
8870b3a342 Use requests-cache for adding etag/last-modified based caching
This doesn't speed things up usually, since we still make the same amount
of requests, but it doesn't count against the rate-limit in case there
is a cache hit. Also there is a smaller chance of things going wrong,
since we don't transfer any payload.

The cache is stored in a .autobuild_cache directory using a sqlite DB.
2022-04-30 17:15:18 +02:00
Christoph Reiter
258256e739 use lru_cache for Python 3.8 compat 2022-04-30 16:23:49 +02:00
Christoph Reiter
133ce88284 Cache the main github api instances
this leads to a shared session, and a bit fewer requests
2022-04-30 16:16:54 +02:00
Christoph Reiter
099438dc3f queue_website_update: just log errors instead of failing
this is optional really
2022-04-30 12:32:10 +02:00
Christoph Reiter
94d87dac25 Retry non-pygithub HTTP requests as well
There isn't an easier way to enable retries with requests sadly.
This also shares the session between all non-pygithub requests, so
could make things a bit faster.
2022-04-30 12:27:43 +02:00
Jeremy Drake
4384e62d01 Use labels to restrict self-hosted runner selection. 2022-04-19 07:46:26 +02:00
Christoph Reiter
cb4434c72b CI: Move the clear-failed actions into its own workflow
So they can be run independently.
2022-04-18 17:29:31 +02:00
jeremyd2019
892e1a3206 break clang/libc++ cycle
before libc++ was split off from clang package, it was built after clang within the same PKGBUILD, so this order seems reasonably safe.

Also remove a couple of prior cycle breaks from before it was possible to break them manually in a run.  These packages are not related by the same source repo and release, like mingw-w64 and llvm-project are, so are less likely to consistently require a cycle break on every upstream update.
2022-04-05 09:39:39 +02:00
Christoph Reiter
5f5d895cb1 move more common inputs up 2022-04-01 20:11:18 +02:00
Christoph Reiter
e4c2d446d2 Include the input name in the description 2022-04-01 20:09:25 +02:00
Christoph Reiter
cfe519fbb0 clear-failed: Allow clearing the failed state for packages via a workflow input 2022-04-01 20:05:38 +02:00
Christoph Reiter
1b14e2ed4d cycles: skip cycles where one of the packages is already built 2022-04-01 18:36:32 +02:00
Christoph Reiter
8c060f3142 cycles: show the version change of all packages 2022-04-01 15:44:45 +02:00
jeremyd2019
c2f77181d7 work around powershell arg parsing fail
It appears that powershell doesn't properly handle an empty argument, resulting in all the subsequent arguments being shifted left by one.

So, don't specify --optional-deps argument if it is empty.
2022-03-30 23:56:20 +02:00
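The workaround this commit describes — omit the flag entirely rather than pass an empty value that powershell mishandles — amounts to building the argument list conditionally. A minimal sketch with made-up names:

```python
def build_args(base, optional_deps=""):
    """Build a command line, skipping --optional-deps entirely when empty.

    Passing an empty string as its own argument made powershell shift the
    remaining arguments left by one, so the flag is only emitted when
    there is a value to go with it.
    """
    args = list(base)
    if optional_deps:  # only add the flag when there is a value
        args += ["--optional-deps", optional_deps]
    return args
```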
Christoph Reiter
cd67c3a66a show: also set optional deps 2022-03-30 17:29:59 +02:00
Christoph Reiter
5e037680d6 also add optional deps when checking if we should run 2022-03-30 17:17:04 +02:00
Christoph Reiter
5f628fb63a and now for real 2022-03-30 17:03:37 +02:00
Christoph Reiter
777bbb73af Try to make it possible to pass optional dependencies to the workflow
The idea is that in case of a cycle we explicitly break it on a
case-by-case basis.
2022-03-30 17:00:20 +02:00
Christoph Reiter
c1807c19a7 show the cycles also when writing the build plan
Otherwise we don't see it in CI, since the rest is skipped if there
is nothing to build.
2022-03-30 10:08:04 +02:00
Christoph Reiter
20ba53752d show: default to not fetch the build log URLs
Instead add a --details option.

It's quite slow and rarely needed, so default to off.
2022-03-30 10:00:18 +02:00
Christoph Reiter
81dd6cabad show: include a list of dependency cycles
This gives all cycles in the queue right now, ignoring the build
status of the packages.

If one part of the cycle is already built then it will not matter.
2022-03-30 09:57:02 +02:00
Christoph Reiter
5c6f39a511 break cycle 2022-03-29 20:26:19 +02:00
Christoph Reiter
e7fdb6dab2 fix yaml types 2022-03-19 10:17:22 +01:00
Christoph Reiter
d423d68901 CI: test with Python 3.10 2022-03-19 10:16:30 +01:00
Christoph Reiter
4cc7908a95 CI: update to Python 3.9 2022-03-19 10:15:12 +01:00
Christoph Reiter
0cf933cc9b CI: run every 3 hours instead of 4
it just takes 20 secs if there is nothing to do
2022-03-18 16:36:49 +01:00
Christoph Reiter
93dd330288 CI: run the schedule job on Ubuntu
We don't need Windows there, so let's give it a try.
2022-03-18 16:09:08 +01:00
Christoph Reiter
548cd95a30 break librsvg/gtk3 cycle 2022-03-18 09:05:46 +01:00
Christoph Reiter
a8d63e2852 Some cleanup; don't break cycles if the dep isn't in the repo
When bootstrapping a cycle we can't fall back to the repo, so
someone has to upload the package manually and we shouldn't try
to build it before that.
2022-03-12 08:53:05 +01:00
Christoph Reiter
1e254ee060 Another optional dep 2022-03-11 15:50:06 +01:00
Christoph Reiter
154402b355 Wrong package name
oops
2022-03-11 15:30:15 +01:00
Christoph Reiter
0ed108506a Add a list of optional dependencies
Since we no longer break cycles in msys2-web we have to do it here.
This adds a list of optional deps for some packages. If they are there
they will be used, if not they will be ignored.

By hardcoding it we should get a more deterministic result, but
not sure if this scales well.
2022-03-11 15:20:47 +01:00
Christoph Reiter
be6f6f2a28 Update deps 2022-03-10 20:12:39 +01:00
Christoph Reiter
8144f50ad5 CI: switch from actions/cache to using the builtin cache feature of actions/setup-python
One thing less to care about, and one less node12 action
2022-03-10 20:04:45 +01:00
Christoph Reiter
51e8ee9f76 Update some actions 2022-03-07 19:33:34 +01:00
Christoph Reiter
7e96898a06 Stop setting GIT_COMMITTER_NAME/EMAIL
We no longer use "git am" in PKGBUILD files
2022-03-05 13:15:22 +01:00
Jeremy Drake
451dca0a27 run upgrade/downgrade twice for updated assets
If one or more of the assets are in the 'core' set (such as bash,
recently), only the 'core' packages will be upgraded/downgraded in the
first run.
2022-02-20 09:01:16 +01:00
Christoph Reiter
a316cb96c2
mermaid: don't set a theme
doesn't play well with dark mode
2022-02-17 18:09:33 +01:00
Christoph Reiter
9ff6282fd6 Use new markdown mermaid support for the process diagram 2022-02-17 17:15:02 +01:00
Christoph Reiter
8b9b746cfa Revert "Revert setting GIT_COMITTER_NAME/EMAIL. It doesn't do anything."
This reverts commit 9b01428dde1d2476f177c2437bdd60063ec8147c.
2022-01-25 21:09:36 +01:00
Christoph Reiter
9b01428dde Revert setting GIT_COMITTER_NAME/EMAIL. It doesn't do anything.
Since https://github.com/msys2/MSYS2-packages/commit/97491f06184abf6
makepkg sets them, so this wasn't really doing anything.
2022-01-25 20:17:16 +01:00
Christoph Reiter
3e28396ab0 Update deps 2022-01-14 16:41:36 +01:00
Christoph Reiter
6c461095e0 CI: don't install git by default
We only needed it to configure the committer name/email, which is now
done via env vars.

If a package still needs git we pull it in via makedepends.
2022-01-14 16:15:42 +01:00
Christoph Reiter
3a63bf21e1 Set GIT_COMMITTER_NAME/EMAIL when calling makepkg
Instead of setting it with "git config" early on. This way we don't
have to change some global files/state while still getting the same result.
2022-01-14 16:13:45 +01:00
Christoph Reiter
e93758b39c Set PACKAGER in autobuild directly
Instead of depending on the caller to set it.
2022-01-14 16:09:51 +01:00
Christoph Reiter
7c422261fc Use the same build environment for all makepkg calls
Just the default environ for now
2022-01-14 16:09:23 +01:00
Christoph Reiter
58fac3caaf Don't import environ directly
Use the same style everywhere, to also avoid shadowing locals
2022-01-14 15:53:03 +01:00
Christoph Reiter
6a436ac4e9 No longer install the toolchain groups
They are no longer required. See
https://github.com/msys2/MINGW-packages/discussions/10506
2022-01-13 17:44:52 +01:00
Christoph Reiter
698f9f514f Only start jobs for build types where we own the asset release 2022-01-13 09:36:45 +01:00
Christoph Reiter
f765fe5ea7
Drop clangarm64 from the manual build list 2022-01-12 22:38:17 +01:00
Christoph Reiter
f49b8afb91 Fetch certain build types also from other repos
Allow mapping build types to external repos and make some
read-only operations work with it.

This mainly means downloading assets will now also download clangarm64
and the clangarm64 build status will be included on packages.msys2.org.
2021-12-26 15:19:19 +01:00
Jeremy Drake
456f0a1e57 fetch_assets: add option to limit build_types. 2021-12-21 04:50:24 +01:00
Christoph Reiter
1aaafbed38 msys2-devel is no longer needed 2021-12-14 20:28:14 +01:00
Christoph Reiter
91bb7945cb Update deps 2021-12-09 20:10:55 +01:00
Christoph Reiter
f712bbd622 CI: switch the hosted runner env from windows-2019 to windows-2022
Let's give it a try.
2021-12-09 20:07:59 +01:00
Christoph Reiter
8ecac52817 Update deps 2021-11-27 05:47:09 +01:00
Jeremy Drake
310a1fa4e4 Updates for ARM64 running x64 now.
Use x64 python everywhere.  Otherwise, it will try to find an ARM64
python, which Github doesn't offer in their metadata.

Pass release: false to setup-msys2 on ARM64.  My ARM64 runner has no D:,
and IO is slow enough to make setting up a fresh install on each run
prohibitive anyway.
2021-11-20 06:40:15 +01:00
Christoph Reiter
3ae4835f34 CI: fix switching to the main mirror
This broke when we switched the mirrorlist file defaults
(and also when we added more repos).

Just replace the shared mirrorlist instead.

Fixes #47
2021-11-18 19:24:08 +01:00
Christoph Reiter
a0a0b3f47b
mingw-w64-mlpack is fixed now 2021-11-05 18:38:32 +01:00
Jeremy Drake
46400708d0 Tweaks to workflow.
Use new runner.arch variable instead of checking job name to see if
we're on ARM64.  Add runner.arch to python cache key (fixes #36).

Move output of drive information to a new Runner details step, and add
output of CPU name (from MINGW-packages workflow) to that.
2021-11-04 21:40:52 +01:00
Mehdi Chinoune
51e711deb1 Change MSYS2 default Installation location 2021-11-03 16:48:11 +01:00
Christoph Reiter
6e469e2c56 fetch-assets: add --fetch-complete option
this fetches all packages, as long as they are complete
2021-11-03 08:49:23 +01:00
jeremyd2019
a4ab5bc26b fix BUILD_ROOT
`C:` is the CWD on drive C, `C:\` is the root of drive C.
2021-10-24 23:08:30 +02:00
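The distinction this fix relies on can be demonstrated with the standard library's Windows path module: `C:` is drive-relative (the current directory on drive C), while `C:\` is the drive root. A quick illustration:

```python
import ntpath

# "C:" is a drive-relative path; "C:\" is an absolute path at the drive root.
assert not ntpath.isabs("C:")
assert ntpath.isabs("C:\\")

# splitdrive makes the difference visible: the tail after the drive letter
# is empty for "C:" but the root separator for "C:\".
assert ntpath.splitdrive("C:") == ("C:", "")
assert ntpath.splitdrive("C:\\") == ("C:", "\\")
```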
Mehdi Chinoune
067f5c1ecd Shorten BUILD_ROOT 2021-10-24 09:49:30 +02:00
Christoph Reiter
0cfe547446 Revert "qt5-static: All manual builds"
This reverts commit 87dbe7aebc5cd274457f05e62527029f90f90e2e.
2021-10-20 09:53:39 +02:00
Alexey Pavlov
87dbe7aebc qt5-static: All manual builds 2021-10-19 21:09:52 +03:00
Christoph Reiter
4cc7035246
ignore rdeps: mingw-w64-zig 2021-10-17 20:12:52 +02:00
Christoph Reiter
aea50264e2
ignore qt-static rdeps 2021-10-14 18:44:51 +02:00
Christoph Reiter
41742850ce Revert "Revert "Update autobuild.py""
This reverts commit 5f728e1eb22bdcb3c0e97c2d7c6fdd6a025dca64.
2021-10-11 18:25:43 +02:00
Christoph Reiter
5f728e1eb2 Revert "Update autobuild.py"
This reverts commit 8c7ef11f693b7af3ad3292ff6d0ae0b80fcd8ba0.
2021-10-10 17:57:46 +02:00
Christoph Reiter
8c7ef11f69
Update autobuild.py 2021-10-10 15:07:50 +02:00
Christoph Reiter
d74753f0e5 build status: inherit blocking info instead of replacing it
a bit hacky.. but works

Fixes #42
2021-10-09 08:37:22 +02:00
Christoph Reiter
c23ca57bed Update deps 2021-10-09 07:25:03 +02:00
Christoph Reiter
9c67f65b7c make mypy happy 2021-09-15 08:54:29 +02:00
Christoph Reiter
7f3417441c Fix cleaning up failed assets 2021-09-15 08:48:00 +02:00
Christoph Reiter
6add89827b Only have one metadata file for a failed build
We used the resulting package names, but we can just key by build type
and get fewer files that way.
2021-09-15 08:35:53 +02:00
Christoph Reiter
a0713fbf40 move a call out of a loop 2021-09-12 07:02:16 +02:00
Christoph Reiter
ed2cdb03c6 Require python 3.8
for typing.Literal
2021-09-11 17:24:54 +02:00
Christoph Reiter
2707697dc4 require python 3.7 2021-09-11 17:20:12 +02:00
Christoph Reiter
c861ee86d0 Update deps 2021-09-11 17:16:51 +02:00
Christoph Reiter
5b58993660 Have one GH release per build type
This makes the code easier to understand and saves some API calls
when fetching dependencies, since only a subset of assets need to
be looked through.
2021-09-11 17:08:00 +02:00
Christoph Reiter
c2c24e50e3 More typing for literals 2021-09-11 12:03:41 +02:00
Christoph Reiter
c5688a7839
oops.. 2021-09-05 15:33:19 +02:00
Christoph Reiter
09475aabfd
clean up "ignore rdep" list 2021-09-05 09:08:35 +02:00
Christoph Reiter
79d4cbda1a
ignore rdeps: add mingw-w64-plasma-framework-qt5
until https://github.com/msys2/MINGW-packages/pull/9530 lands
2021-09-04 19:27:46 +02:00
Christoph Reiter
b738d09014 flake8 2021-08-28 11:45:32 +02:00
Christoph Reiter
10764cd166 Run everything with unbuffered stdout/err
Instead of flushing everywhere
2021-08-28 11:08:53 +02:00
Jeremy Drake
640d714345 flush stdout on a few prints that show up at odd times 2021-08-26 08:29:38 +02:00
Christoph Reiter
916fd75c11 MANUAL_BUILD: allow limiting for specific build types
qt5-static works fine with clang, so let CI handle that.
While at it unify the logic to also handle clangarm64.
2021-08-24 17:30:39 +02:00
Christoph Reiter
8d057042c4 Clean up IGNORE_RDEP_PACKAGES
These packages got fixed in the meantime
2021-08-24 17:21:07 +02:00
Christoph Reiter
10fdc3ec57 Download dependencies in parallel 2021-08-22 18:46:08 +02:00
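Parallel downloading as in this commit is typically done with a thread pool, since the work is network-bound. A minimal sketch — the function name and the injectable `fetch` callable are illustrative, not the repository's actual code:

```python
from concurrent.futures import ThreadPoolExecutor

def download_all(assets, fetch, max_workers=4):
    """Download all assets concurrently, returning results in input order.

    `fetch` is whatever downloads a single asset; pool.map preserves the
    order of the input iterable regardless of completion order.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch, assets))
```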
Christoph Reiter
64a3c0b94e CI: fall back to GITHUB_TOKEN if GITHUBTOKENREADONLY is empty
Fixes #32
2021-08-22 18:36:21 +02:00
Christoph Reiter
842072fe55 Add all dependencies to the temp repo in one go
This should be faster since the pacman DB needs to be rebuilt only once
2021-08-22 18:29:01 +02:00
Christoph Reiter
d37effda22 Use the dependency information from the API
Instead of hardcoding that mingw can depend on msys etc.
2021-08-22 18:11:46 +02:00
Christoph Reiter
975e479034 Port to new buildqueue API
No functional change
2021-08-22 17:50:18 +02:00
Christoph Reiter
b4c259019b Install the VCS group for jobs building source packages 2021-08-20 09:33:50 +02:00
Christoph Reiter
d028d3acbd fetch-assets: add --delete option to clear targetdir of unwanted files
This removes all files we no longer need from the target, while keeping
files where the mtime and size match, and won't re-download them.

This is useful for keeping a directory in sync via a cron job for example.
2021-08-11 21:06:08 +02:00
Christoph Reiter
edf78a3862 Update deps 2021-08-11 20:14:40 +02:00
Christoph Reiter
51666786b6 allow src.tar.zst again 2021-07-25 09:03:16 +02:00
Christoph Reiter
be1f0f71e0 thinko 2021-07-17 21:31:02 +02:00
Christoph Reiter
7f481fdb1a Simplify build order selection
It was selecting the middle one first and then filtering the list
to find the right build type. In many cases this resulted in something
other than the middle one being built in the end.
2021-07-17 21:16:59 +02:00
Christoph Reiter
71e061429e Add back the third job 2021-07-17 19:55:27 +02:00
Christoph Reiter
e2279e671a fetch-assets: warn in case the mtime of the existing file is different 2021-07-17 19:50:12 +02:00
Christoph Reiter
41f566d371 When downloading an asset preserve the mtime
We take the update_date of the release asset as the mtime.
2021-07-17 19:46:19 +02:00
Christoph Reiter
082e6ba927 Move a function to its only user 2021-07-17 19:06:40 +02:00
Christoph Reiter
8618aa349c Create the referenced releases automatically 2021-07-17 19:04:33 +02:00
Christoph Reiter
ae604e8cac Add some more type annotations 2021-07-17 18:45:56 +02:00
Christoph Reiter
e45ceae224 Move some global state into a function 2021-07-17 18:30:50 +02:00
Christoph Reiter
32d83dcdad Update deps 2021-07-17 18:26:28 +02:00
Christoph Reiter
788340e1bb Add libgda to broken packages
I tried to fix it but gave up. It segfaults at some point during the build.
2021-07-14 20:28:28 +02:00
Christoph Reiter
235648ed1b Remove third job again
We are hitting the api rate limit, so this doesn't add much
2021-07-13 19:04:41 +02:00
Christoph Reiter
44337498b1 use a separate limit for read-only requests
it's the main bottleneck now, and there are not many write requests,
so keep them separate.
2021-07-13 06:07:17 +02:00
Christoph Reiter
5eb08f94cd
bump the min remaining requests a bit
with that many jobs we are hitting the rate limit too often
2021-07-13 05:32:28 +02:00
Christoph Reiter
37d15cdc42 Print the build config 2021-07-12 20:57:34 +02:00
Christoph Reiter
d3fa21febc Try to add a third build job 2021-07-12 20:51:48 +02:00
Christoph Reiter
246029f842
Add jeremyd2019 to uploaders 2021-07-11 08:29:41 +02:00
Jeremy Drake
eef874b68e Make msys2 config/installs idempotent.
To guard against the case that we may be reusing an msys2 install.

Also, increase job timeout.  According to Github docs, jobs on runners
they host are limited to 6 hours (which is also the default job limit),
but self-hosted runners can run up to the maximum runtime for a
workflow, 72 hours (4320 minutes)
2021-07-07 08:03:02 +02:00
Jeremy Drake
f54ad41f4e Add -R/--repo parameter for which repo to use.
Inspired by Github CLI's -R/--repo option.
2021-07-07 08:01:23 +02:00
Jeremy Drake
de1083d03e Windows ARM64 self-hosted runner support.
Workflow improvements to support a self-hosted runner on Windows ARM64.
The biggest issue is that the currently released versions of Windows on
ARM64 do not support x64 emulation, only x86.

Add a comment to setup-msys2 action that it would need to install a
32-bit msys for current released Windows ARM64.
2021-07-05 20:08:45 +02:00
Christoph Reiter
78b3da8727 fix 2021-07-02 23:40:02 +02:00
Christoph Reiter
a55b4f0bfd Try to filter out built types not in MINGW_ARCH_LIST 2021-07-02 22:12:37 +02:00
Christoph Reiter
58b4e7747c Make it possible to change MINGW_SRC_ARCH
The job schedule needs to take this into account
2021-07-02 22:01:05 +02:00
Christoph Reiter
7a1e258101 Clean up IGNORE_RDEP_PACKAGES
We now ignore rdeps for packages that aren't in the repo already
so these are no longer needed.
2021-07-02 21:55:52 +02:00
Christoph Reiter
219634574f Move everything config related to one place 2021-07-02 21:54:13 +02:00
Christoph Reiter
7ca0610513 Rework the upload-assets command
Instead of requiring a package name just pass a directory and match all
packages.

Also make the package patterns more strict so .sig files don't match.

This makes uploading a directory full of packages easier.
2021-06-30 00:30:04 +02:00
Christoph Reiter
d8c110587c
typo 2021-06-25 16:35:45 +02:00
Christoph Reiter
9e3bd5306d Don't pass CI related env vars to build scripts
Fixes #30
2021-06-25 10:36:02 +02:00
Christoph Reiter
3c86ba12f9 Revert "mark clang32 as wip for now"
This reverts commit 4a2b2ad7b0919b26b488235d956a314dc357395f.
2021-06-24 08:38:41 +02:00
Christoph Reiter
a7489361f5 If a package is marked incomplete it also needs to block rdeps
This just moves the check into the loop that checks deps until there are no more changes.
2021-06-24 08:14:52 +02:00
Christoph Reiter
1690ff155c Update deps 2021-06-24 08:03:35 +02:00
Christoph Reiter
4a2b2ad7b0 mark clang32 as wip for now 2021-06-24 06:42:50 +02:00
Christoph Reiter
d20d37a631
handle jobs named clang32-2 2021-06-18 07:31:25 +02:00
Christoph Reiter
676af4c2d7
Add mingw-w64-kirigami2-qt5 to IGNORE_RDEP_PACKAGES 2021-06-16 15:12:07 +02:00
Christoph Reiter
16f20bf2ec Don't block on clangarm64 if it's a reverse dep for msys packages 2021-05-23 14:12:49 +02:00
Christoph Reiter
cf48da40ba Trigger a website update after updating the build status file
And use the same request timeout everywhere
2021-05-16 20:10:44 +02:00
Christoph Reiter
d4e9a3a4b1 don't replace the status of blocked packages if the src is the only uploadable 2021-05-16 05:48:39 +02:00
Christoph Reiter
b7465338f6 fetch-assets: also include incomplete builds with --fetch-all
We want them to end up in staging still so they get some testing.
2021-05-15 11:46:20 +02:00
Christoph Reiter
f6d048f250 Explicitly block lone source packages
Otherwise we try to upload them every time because we have no way to track them.
2021-05-14 17:48:07 +02:00
Christoph Reiter
a3fb4f818f Keep ignoring clangarm64
for example sphinx is blocking other packages, but we can't easily build it and unblock things
2021-05-14 17:11:41 +02:00
Christoph Reiter
30bf4b08b4 Update deps 2021-05-14 13:32:51 +02:00
Christoph Reiter
d50c183681 Take into account if the package to build is new or not
Instead of being more lax with newer repos in general, use the fact that
a package is new to decide when to block uploads.
2021-05-14 13:20:17 +02:00
Christoph Reiter
ea7ae5138e Remove GITHUB_USER/PASS support
afaik this got deprecated
2021-05-14 08:54:24 +02:00
Christoph Reiter
c0cea6bff9 Mark clangarm64 as WIP 2021-05-14 08:52:31 +02:00
Christoph Reiter
e13dda0bb9 Never let source packages get blocked by other things 2021-05-14 08:51:55 +02:00
Christoph Reiter
4d99bee231 Add clangarm64 2021-05-13 16:47:26 +02:00
Christoph Reiter
4ba4930f7e Store multiple URLs per failed run
So we can link to multiple pages
2021-05-01 15:54:38 +02:00
Christoph Reiter
99330be9d6 GHA doesn't give us the real run name, so set it manually 2021-05-01 15:19:59 +02:00
Christoph Reiter
fba4a9e16e Add a command for clearing the failed state of one build type
This is useful for mass rebuilds of one build type
2021-05-01 14:22:39 +02:00
Christoph Reiter
478184ad37 Try to improve the error log url
github makes it hard to get the url of the currently running job.
I think this should work, at least in our setup
2021-05-01 14:11:37 +02:00
Christoph Reiter
cbff3ed167 update_status: handle requests exceptions as well
like in other places. they leak through..
2021-04-26 19:47:21 +02:00
Christoph Reiter
41e990ace1 Install mingw-w64-clang-i686-toolchain after adding the repo 2021-04-25 16:52:57 +02:00
Christoph Reiter
6938d8b09d Add clang32 repo
we don't have it in pacman for now
2021-04-25 15:22:45 +02:00
Christoph Reiter
dad24d4aef Add clang32 2021-04-25 14:50:25 +02:00
Christoph Reiter
e779c2595f CI: enable write permissions for the build job 2021-04-24 18:06:40 +02:00
Christoph Reiter
41ce6dcf6f Add clang64 2021-04-23 16:45:08 +02:00
Christoph Reiter
12703c6cd3 CI: reduce cron time to every 4 hours
The high poll rate was mostly there to avoid pauses if there are many things to build.

Since jobs now get queued we can poll at the rate at which the jobs hit their
soft limit and stop beginning new builds. If there are more builds left then
the next queued workflow will start right away now.
2021-04-21 15:25:03 +02:00
Christoph Reiter
a609a9d398 Try to disable concurrency for the workflow
Ideally this means we always have a pending workflow when the old one
didn't build everything.
2021-04-20 17:04:20 +02:00
Christoph Reiter
8ebdca930c
MINGW_INSTALLS -> MINGW_ARCH 2021-04-12 08:25:32 +02:00
Christoph Reiter
67d78714f7 wait_for_api_limit_reset: handle both read and write tokens
We want to wait if either of them reaches the limit
2021-04-11 16:26:14 +02:00
Christoph Reiter
3f115655b3 Use a read-only PAT for read only operations
We now use a dummy PAT for read-only requests and the GHA token
for any write operations. This should give us 5x more read-only
API calls per hour.
2021-04-11 16:10:58 +02:00
Christoph Reiter
177fa71ff2 pygithub leaks requests exceptions 2021-04-11 15:29:48 +02:00
Christoph Reiter
79ab25aa7d pygithub: try to retry requests
We get a lot of errors recently
2021-04-11 14:37:57 +02:00
Christoph Reiter
251a70c6d0 Ignore ucrt64 when deciding which packages are ready
So it doesn't block everything
2021-04-05 11:24:10 +02:00
Christoph Reiter
0ba23c7f0a Fix upload without replacement
I missed that case with the last API call reduction change
2021-04-04 09:31:34 +02:00
Christoph Reiter
79a11d4c1d
Reduce API limit to 50 and flush stdout
50 should be enough, and maybe flushing makes the message show in the logs
2021-04-04 09:14:54 +02:00
Christoph Reiter
ffebe82072 More debug output for the rate limit waiting 2021-04-03 15:46:01 +02:00
Christoph Reiter
4e74dcd802 Wait when writing the build plan too 2021-04-03 11:20:46 +02:00
Christoph Reiter
3b01ae2d7a Sleep in case the remaining api calls fall below 100
Before we start a build and before we want to upload
2021-04-03 11:03:56 +02:00
Christoph Reiter
1be021c37c
bump the limit for additional jobs a bit
otherwise we trigger api rate limits too easily
2021-04-02 21:57:55 +02:00
Christoph Reiter
99ee497121 Ignore errors during the status update
In case multiple jobs want to replace the same asset things get racy,
so just ignore errors for now.
2021-04-02 15:06:34 +02:00
Christoph Reiter
5e435f16c5 Print the api rate limit status from time to time 2021-04-02 14:35:25 +02:00
Christoph Reiter
1cf6bcd510 Support more than one job per build type
The second one just starts from the end, so there is less chance of
them building the same thing.
2021-04-02 14:32:16 +02:00
Christoph Reiter
535a1cb670 Avoid fetching assets when uploading in the common case
In case the upload just works we don't have to look for things to delete first.
Saves some API calls..
2021-04-02 14:10:07 +02:00
Christoph Reiter
e45ba0dde5 Only build a source package if we need to
In case the repo already contains a package with the same base and version
we can assume that the source package is already in the repo as well.
2021-04-02 11:15:42 +02:00
Christoph Reiter
0d948c349d Fix one case where an incomplete package could still be moved to staging 2021-03-31 21:37:58 +02:00
Christoph Reiter
94e8b7f8d3 In case there are only blocked related builds block all instead of marking as incomplete
Instead of marking them incomplete, which would prevent them from reaching staging.
2021-03-31 10:17:10 +02:00
Christoph Reiter
533127815b Add ucrt64 support 2021-03-25 17:38:48 +01:00
Christoph Reiter
3e10bb5f32 Handle the blocking recursively
oh well...
2021-03-14 08:53:39 +01:00
Christoph Reiter
1d87fa448c Blocked and finished also needs to block all deps/rdeps
Fixes #27
2021-03-14 08:24:49 +01:00
Christoph Reiter
2e47253d7c
Ignore mingw-w64-tolua for now 2021-03-13 18:53:08 +01:00
Christoph Reiter
f16726abdc fetch-validpgpkeys: source before set -e 2021-02-22 20:02:10 +01:00
Christoph Reiter
ed06d345cd
Skip mingw-w64-arm-none-eabi-gcc
Takes too long
2021-02-22 09:27:08 +01:00
Christoph Reiter
18d1bd382d Fetch validpgpkeys before running makepkg
auto retrieve only allows one keyserver and that is flaky; also
not all signatures include the full-length key.
2021-02-21 13:21:00 +01:00
Christoph Reiter
571bdbec92 thinko 2021-02-08 19:31:21 +01:00
Christoph Reiter
73b6385940 Spawn one job for each build type 2021-02-08 19:25:12 +01:00
Christoph Reiter
9ae54273b1 Dynamically create a build matrix
at least try to
2021-02-08 18:35:05 +01:00
Christoph Reiter
231d8b7214 Pin the wheel version as well
might make things faster
2021-02-08 12:16:06 +01:00
Christoph Reiter
a6df7c30a3 CI: don't include hashes in requirements.txt
maybe it fixes the wheel caching
2021-02-08 10:48:38 +01:00
Christoph Reiter
efdb55d12f Revert "try to update pip"
This reverts commit ef7d26c3d931b02a33373e5708db7540ea4f0780.
2021-02-08 10:48:03 +01:00
Christoph Reiter
ef7d26c3d9 try to update pip 2021-02-08 10:26:38 +01:00
Christoph Reiter
b489f676fe Move some common tasks back to the main job
We want to split things up and this just leads to too many API calls
2021-02-08 09:49:42 +01:00
Christoph Reiter
a149e02742 Split the build into two jobs
One decides if we should build, the second builds.
In the future we might spawn more build jobs in parallel.
2021-02-08 09:00:36 +01:00
Christoph Reiter
770dca45d1 Support Python 3.6 and test in CI 2021-02-07 10:49:43 +01:00
Christoph Reiter
171962d948 include the version in the status json 2021-02-06 15:03:41 +01:00
Christoph Reiter
cf51c634ca More typing 2021-02-05 17:52:41 +01:00
Christoph Reiter
d131e8417c No longer store split package names
We always consider all split packages for a split type now and
never one split package alone.
2021-02-05 17:46:43 +01:00
Christoph Reiter
693c8fb667 Get rid of a hack to download all related assets by making them fake deps
Instead handle this in the place where we decide what to download.
2021-02-05 17:39:22 +01:00
Christoph Reiter
63e2681784 Separate messages for missing deps and rdeps 2021-02-05 17:20:07 +01:00
Christoph Reiter
f6c0aa9068 Also handle reverse deps from msys to mingw 2021-02-05 17:13:26 +01:00
Christoph Reiter
0b88e29e87 Clean up the package exception lists 2021-02-05 16:37:03 +01:00
Christoph Reiter
dd5df302c9 oops.. 2021-02-05 16:33:08 +01:00
Christoph Reiter
279cabaa98 Handle mingw packages depending on msys ones
This is the only cross build type dep we allow.
2021-02-05 16:19:34 +01:00
Christoph Reiter
04735f8c7f Deduplicate and sort missing deps in the description 2021-02-04 10:41:24 +01:00
Christoph Reiter
6df6f9b7ee
Merge pull request #24 from podsvirov/fixed-typo-in-sequence-diagram
Fixed typo in sequence diagram
2021-02-03 19:46:15 +01:00
Konstantin Podsvirov
c0ece6b0ee Fixed typo in sequence diagram 2021-02-03 21:39:09 +03:00
Christoph Reiter
bd3d4d38c4 Remove error when building the same thing twice
This was to prevent bugs leading to loops, but also prevents
rebuilds in case of flaky builds that are manually reset.

Things seem stable enough now, so remove it.
2021-01-31 13:51:55 +01:00
Christoph Reiter
2ce31a81f6 Update the build status before/after each build
Instead of before and after the CI job. We already have most of the information
ready in that case, so we can do it more often without hitting the GH API too much.

This gives us faster status updates on the website.
2021-01-30 20:00:05 +01:00
Christoph Reiter
4eaa5d3941
try blender again 2021-01-24 17:00:45 +01:00
Christoph Reiter
87d7916308 Always build source packages in CI
They should always work and not hit any limits
2021-01-24 13:33:13 +01:00
Christoph Reiter
4daf82d87f Add an upload-assets command
Example:

* build package
* ./autobuild.py upload-assets mingw-w64-blender

It will automatically look for the right files in the current directory (or in a
directory specified via --path) and upload them. Use --dry-run to see what gets uploaded first.

Our CI can't handle all packages since some reach the build time limit
and some toolchain packages have cyclic dependencies that make DLL version
bumps impossible without locally copying DLLs around.

For this maintainers can now upload the missing packages and unblock CI.
Up until now only GHA was allowed to upload. This gets replaced by a list of allowed
uploaders. It doesn't add much security, but better than nothing.
2021-01-23 21:11:02 +01:00
Christoph Reiter
4ccc958f85 Fix finished check for blocking
We only want to know if they are finished, not if they can be downloaded.
2021-01-23 20:48:21 +01:00
Christoph Reiter
b6111084a4 Add a shebang 2021-01-23 19:56:29 +01:00
Christoph Reiter
26d91ee4c2 Add instructions for how to install dependencies via pacman
Fixes #23
2021-01-23 19:42:15 +01:00
Christoph Reiter
d428412330 typo 2021-01-22 19:48:02 +01:00
Christoph Reiter
3a2c7dbccf Update the status file after we are done building as well
This is the point where most likely something has changed.
2021-01-22 19:39:25 +01:00
Christoph Reiter
8360a63dea update-status: print some success message 2021-01-22 17:28:01 +01:00
Christoph Reiter
294a27a650 Create a status.json file on each run
This can be used on packages.msys2.org
2021-01-22 17:22:47 +01:00
Christoph Reiter
2c024794af Fix 2021-01-22 16:31:00 +01:00
Christoph Reiter
584cea762b Rework everything
It now calculates the status of all builds upfront and then just decides
what to do about it.

This means we can potentially dump the status to a json file and re-use
it in other places.
2021-01-22 16:13:14 +01:00
35 changed files with 3178 additions and 1321 deletions

4
.flake8 Normal file

@@ -0,0 +1,4 @@
[flake8]
max-line-length = 110
exclude =
.venv/


@@ -2,104 +2,206 @@ name: 'build'
on:
workflow_dispatch:
inputs:
optional_deps:
description: 'optional_deps=pkg-A:optional-dep-B,pkg-C:optional-dep-D'
default: ''
required: false
type: string
context:
description: 'Extra information from invoker'
default: ''
required: false
type: string
schedule:
- cron: '0 0/2 * * *'
- cron: '0 0/3 * * *'
env:
CI: true
PY_COLORS: 1
PYTHONUNBUFFERED: 1
permissions: {}
jobs:
build:
runs-on: windows-latest
schedule:
runs-on: ubuntu-24.04
permissions:
contents: write
concurrency: autobuild-maint
outputs:
build-plan: ${{ steps.check.outputs.build-plan }}
steps:
- uses: actions/checkout@v2
- name: Dump inputs
if: ${{ github.event_name == 'workflow_dispatch' }}
env:
CONTEXT: '${{ toJSON(github.event.inputs) }}'
run: |
echo "$CONTEXT"
- uses: actions/setup-python@v2
- uses: actions/checkout@v5
with:
python-version: '3.8'
persist-credentials: false
- uses: actions/cache@v2
- uses: actions/setup-python@v5
with:
path: ~\AppData\Local\pip\Cache
key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements.txt') }}
restore-keys: |
${{ runner.os }}-pip-
python-version: '3.13'
cache: 'pip'
cache-dependency-path: 'requirements.txt'
- name: Install deps
env:
PIP_DISABLE_PIP_VERSION_CHECK: 1
run: |
python -m pip install --user wheel
python -m pip install --user -r requirements.txt
python -m venv .venv
source .venv/bin/activate
python -m pip install -r requirements.txt
echo "$VIRTUAL_ENV/bin" >> $GITHUB_PATH
- name: Check if we should run
- name: autobuild cache
uses: actions/cache@v4
with:
path: ${{ github.workspace }}/.autobuild_cache
key: autobuild_cache-${{ github.job }}-${{ github.run_id }}-${{ github.run_attempt }}
restore-keys: autobuild_cache-
- name: Check what we should run
id: check
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN_READONLY: ${{ secrets.GITHUBTOKENREADONLY }}
OPTIONAL_DEPS: ${{ github.event.inputs.optional_deps }}
run: |
python autobuild.py should-run
$skipBuild = ($LASTEXITCODE -ne 0)
If ($skipBuild) {echo '::set-output name=skip-build::true'}
exit 0
python -m msys2_autobuild write-build-plan --optional-deps "$OPTIONAL_DEPS" build_plan.json
buildPlan="$(cat build_plan.json)"
echo "build-plan=$buildPlan" >> $GITHUB_OUTPUT
- name: Clean up assets
if: steps.check.outputs.skip-build != 'true'
if: steps.check.outputs.build-plan != '[]'
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN_READONLY: ${{ secrets.GITHUBTOKENREADONLY }}
run: |
python autobuild.py clean-assets
python -m msys2_autobuild clean-assets
- name: Show build queue
if: steps.check.outputs.skip-build != 'true'
if: steps.check.outputs.build-plan != '[]'
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN_READONLY: ${{ secrets.GITHUBTOKENREADONLY }}
OPTIONAL_DEPS: ${{ github.event.inputs.optional_deps }}
run: |
python autobuild.py show
python -m msys2_autobuild show --optional-deps "$OPTIONAL_DEPS"
- uses: msys2/setup-msys2@v2
if: steps.check.outputs.skip-build != 'true'
build:
timeout-minutes: 4320
needs: schedule
permissions:
contents: write
concurrency: autobuild-build-${{ matrix.name }}
if: ${{ needs.schedule.outputs.build-plan != '[]' }}
strategy:
fail-fast: false
matrix:
include: ${{ fromJson(needs.schedule.outputs.build-plan) }}
name: ${{ matrix.name }}
runs-on: ${{ matrix.runner }}
steps:
- name: Configure Pagefile
if: ${{ matrix.hosted }}
# https://github.com/al-cheb/configure-pagefile-action/issues/16
continue-on-error: true
uses: al-cheb/configure-pagefile-action@a3b6ebd6b634da88790d9c58d4b37a7f4a7b8708
with:
minimum-size: 4GB
maximum-size: 16GB
disk-root: "C:"
- name: Runner details
run: |
Get-PSDrive -PSProvider FileSystem
Get-CIMInstance -Class Win32_Processor | Select-Object -Property Name
- uses: actions/checkout@v5
with:
persist-credentials: false
- uses: actions/setup-python@v5
id: python
with:
python-version: '3.13'
# Avoid it setting CMake/pkg-config variables
# https://github.com/actions/setup-python/blob/main/docs/advanced-usage.md#environment-variables
update-environment: false
# Work around https://github.com/actions/setup-python/issues/1050
- name: Cache pip dependencies
uses: actions/cache@v4
with:
path: ~\AppData\Local\pip\Cache
key: ${{ runner.os }}-${{ runner.arch }}-pip-${{ hashFiles('requirements.txt') }}
restore-keys: |
${{ runner.os }}-${{ runner.arch }}-pip-
- name: Install deps
env:
PIP_DISABLE_PIP_VERSION_CHECK: 1
PYTHON_PATH: ${{ steps.python.outputs.python-path }}
run: |
& "$env:PYTHON_PATH" -m venv .venv
.\.venv\Scripts\activate
python -m pip install -r requirements.txt
echo "$env:VIRTUAL_ENV\Scripts" | Out-File -FilePath $env:GITHUB_PATH -Encoding utf8 -Append
- name: autobuild cache
uses: actions/cache@v4
with:
path: ${{ github.workspace }}/.autobuild_cache
key: autobuild_cache-${{ github.job }}-${{ github.run_id }}-${{ github.run_attempt }}
restore-keys: autobuild_cache-
# Note that ARM64 prior to Win11 requires x86 msys, but this will install x64
- uses: msys2/setup-msys2@v2 # zizmor: ignore[unpinned-uses]
id: msys2
with:
msystem: MSYS
update: true
install: msys2-devel base-devel mingw-w64-x86_64-toolchain mingw-w64-i686-toolchain git
install: ${{ matrix.packages }}
location: '\M'
release: ${{ matrix.hosted }}
cache: ${{ matrix.hosted }}
- name: Switch to the main mirror
if: steps.check.outputs.skip-build != 'true'
shell: msys2 {0}
run: |
sed -e "s|Include = /etc/pacman.d/mirrorlist.mingw32|Server = http://repo.msys2.org/mingw/i686/|g" -i /etc/pacman.conf
sed -e "s|Include = /etc/pacman.d/mirrorlist.mingw64|Server = http://repo.msys2.org/mingw/x86_64/|g" -i /etc/pacman.conf
sed -e "s|Include = /etc/pacman.d/mirrorlist.msys|Server = http://repo.msys2.org/msys/\$arch/|g" -i /etc/pacman.conf
echo 'Server = https://repo.msys2.org/mingw/$repo/' > /etc/pacman.d/mirrorlist.mingw
echo 'Server = https://repo.msys2.org/msys/$arch/' > /etc/pacman.d/mirrorlist.msys
pacman-conf.exe
- name: Update using the main mirror
if: steps.check.outputs.skip-build != 'true'
- name: Update using the main mirror & Check install
run: |
msys2 -c 'pacman --noconfirm -Suuy'
msys2 -c 'pacman --noconfirm -Suu'
- name: Check install
if: steps.check.outputs.skip-build != 'true'
run: |
msys2 -c 'pacman -Qkq'
- name: Init git
if: steps.check.outputs.skip-build != 'true'
shell: msys2 {0}
run: |
git config --global user.email 'ci@msys2.org'
git config --global user.name 'MSYS2 Continuous Integration'
- name: Process build queue
if: steps.check.outputs.skip-build != 'true'
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN_READONLY: ${{ secrets.GITHUBTOKENREADONLY }}
# https://github.com/actions/runner/issues/324#issuecomment-3324382354
# https://github.com/actions/runner/pull/4053
JOB_CHECK_RUN_ID: ${{ job.check_run_id }}
MSYS2_ROOT: ${{ steps.msys2.outputs.msys2-location }}
run: |
$env:PACKAGER='CI (msys2-autobuild/' + $env:GITHUB_SHA.Substring(0, 8) + '/' + $env:GITHUB_RUN_ID + ')'
$BUILD_ROOT='C:\_'
$MSYS2_ROOT=(msys2 -c 'cygpath -w /')
Get-PSDrive -PSProvider FileSystem
python autobuild.py build "$MSYS2_ROOT" "$BUILD_ROOT"
echo "JOB_CHECK_RUN_ID=$env:JOB_CHECK_RUN_ID"
$BUILD_ROOT=Join-Path (Split-Path $env:GITHUB_WORKSPACE -Qualifier) "\"
python -m msys2_autobuild build ${{ matrix.build-args }} "$env:MSYS2_ROOT" "$BUILD_ROOT"
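The matrix fields used above (`matrix.name`, `matrix.runner`, `matrix.packages`, `matrix.hosted`, `matrix.build-args`) come from the JSON build plan emitted by the `schedule` job. A minimal sketch of the shape such a plan could take (the field values here are illustrative assumptions, not the tool's actual output):

```python
import json

# Hypothetical build-plan entries: the workflow feeds a JSON list like this
# into matrix.include via fromJson(needs.schedule.outputs.build-plan).
build_plan = [
    {
        "name": "mingw64",                     # assumed example build type
        "runner": "windows-2022",              # assumed runner label
        "packages": "base-devel mingw-w64-x86_64-toolchain git",
        "hosted": True,
        "build-args": "--build-types mingw64",  # assumed CLI flag
    },
]

# An empty plan serializes to "[]", which is exactly what the workflow's
# `if: needs.schedule.outputs.build-plan != '[]'` condition checks for.
assert json.dumps([]) == "[]"
print(json.dumps(build_plan))
```

Writing the serialized list to `$GITHUB_OUTPUT` is what lets the `build` job spawn one parallel job per entry.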

80
.github/workflows/maint.yml vendored Normal file

@@ -0,0 +1,80 @@
name: 'maint'
on:
workflow_dispatch:
inputs:
clear_failed_packages:
description: 'clear_failed_packages=mingw-w64-foo,mingw-w64-bar'
default: ''
required: false
type: string
clear_failed_build_types:
description: 'clear_failed_build_types=mingw64,clang64'
default: ''
required: false
type: string
context:
description: 'Extra information from invoker'
default: ''
required: false
type: string
permissions: {}
concurrency: autobuild-maint
jobs:
schedule:
runs-on: ubuntu-24.04
permissions:
contents: write
steps:
- name: Dump inputs
if: ${{ github.event_name == 'workflow_dispatch' }}
env:
CONTEXT: '${{ toJSON(github.event.inputs) }}'
run: |
echo "$CONTEXT"
- uses: actions/checkout@v5
with:
persist-credentials: false
- uses: actions/setup-python@v5
with:
python-version: '3.13'
cache: 'pip'
cache-dependency-path: 'requirements.txt'
- name: Install deps
env:
PIP_DISABLE_PIP_VERSION_CHECK: 1
run: |
python -m venv .venv
source .venv/bin/activate
python -m pip install -r requirements.txt
echo "$VIRTUAL_ENV/bin" >> $GITHUB_PATH
- name: Clear failed build types
if: ${{ github.event.inputs.clear_failed_build_types != '' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN_READONLY: ${{ secrets.GITHUBTOKENREADONLY }}
CLEAR_FAILED_BUILD_TYPES: ${{ github.event.inputs.clear_failed_build_types }}
run: |
python -m msys2_autobuild clear-failed --build-types "$CLEAR_FAILED_BUILD_TYPES"
python -m msys2_autobuild update-status
- name: Clear failed packages
if: ${{ github.event.inputs.clear_failed_packages != '' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN_READONLY: ${{ secrets.GITHUBTOKENREADONLY }}
CLEAR_FAILED_PACKAGES: ${{ github.event.inputs.clear_failed_packages }}
run: |
python -m msys2_autobuild clear-failed --packages "$CLEAR_FAILED_PACKAGES"
python -m msys2_autobuild update-status

61
.github/workflows/test.yml vendored Normal file

@@ -0,0 +1,61 @@
name: test
on: [push, pull_request]
permissions:
contents: read
jobs:
test:
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
os: [ubuntu-24.04, windows-2022, windows-11-arm]
python-version: ['3.12', '3.13']
steps:
- uses: actions/checkout@v5
with:
persist-credentials: false
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
python -m pip install poetry
python -m poetry install
- name: Run mypy
run: |
python -m poetry run mypy .
- name: Run flake8
run: |
python -m poetry run flake8 .
- name: Run tests
run: |
python -m poetry run pytest
zizmor:
runs-on: ubuntu-24.04
permissions:
contents: read
security-events: write
steps:
- name: Checkout repository
uses: actions/checkout@v5
with:
persist-credentials: false
- name: Run zizmor
run: pipx run zizmor .
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}

3
.gitignore vendored

@@ -1,3 +1,4 @@
*.pyc
.vscode/
.mypy_cache/
.mypy_cache/
.autobuild_cache/

2
.mypy.ini Normal file

@@ -0,0 +1,2 @@
[mypy]
ignore_missing_imports = True


@@ -1,60 +1,50 @@
# msys2-autobuild
## autobuild.py
msys2-autobuild is a Python tool for
* automatically building MSYS2 packages in GitHub Actions
* manually uploading packages, or retrying builds
* retrieving the built packages for upload to the pacman repo
## Installation
```console
$ python -m pip install --user -r requirements.txt
$ pacman -S mingw-w64-x86_64-python-tabulate mingw-w64-x86_64-python-pygithub mingw-w64-x86_64-python-requests
# or
$ poetry install
# or
$ python -m pip install --user -r requirements.txt
# or
$ pipx install git+https://github.com/msys2/msys2-autobuild
```
## Usage
```console
$ python autobuild.py --help
usage: autobuild.py [-h] {build,show,should-run,fetch-assets,clean-assets} ...
$ msys2-autobuild --help
usage: msys2-autobuild [-h]
{build,show,write-build-plan,update-status,fetch-assets,upload-assets,clear-failed,clean-assets}
...
Build packages
optional arguments:
options:
-h, --help show this help message and exit
subcommands:
{build,show,should-run,fetch-assets,clean-assets}
{build,show,write-build-plan,update-status,fetch-assets,upload-assets,clear-failed,clean-assets}
build Build all packages
show Show all packages to be built
should-run Fails if the workflow shouldn't run
write-build-plan Write a GHA build matrix setup
update-status Update the status file
fetch-assets Download all staging packages
upload-assets Upload packages
clear-failed Clear the failed state for packages
clean-assets Clean up GHA assets
```
## Automated Build Process
## Configuration
The following graph shows what happens between a PKGBUILD getting changed in git
and the built package being available in the pacman repo.
![sequence](./docs/sequence.svg)
### Security Considerations
Assuming changes to PKGBUILDs are properly reviewed, the pacman signature
checking works, the upstream source is OK, and all MSYS2 organization members are
trusted, we need to consider a bad actor controlling some part of the build
process between the PKGBUILD getting changed and the package ending up signed in
the pacman repo.
A bad actor would need to get a package onto the machine of the developer who signs
the package and adds it to the pacman repo. We take the following precautions:
* We only build packages automatically with GitHub Actions without third party
actions, excluding the official GitHub ones. We assume the GHA images and
official actions are safe.
* The download tool used by the person signing the package checks that the
binaries were uploaded by GHA, so uploading a package with a personal
account leads to an error. To get around that, someone would need to push a
workflow change to the repo which gets run and uploads a package to the
release assets. We assume the bad actor doesn't have git push rights.
* Packages too large for GHA get built/signed by MSYS2 developers on their
machines. We assume the developer machines are safe.
* We enforce 2FA for the MSYS2 organization to make account takeovers of
existing MSYS2 developers harder.
Feedback and ideas on how to improve this are welcome.
* `GITHUB_TOKEN` (required) - a GitHub token with write access to the current repo.
* `GITHUB_TOKEN_READONLY` (optional) - a GitHub token with read access to the current repo. This is used for read operations to not get limited by the API access limits.
* `GITHUB_REPOSITORY` (optional) - the path to the GitHub repo this is uploading to. Used for deciding which things can be built and where to upload them to. Defaults to `msys2/msys2-autobuild`.
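Read together, these variables could be consumed roughly as follows (a minimal sketch; `get_config` is a hypothetical helper for illustration, not part of msys2-autobuild):

```python
import os

def get_config(environ=os.environ):
    """Collect the configuration described above, applying the documented
    defaults. Hypothetical helper, shown only to illustrate the variables."""
    token = environ.get("GITHUB_TOKEN")
    if not token:
        raise SystemExit("ERROR: GITHUB_TOKEN is required")
    return {
        "token": token,
        # Fall back to the write token if no read-only token is set
        "token_readonly": environ.get("GITHUB_TOKEN_READONLY", token),
        # Default matches the repo this tool uploads to
        "repository": environ.get("GITHUB_REPOSITORY", "msys2/msys2-autobuild"),
    }
```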


@@ -1,892 +0,0 @@
import sys
import os
import argparse
from os import environ
from github import Github
from github.GitRelease import GitRelease
from github.GitReleaseAsset import GitReleaseAsset
from github.Repository import Repository
from pathlib import Path, PurePosixPath, PurePath
from subprocess import check_call
import subprocess
from sys import stdout
import fnmatch
import traceback
from tabulate import tabulate
from concurrent.futures import ThreadPoolExecutor
from contextlib import contextmanager
import requests
import shlex
import time
import tempfile
import shutil
import json
from hashlib import sha256
from typing import Generator, Union, AnyStr, List, Any, Dict, Tuple, Set, Sequence, \
Collection, Optional
_PathLike = Union[os.PathLike, AnyStr]
class _Package(dict):
def __repr__(self):
return "Package(%r)" % self["name"]
def __hash__(self):
return id(self)
def __eq__(self, other):
return self is other
def get_build_patterns(self, build_type: str) -> List[str]:
patterns = []
if build_type in ["mingw-src", "msys-src"]:
patterns.append(f"{self['name']}-{self['version']}.src.tar.*")
elif build_type in ["mingw32", "mingw64", "msys"]:
for item in self['packages'].get(build_type, []):
patterns.append(f"{item}-{self['version']}-*.pkg.tar.*")
else:
assert 0
return patterns
def get_failed_names(self, build_type: str) -> List[str]:
names = []
if build_type in ["mingw-src", "msys-src"]:
names.append(f"{self['name']}-{self['version']}.failed")
elif build_type in ["mingw32", "mingw64", "msys"]:
for item in self['packages'].get(build_type, []):
names.append(f"{item}-{self['version']}.failed")
else:
assert 0
return names
def get_build_types(self) -> List[str]:
build_types = list(self["packages"].keys())
if any(k.startswith("mingw") for k in self["packages"].keys()):
build_types.append("mingw-src")
if "msys" in self["packages"].keys():
build_types.append("msys-src")
return build_types
def get_repo_type(self) -> str:
return "msys" if self['repo'].startswith('MSYS2') else "mingw"
# After which we shouldn't start a new build
SOFT_TIMEOUT = 60 * 60 * 3
# Packages that take too long to build, and should be handled manually
SKIP: List[str] = [
# 'mingw-w64-clang',
# 'mingw-w64-arm-none-eabi-gcc',
# 'mingw-w64-gcc',
'mingw-w64-gcc-git',
'mingw-w64-firebird-git',
'mingw-w64-qt5-static',
'mingw-w64-blender',
]
# FIXME: Packages that should be ignored if they depend on other things
# in the queue. Ideally this list should be empty..
IGNORE_RDEP_PACKAGES: List[str] = [
"mingw-w64-vrpn",
"mingw-w64-cocos2d-x",
"mingw-w64-mlpack",
"mingw-w64-qemu",
"mingw-w64-ghc",
"mingw-w64-python-notebook",
"mingw-w64-python-pywin32",
"mingw-w64-usbmuxd",
"mingw-w64-ldns",
"mingw-w64-npm",
"mingw-w64-yarn",
"mingw-w64-bower",
"mingw-w64-nodejs",
"mingw-w64-cross-conemu-git",
"mingw-w64-blender",
"mingw-w64-godot-cpp",
]
REPO = "msys2/msys2-autobuild"
def get_current_run_url() -> Optional[str]:
if "GITHUB_RUN_ID" in os.environ and "GITHUB_REPOSITORY" in os.environ:
run_id = os.environ["GITHUB_RUN_ID"]
repo = os.environ["GITHUB_REPOSITORY"]
return f"https://github.com/{repo}/actions/runs/{run_id}"
return None
def run_cmd(msys2_root: _PathLike, args, **kwargs):
executable = os.path.join(msys2_root, 'usr', 'bin', 'bash.exe')
env = kwargs.pop("env", os.environ.copy())
env["CHERE_INVOKING"] = "1"
env["MSYSTEM"] = "MSYS"
env["MSYS2_PATH_TYPE"] = "minimal"
check_call([executable, '-lc'] + [shlex.join([str(a) for a in args])], env=env, **kwargs)
@contextmanager
def fresh_git_repo(url: str, path: _PathLike) -> Generator:
if not os.path.exists(path):
check_call(["git", "clone", url, path])
else:
check_call(["git", "fetch", "origin"], cwd=path)
check_call(["git", "reset", "--hard", "origin/master"], cwd=path)
try:
yield
finally:
assert os.path.exists(path)
try:
check_call(["git", "clean", "-xfdf"], cwd=path)
except subprocess.CalledProcessError:
# sometimes it fails right after the build has failed
# not sure why
pass
check_call(["git", "reset", "--hard", "HEAD"], cwd=path)
@contextmanager
def gha_group(title: str) -> Generator:
print(f'\n::group::{title}')
stdout.flush()
try:
yield
finally:
print('::endgroup::')
stdout.flush()
class BuildError(Exception):
pass
def asset_is_complete(asset: GitReleaseAsset) -> bool:
# assets can stay around in a weird incomplete state
# in which case asset.state == "starter". GitHub shows
# them with a red warning sign in the edit UI.
return asset.state == "uploaded"
def download_asset(asset: GitReleaseAsset, target_path: str) -> None:
assert asset_is_complete(asset)
with requests.get(asset.browser_download_url, stream=True, timeout=(15, 30)) as r:
r.raise_for_status()
fd, temppath = tempfile.mkstemp()
try:
os.chmod(temppath, 0o644)
with os.fdopen(fd, "wb") as h:
for chunk in r.iter_content(4096):
h.write(chunk)
shutil.move(temppath, target_path)
finally:
try:
os.remove(temppath)
except OSError:
pass
def upload_asset(release: GitRelease, path: _PathLike, replace: bool = False,
text: bool = False) -> None:
# type_: msys/mingw/failed
if not environ.get("CI"):
print("WARNING: upload skipped, not running in CI")
return
path = Path(path)
basename = os.path.basename(str(path))
asset_name = get_gh_asset_name(basename, text)
asset_label = basename
for asset in get_release_assets(release, include_incomplete=True):
if asset_name == asset.name:
# We want to treat incomplete assets as if they weren't there
# so replace them always
if replace or not asset_is_complete(asset):
asset.delete_asset()
else:
print(f"Skipping upload for {asset_name} as {asset_label}, already exists")
return
release.upload_asset(str(path), label=asset_label, name=asset_name)
print(f"Uploaded {asset_name} as {asset_label}")
def get_python_path(msys2_root: _PathLike, msys2_path: _PathLike) -> Path:
return Path(os.path.normpath(str(msys2_root) + str(msys2_path)))
def to_pure_posix_path(path: _PathLike) -> PurePath:
return PurePosixPath("/" + str(path).replace(":", "", 1).replace("\\", "/"))
@contextmanager
def backup_pacman_conf(msys2_root: _PathLike) -> Generator:
conf = get_python_path(msys2_root, "/etc/pacman.conf")
backup = get_python_path(msys2_root, "/etc/pacman.conf.backup")
shutil.copyfile(conf, backup)
try:
yield
finally:
os.replace(backup, conf)
@contextmanager
def auto_key_retrieve(msys2_root: _PathLike) -> Generator:
home_dir = os.path.join(msys2_root, "home", environ["USERNAME"])
assert os.path.exists(home_dir)
gnupg_dir = os.path.join(home_dir, ".gnupg")
os.makedirs(gnupg_dir, exist_ok=True)
conf = os.path.join(gnupg_dir, "gpg.conf")
backup = None
if os.path.exists(conf):
backup = conf + ".backup"
shutil.copyfile(conf, backup)
try:
with open(conf, "w", encoding="utf-8") as h:
h.write("""
keyserver hkp://keys.gnupg.net
keyserver-options auto-key-retrieve
""")
yield
finally:
if backup is not None:
os.replace(backup, conf)
def build_type_to_dep_type(build_type):
if build_type == "mingw-src":
dep_type = "mingw64"
elif build_type == "msys-src":
dep_type = "msys"
else:
dep_type = build_type
return dep_type
@contextmanager
def staging_dependencies(
build_type: str, pkg: _Package, msys2_root: _PathLike,
builddir: _PathLike) -> Generator:
repo = get_repo()
def add_to_repo(repo_root, repo_type, asset):
repo_dir = Path(repo_root) / get_repo_subdir(repo_type, asset)
os.makedirs(repo_dir, exist_ok=True)
print(f"Downloading {get_asset_filename(asset)}...")
package_path = os.path.join(repo_dir, get_asset_filename(asset))
download_asset(asset, package_path)
repo_name = "autobuild-" + (
str(get_repo_subdir(repo_type, asset)).replace("/", "-").replace("\\", "-"))
repo_db_path = os.path.join(repo_dir, f"{repo_name}.db.tar.gz")
conf = get_python_path(msys2_root, "/etc/pacman.conf")
with open(conf, "r", encoding="utf-8") as h:
text = h.read()
uri = to_pure_posix_path(repo_dir).as_uri()
if uri not in text:
with open(conf, "w", encoding="utf-8") as h2:
h2.write(f"""[{repo_name}]
Server={uri}
SigLevel=Never
""")
h2.write(text)
run_cmd(msys2_root, ["repo-add", to_pure_posix_path(repo_db_path),
to_pure_posix_path(package_path)], cwd=repo_dir)
def get_cached_assets(
repo: Repository, release_name: str, *, _cache={}) -> List[GitReleaseAsset]:
key = (repo.full_name, release_name)
if key not in _cache:
release = repo.get_release(release_name)
_cache[key] = get_release_assets(release)
return _cache[key]
repo_root = os.path.join(builddir, "_REPO")
try:
shutil.rmtree(repo_root, ignore_errors=True)
os.makedirs(repo_root, exist_ok=True)
with backup_pacman_conf(msys2_root):
to_add = []
dep_type = build_type_to_dep_type(build_type)
for name, dep in pkg['ext-depends'].get(dep_type, {}).items():
pattern = f"{name}-{dep['version']}-*.pkg.*"
repo_type = dep.get_repo_type()
for asset in get_cached_assets(repo, "staging-" + repo_type):
if fnmatch.fnmatch(get_asset_filename(asset), pattern):
to_add.append((repo_type, asset))
break
else:
raise SystemExit(f"asset for {pattern} in {repo_type} not found")
for repo_type, asset in to_add:
add_to_repo(repo_root, repo_type, asset)
# in case they are already installed we need to upgrade
run_cmd(msys2_root, ["pacman", "--noconfirm", "-Suy"])
yield
finally:
shutil.rmtree(repo_root, ignore_errors=True)
# downgrade again
run_cmd(msys2_root, ["pacman", "--noconfirm", "-Suuy"])
def build_package(build_type: str, pkg, msys2_root: _PathLike, builddir: _PathLike) -> None:
assert os.path.isabs(builddir)
assert os.path.isabs(msys2_root)
os.makedirs(builddir, exist_ok=True)
repo_name = {"MINGW-packages": "M", "MSYS2-packages": "S"}.get(pkg['repo'], pkg['repo'])
repo_dir = os.path.join(builddir, repo_name)
to_upload: List[str] = []
repo = get_repo()
with staging_dependencies(build_type, pkg, msys2_root, builddir), \
auto_key_retrieve(msys2_root), \
fresh_git_repo(pkg['repo_url'], repo_dir):
pkg_dir = os.path.join(repo_dir, pkg['repo_path'])
try:
if build_type == "mingw-src":
env = environ.copy()
env['MINGW_INSTALLS'] = 'mingw64'
run_cmd(msys2_root, [
'makepkg-mingw',
'--noconfirm',
'--noprogressbar',
'--allsource'
], env=env, cwd=pkg_dir)
elif build_type == "msys-src":
run_cmd(msys2_root, [
'makepkg',
'--noconfirm',
'--noprogressbar',
'--allsource'
], cwd=pkg_dir)
elif build_type in ["mingw32", "mingw64"]:
env = environ.copy()
env['MINGW_INSTALLS'] = build_type
run_cmd(msys2_root, [
'makepkg-mingw',
'--noconfirm',
'--noprogressbar',
'--nocheck',
'--syncdeps',
'--rmdeps',
'--cleanbuild'
], env=env, cwd=pkg_dir)
elif build_type == "msys":
run_cmd(msys2_root, [
'makepkg',
'--noconfirm',
'--noprogressbar',
'--nocheck',
'--syncdeps',
'--rmdeps',
'--cleanbuild'
], cwd=pkg_dir)
else:
assert 0
entries = os.listdir(pkg_dir)
for pattern in pkg.get_build_patterns(build_type):
found = fnmatch.filter(entries, pattern)
if not found:
raise BuildError(f"{pattern} not found, likely wrong version built")
to_upload.extend([os.path.join(pkg_dir, e) for e in found])
except (subprocess.CalledProcessError, BuildError) as e:
release = repo.get_release("staging-failed")
for entry in pkg.get_failed_names(build_type):
with tempfile.TemporaryDirectory() as tempdir:
failed_path = os.path.join(tempdir, entry)
failed_data = {}
run_url = get_current_run_url()
if run_url is not None:
failed_data["url"] = run_url
with open(failed_path, 'w') as h:
h.write(json.dumps(failed_data))
upload_asset(release, failed_path, text=True)
raise BuildError(e)
else:
release = repo.get_release("staging-" + pkg.get_repo_type())
for path in to_upload:
upload_asset(release, path)
def run_build(args: Any) -> None:
builddir = os.path.abspath(args.builddir)
msys2_root = os.path.abspath(args.msys2_root)
start_time = time.monotonic()
if not sys.platform == "win32":
raise SystemExit("ERROR: Needs to run under native Python")
if not shutil.which("git"):
raise SystemExit("ERROR: git not in PATH")
if not os.path.isdir(msys2_root):
raise SystemExit("ERROR: msys2_root doesn't exist")
try:
run_cmd(msys2_root, [])
except Exception as e:
raise SystemExit("ERROR: msys2_root not functional", e)
done = set()
while True:
todo = get_package_to_build()
if not todo:
break
pkg, build_type = todo
key = pkg['repo'] + build_type + pkg['name'] + pkg['version']
if key in done:
raise SystemExit("ERROR: building package again in the same run", pkg)
done.add(key)
if (time.monotonic() - start_time) >= SOFT_TIMEOUT:
print("timeout reached")
break
try:
with gha_group(f"[{ pkg['repo'] }] [{ build_type }] { pkg['name'] }..."):
build_package(build_type, pkg, msys2_root, builddir)
except BuildError:
with gha_group(f"[{ pkg['repo'] }] [{ build_type }] { pkg['name'] }: failed"):
traceback.print_exc(file=sys.stdout)
continue
def get_buildqueue() -> List[_Package]:
pkgs = []
r = requests.get("https://packages.msys2.org/api/buildqueue")
r.raise_for_status()
dep_mapping = {}
for received in r.json():
pkg = _Package(received)
pkg['repo'] = pkg['repo_url'].split('/')[-1]
pkgs.append(pkg)
for repo, names in pkg['packages'].items():
for name in names:
dep_mapping[name] = pkg
# We need to pull in all packages of that particular build because they can
# depend on each other with a fixed version
for pkg in pkgs:
for repo, deps in pkg['depends'].items():
all_deps = set(deps)
for dep in deps:
dep_pkg = dep_mapping[dep]
all_deps.update(dep_pkg['packages'][repo])
pkg['depends'][repo] = sorted(all_deps)
# link up dependencies with the real package in the queue
for pkg in pkgs:
ver_depends: Dict[str, Dict[str, _Package]] = {}
for repo, deps in pkg['depends'].items():
for dep in deps:
ver_depends.setdefault(repo, {})[dep] = dep_mapping[dep]
pkg['ext-depends'] = ver_depends
# reverse dependencies
for pkg in pkgs:
r_depends: Dict[str, Set[_Package]] = {}
for pkg2 in pkgs:
for repo, deps in pkg2['ext-depends'].items():
if pkg in deps.values():
r_depends.setdefault(repo, set()).add(pkg2)
pkg['ext-rdepends'] = r_depends
return pkgs
def get_gh_asset_name(basename: _PathLike, text: bool = False) -> str:
# GitHub will throw out characters like '~' or '='. It also doesn't like
# when there is no file extension and will try to add one
return sha256(str(basename).encode("utf-8")).hexdigest() + (".bin" if not text else ".txt")
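The hashing trick above can be sketched standalone; `asset_name_for` is an illustrative name, not part of the module:

```python
import hashlib

def asset_name_for(basename: str, text: bool = False) -> str:
    # Hash the original filename so characters GitHub rejects (e.g. '~',
    # '=') never appear, and always append an extension since GitHub
    # dislikes extension-less asset names.
    digest = hashlib.sha256(basename.encode("utf-8")).hexdigest()
    return digest + (".txt" if text else ".bin")
```

Because the digest is deterministic, an uploaded asset can later be matched back to the package file it came from.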
def get_asset_filename(asset: GitReleaseAsset) -> str:
if not asset.label:
return asset.name
else:
assert os.path.splitext(get_gh_asset_name(asset.label))[0] == \
os.path.splitext(asset.name)[0]
return asset.label
def get_release_assets(release: GitRelease, include_incomplete=False) -> List[GitReleaseAsset]:
assets = []
for asset in release.get_assets():
# skip in case not fully uploaded yet (or uploading failed)
if not asset_is_complete(asset) and not include_incomplete:
continue
uploader = asset.uploader
if uploader.type != "Bot" or uploader.login != "github-actions[bot]":
raise SystemExit(f"ERROR: Asset '{get_asset_filename(asset)}' not uploaded "
f"by GHA but '{uploader.login}'. Aborting.")
assets.append(asset)
return assets
def get_packages_to_build() -> Tuple[
List[Tuple[_Package, str]], List[Tuple[_Package, str, str]],
List[Tuple[_Package, str]]]:
repo = get_repo(optional_credentials=True)
assets = []
for name in ["msys", "mingw"]:
release = repo.get_release('staging-' + name)
assets.extend([
get_asset_filename(a) for a in get_release_assets(release)])
release = repo.get_release('staging-failed')
assets_failed = [
get_asset_filename(a) for a in get_release_assets(release)]
def pkg_is_done(build_type: str, pkg: _Package) -> bool:
for pattern in pkg.get_build_patterns(build_type):
if not fnmatch.filter(assets, pattern):
return False
return True
def pkg_has_failed(build_type: str, pkg: _Package) -> bool:
for name in pkg.get_failed_names(build_type):
if name in assets_failed:
return True
return False
def pkg_is_skipped(build_type: str, pkg: _Package) -> bool:
for other, other_type, msg in skipped:
if build_type == other_type and pkg is other:
return True
return pkg['name'] in SKIP
todo = []
done = []
skipped = []
for pkg in get_buildqueue():
for build_type in pkg.get_build_types():
if pkg_is_done(build_type, pkg):
done.append((pkg, build_type))
elif pkg_has_failed(build_type, pkg):
skipped.append((pkg, build_type, "failed"))
elif pkg_is_skipped(build_type, pkg):
skipped.append((pkg, build_type, "skipped"))
else:
dep_type = build_type_to_dep_type(build_type)
for dep in pkg['ext-depends'].get(dep_type, {}).values():
if pkg_has_failed(dep_type, dep) or pkg_is_skipped(dep_type, dep):
skipped.append((pkg, build_type, "requires: " + dep['name']))
break
else:
todo.append((pkg, build_type))
return done, skipped, todo
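The triage above (done / failed / skipped / blocked-by-dependency / todo) can be condensed into a toy classifier; the data model here is simplified to plain strings and is only illustrative:

```python
def classify(pkg: str, done: set[str], failed: set[str],
             deps: dict[str, list[str]]) -> str:
    # A package whose dependency already failed is skipped with a
    # "requires:" reason, mirroring the loop above; everything else
    # that is neither built nor failed is buildable.
    if pkg in done:
        return "done"
    if pkg in failed:
        return "failed"
    for dep in deps.get(pkg, []):
        if dep in failed:
            return "requires: " + dep
    return "todo"
```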
def get_package_to_build() -> Optional[Tuple[_Package, str]]:
done, skipped, todo = get_packages_to_build()
if todo:
return todo[0]
else:
return None
def get_workflow():
workflow_name = os.environ.get("GITHUB_WORKFLOW", None)
if workflow_name is None:
raise Exception("GITHUB_WORKFLOW not set")
repo = get_repo()
for workflow in repo.get_workflows():
if workflow.name == workflow_name:
return workflow
else:
raise Exception("workflow not found:", workflow_name)
def should_run(args: Any) -> None:
current_id = None
if "GITHUB_RUN_ID" in os.environ:
current_id = int(os.environ["GITHUB_RUN_ID"])
workflow = get_workflow()
runs = list(workflow.get_runs(status="in_progress"))
runs += list(workflow.get_runs(status="queued"))
for run in runs:
if current_id is not None and current_id == run.id:
# Ignore this run itself
continue
raise SystemExit(
f"Another workflow is currently running or has something queued: {run.html_url}")
if not get_package_to_build():
raise SystemExit("Nothing to build")
def show_build(args: Any) -> None:
done, skipped, todo = get_packages_to_build()
with gha_group(f"TODO ({len(todo)})"):
print(tabulate([(p["name"], bt, p["version"]) for (p, bt) in todo],
headers=["Package", "Build", "Version"]))
with gha_group(f"SKIPPED ({len(skipped)})"):
print(tabulate([(p["name"], bt, p["version"], r) for (p, bt, r) in skipped],
headers=["Package", "Build", "Version", "Reason"]))
with gha_group(f"DONE ({len(done)})"):
print(tabulate([(p["name"], bt, p["version"]) for (p, bt) in done],
headers=["Package", "Build", "Version"]))
def get_repo_subdir(type_: str, asset: GitReleaseAsset) -> Path:
entry = get_asset_filename(asset)
t = Path(type_)
if type_ == "msys":
if fnmatch.fnmatch(entry, '*.pkg.tar.*'):
return t / "x86_64"
elif fnmatch.fnmatch(entry, '*.src.tar.*'):
return t / "sources"
else:
raise Exception("unknown file type")
elif type_ == "mingw":
if fnmatch.fnmatch(entry, '*.src.tar.*'):
return t / "sources"
elif entry.startswith("mingw-w64-x86_64-"):
return t / "x86_64"
elif entry.startswith("mingw-w64-i686-"):
return t / "i686"
else:
raise Exception("unknown file type")
else:
raise Exception("unknown type")
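A compact sketch of the routing logic above, with `subdir_for` as a hypothetical helper name:

```python
import fnmatch
from pathlib import Path

def subdir_for(repo_type: str, entry: str) -> Path:
    # Source tarballs go to "sources"; mingw binaries are routed by
    # their target prefix, msys binaries are all x86_64.
    if fnmatch.fnmatch(entry, "*.src.tar.*"):
        return Path(repo_type) / "sources"
    if repo_type == "mingw" and entry.startswith("mingw-w64-i686-"):
        return Path(repo_type) / "i686"
    return Path(repo_type) / "x86_64"
```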
def fetch_assets(args: Any) -> None:
repo = get_repo(optional_credentials=True)
pkgs = get_buildqueue()
todo = []
done = []
all_blocked = {}
for name, repo_name in [("msys", "MSYS2-packages"), ("mingw", "MINGW-packages")]:
p = Path(args.targetdir)
release = repo.get_release('staging-' + name)
release_assets = get_release_assets(release)
repo_pkgs = [p for p in pkgs if p["repo"] == repo_name]
finished_assets, blocked = get_finished_assets(
repo_pkgs, release_assets, args.fetch_all)
all_blocked.update(blocked)
for pkg, assets in finished_assets.items():
for asset in assets:
asset_dir = p / get_repo_subdir(name, asset)
asset_dir.mkdir(parents=True, exist_ok=True)
asset_path = asset_dir / get_asset_filename(asset)
if asset_path.exists():
if asset_path.stat().st_size != asset.size:
print(f"Warning: {asset_path} already exists "
f"but has a different size")
done.append(asset)
continue
todo.append((asset, asset_path))
if args.verbose and all_blocked:
import pprint
print("Packages that are blocked and why:")
pprint.pprint(all_blocked)
print(f"downloading: {len(todo)}, done: {len(done)}, "
f"blocked: {len(all_blocked)} (related builds missing)")
print("Pass --verbose to see the list of blocked packages.")
print("Pass --fetch-all to also fetch blocked packages.")
def fetch_item(item):
asset, asset_path = item
if not args.pretend:
download_asset(asset, asset_path)
return item
with ThreadPoolExecutor(8) as executor:
for i, item in enumerate(executor.map(fetch_item, todo)):
print(f"[{i + 1}/{len(todo)}] {get_asset_filename(item[0])}")
print("done")
def get_assets_to_delete(repo: Repository) -> List[GitReleaseAsset]:
print("Fetching packages to build...")
patterns = []
for pkg in get_buildqueue():
for build_type in pkg.get_build_types():
patterns.extend(pkg.get_failed_names(build_type))
patterns.extend(pkg.get_build_patterns(build_type))
print("Fetching assets...")
assets: Dict[str, List[GitReleaseAsset]] = {}
for release_name in ['staging-msys', 'staging-mingw', 'staging-failed']:
release = repo.get_release(release_name)
for asset in get_release_assets(release, include_incomplete=True):
assets.setdefault(get_asset_filename(asset), []).append(asset)
for pattern in patterns:
for key in fnmatch.filter(assets.keys(), pattern):
del assets[key]
result = []
for items in assets.values():
for asset in items:
result.append(asset)
return result
def get_finished_assets(pkgs: Collection[_Package],
assets: Sequence[GitReleaseAsset],
ignore_blocked: bool) -> Tuple[
Dict[_Package, List[GitReleaseAsset]], Dict[_Package, str]]:
"""Returns assets for packages where all package results are available"""
assets_mapping: Dict[str, List[GitReleaseAsset]] = {}
for asset in assets:
assets_mapping.setdefault(get_asset_filename(asset), []).append(asset)
finished = {}
for pkg in pkgs:
# Only returns assets for packages where everything has been
# built already
patterns = []
for build_type in pkg.get_build_types():
patterns.extend(pkg.get_build_patterns(build_type))
finished_maybe = []
for pattern in patterns:
matches = fnmatch.filter(assets_mapping.keys(), pattern)
if matches:
found = assets_mapping[matches[0]]
finished_maybe.extend(found)
if len(finished_maybe) == len(patterns):
finished[pkg] = finished_maybe
blocked = {}
if not ignore_blocked:
for pkg in finished:
blocked_reason = set()
# skip packages where not all dependencies have been built
for repo, deps in pkg["ext-depends"].items():
for dep in deps.values():
if dep in pkgs and dep not in finished:
blocked_reason.add(dep)
# skip packages where not all reverse dependencies have been built
for repo, deps in pkg["ext-rdepends"].items():
for dep in deps:
if dep["name"] in IGNORE_RDEP_PACKAGES:
continue
if dep in pkgs and dep not in finished:
blocked_reason.add(dep)
if blocked_reason:
blocked[pkg] = "waiting on %r" % blocked_reason
for pkg in blocked:
finished.pop(pkg, None)
return finished, blocked
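The completeness check at the core of get_finished_assets reduces to: every expected build pattern must match at least one uploaded asset. A minimal sketch:

```python
import fnmatch

def is_complete(patterns: list[str], asset_names: list[str]) -> bool:
    # fnmatch.filter returns the matching subset; an empty result for
    # any pattern means an artifact is still missing.
    return all(fnmatch.filter(asset_names, p) for p in patterns)
```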
def clean_gha_assets(args: Any) -> None:
repo = get_repo()
assets = get_assets_to_delete(repo)
for asset in assets:
print(f"Deleting {get_asset_filename(asset)}...")
if not args.dry_run:
asset.delete_asset()
if not assets:
print("Nothing to delete")
def get_credentials(optional: bool = False) -> Dict[str, Any]:
if "GITHUB_TOKEN" in environ:
return {'login_or_token': environ["GITHUB_TOKEN"]}
elif "GITHUB_USER" in environ and "GITHUB_PASS" in environ:
return {'login_or_token': environ["GITHUB_USER"], 'password': environ["GITHUB_PASS"]}
else:
if optional:
print("[Warning] 'GITHUB_TOKEN' or 'GITHUB_USER'/'GITHUB_PASS' env vars "
"not set which might lead to API rate limiting", file=sys.stderr)
return {}
else:
raise Exception("'GITHUB_TOKEN' or 'GITHUB_USER'/'GITHUB_PASS' env vars not set")
def get_repo(optional_credentials: bool = False) -> Repository:
kwargs = get_credentials(optional=optional_credentials)
has_creds = bool(kwargs)
# 100 is the maximum allowed
kwargs['per_page'] = 100
gh = Github(**kwargs)
if not has_creds and optional_credentials:
print(f"[Warning] Rate limit status: {gh.get_rate_limit().core}", file=sys.stderr)
return gh.get_repo(REPO, lazy=True)
def main(argv: List[str]):
parser = argparse.ArgumentParser(description="Build packages", allow_abbrev=False)
parser.set_defaults(func=lambda *x: parser.print_help())
subparser = parser.add_subparsers(title="subcommands")
sub = subparser.add_parser("build", help="Build all packages")
sub.add_argument("msys2_root", help="The MSYS2 install used for building. e.g. C:\\msys64")
sub.add_argument(
"builddir",
help="A directory used for saving temporary build results and the git repos")
sub.set_defaults(func=run_build)
sub = subparser.add_parser(
"show", help="Show all packages to be built", allow_abbrev=False)
sub.add_argument(
"--fail-on-idle", action="store_true", help="Fails if there is nothing to do")
sub.set_defaults(func=show_build)
sub = subparser.add_parser(
"should-run", help="Fails if the workflow shouldn't run", allow_abbrev=False)
sub.set_defaults(func=should_run)
sub = subparser.add_parser(
"fetch-assets", help="Download all staging packages", allow_abbrev=False)
sub.add_argument("targetdir")
sub.add_argument(
"--verbose", action="store_true", help="Show why things are blocked")
sub.add_argument(
"--pretend", action="store_true",
help="Don't actually download, just show what would be done")
sub.add_argument(
"--fetch-all", action="store_true", help="Fetch all packages, even blocked ones")
sub.set_defaults(func=fetch_assets)
sub = subparser.add_parser("clean-assets", help="Clean up GHA assets", allow_abbrev=False)
sub.add_argument(
"--dry-run", action="store_true", help="Only show what is going to be deleted")
sub.set_defaults(func=clean_gha_assets)
args = parser.parse_args(argv[1:])
return args.func(args)
if __name__ == "__main__":
main(sys.argv)

build.bat (new file, 2 lines)
@echo off
C:\msys64\msys2_shell.cmd -here -msys -no-start -defterm -c "./build.sh"

build.sh (new file, 5 lines)
pacman --needed --noconfirm -S mingw-w64-x86_64-python-tabulate mingw-w64-x86_64-python-requests-cache
OLD_ACLOCAL_PATH="${ACLOCAL_PATH}"
unset ACLOCAL_PATH
python -m msys2_autobuild build / ~/build-temp -t msys,msys-src,mingw64,mingw32,mingw-src
ACLOCAL_PATH="${OLD_ACLOCAL_PATH}"

(deleted file, 43 lines)
https://mermaid-js.github.io
```
sequenceDiagram
participant GIT as MSYS2/MINGW-packages
participant API as packages.msys2.org
participant GHA as GitHub Actions
participant DT as msys2-autobuild
participant DEV as Developer
participant REPO as Pacman Repo
GIT->>GHA: GIT push trigger
GHA->>GHA: parse PKGBUILDs
GHA-->>GIT: upload parsed PKGBUILDs
loop Every 5 minutes
API->>GIT: fetch parsed PKGBUILDs
GIT-->>API:
end
loop Every 2 hours
DT->>GHA: cron trigger
GHA->>API: fetch TODO list
API-->>GHA:
GHA->>GIT: fetch PKGBUILDs
GIT-->>GHA:
GHA->>DT: fetch staging
DT-->>GHA:
GHA->>GHA: build packages
GHA-->>DT: upload packages
end
DEV->>DT: fetch packages
DT-->>DEV:
DEV->>DEV: sign packages
DEV->>REPO: push to repo
```
```
{
"theme": "forest"
}
```

(diff suppressed: lines too long)

(image removed; was 24 KiB)

msys2_autobuild/__init__.py (new empty executable file)

(new file, 3 lines)
from .main import run
run()

msys2_autobuild/build.py (new file, 421 lines)
import fnmatch
import json
import os
import time
import shlex
import shutil
import stat
import subprocess
import tempfile
from concurrent.futures import ThreadPoolExecutor
from contextlib import contextmanager
from pathlib import Path, PurePath, PurePosixPath
from subprocess import check_call
from typing import Any, TypeVar
from collections.abc import Generator, Sequence
from gitea import Attachment
from .config import ArchType, BuildType, Config
from .gh import (CachedAssets, download_asset, get_asset_filename,
get_release, get_repo_for_build_type, upload_asset)
from .queue import Package
from .utils import SCRIPT_DIR, PathLike
class BuildError(Exception):
pass
def get_python_path(msys2_root: PathLike, msys2_path: PathLike) -> Path:
return Path(os.path.normpath(str(msys2_root) + str(msys2_path)))
def to_pure_posix_path(path: PathLike) -> PurePath:
return PurePosixPath("/" + str(path).replace(":", "", 1).replace("\\", "/"))
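to_pure_posix_path turns a native Windows path into the MSYS2 form by dropping the drive colon and flipping separators; the same transform in isolation:

```python
from pathlib import PurePosixPath

def to_posix(path: str) -> PurePosixPath:
    # "C:\msys64\etc\pacman.conf" -> "/C/msys64/etc/pacman.conf"
    return PurePosixPath("/" + path.replace(":", "", 1).replace("\\", "/"))
```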
def get_build_environ(build_type: BuildType) -> dict[str, str]:
environ = os.environ.copy()
# Set PACKAGER for makepkg
packager_ref = Config.RUNNER_CONFIG[build_type]["repo"]
if "GITHUB_SHA" in environ and "GITHUB_RUN_ID" in environ:
packager_ref += "/" + environ["GITHUB_SHA"][:8] + "/" + environ["GITHUB_RUN_ID"]
environ["PACKAGER"] = f"CI ({packager_ref})"
return environ
@contextmanager
def temp_pacman_script(pacman_config: PathLike) -> Generator[PathLike, None, None]:
"""Gives a temporary pacman script which uses the passed in pacman config
without having to pass --config to it. Required because makepkg doesn't allow
setting the pacman conf path, but it allows setting the pacman executable path
via the 'PACMAN' env var.
"""
fd, filename = tempfile.mkstemp("pacman")
os.close(fd)
try:
with open(filename, "w", encoding="utf-8") as h:
cli = shlex.join(['/usr/bin/pacman', '--config', str(to_pure_posix_path(pacman_config))])
h.write(f"""\
#!/bin/bash
set -e
exec {cli} "$@"
""")
yield filename
finally:
try:
os.unlink(filename)
except OSError:
pass
@contextmanager
def temp_pacman_conf(msys2_root: PathLike) -> Generator[Path, None, None]:
"""Gives a unix path to a temporary copy of pacman.conf"""
fd, filename = tempfile.mkstemp("pacman.conf")
os.close(fd)
try:
conf = get_python_path(msys2_root, "/etc/pacman.conf")
with open(conf, "rb") as src:
with open(filename, "wb") as dest:
shutil.copyfileobj(src, dest)
yield Path(filename)
finally:
try:
os.unlink(filename)
except OSError:
pass
@contextmanager
def temp_makepkg_confd(msys2_root: PathLike, config_name: str) -> Generator[Path, None, None]:
"""Gives a path to a temporary $config_name.d file"""
conf_dir = get_python_path(msys2_root, f"/etc/{config_name}.d")
os.makedirs(conf_dir, exist_ok=True)
conf_file = conf_dir / "msys2_autobuild.conf"
try:
open(conf_file, "wb").close()
yield conf_file
finally:
try:
os.unlink(conf_file)
except OSError:
pass
try:
os.rmdir(conf_dir)
except OSError:
pass
def clean_environ(environ: dict[str, str]) -> dict[str, str]:
"""Returns an environment without any CI related variables.
This is to avoid leaking secrets to package build scripts we call.
While in theory we trust them this can't hurt.
"""
new_env = environ.copy()
for key in list(new_env):
if key.startswith(("GITHUB_", "RUNNER_")):
del new_env[key]
return new_env
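The scrub above can equivalently be written as a comprehension; a quick sketch:

```python
def scrub_ci_env(environ: dict[str, str]) -> dict[str, str]:
    # Keep everything except runner-injected variables, so child build
    # scripts never see CI secrets or runner metadata.
    return {k: v for k, v in environ.items()
            if not k.startswith(("GITHUB_", "RUNNER_"))}
```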
def run_cmd(msys2_root: PathLike, args: Sequence[PathLike], **kwargs: Any) -> None:
executable = os.path.join(msys2_root, 'usr', 'bin', 'bash.exe')
env = clean_environ(kwargs.pop("env", os.environ.copy()))
env["CHERE_INVOKING"] = "1"
env["MSYSTEM"] = "MSYS"
env["MSYS2_PATH_TYPE"] = "minimal"
check_call([executable, '-lc'] + [shlex.join([str(a) for a in args])], env=env, **kwargs)
def make_tree_writable(topdir: PathLike) -> None:
# Ensure all files and directories under topdir are writable
# (and readable) by owner.
# Taken from meson, and adjusted
def chmod(p: PathLike) -> None:
os.chmod(p, os.stat(p).st_mode | stat.S_IWRITE | stat.S_IREAD)
chmod(topdir)
for root, dirs, files in os.walk(topdir):
for d in dirs:
chmod(os.path.join(root, d))
# Work around Python bug following junctions
# https://github.com/python/cpython/issues/67596#issuecomment-1918112817
dirs[:] = [d for d in dirs if not os.path.isjunction(os.path.join(root, d))]
for fname in files:
fpath = os.path.join(root, fname)
if os.path.isfile(fpath):
chmod(fpath)
def remove_junctions(topdir: PathLike) -> None:
# work around a git issue where it can't handle junctions
# https://github.com/git-for-windows/git/issues/5320
for root, dirs, _ in os.walk(topdir):
no_junctions = []
for d in dirs:
if not os.path.isjunction(os.path.join(root, d)):
no_junctions.append(d)
else:
os.remove(os.path.join(root, d))
dirs[:] = no_junctions
def reset_git_repo(path: PathLike):
def clean():
assert os.path.exists(path)
# Try to avoid git hanging in a junction loop, by removing them
# before running git clean/reset
# https://github.com/msys2/msys2-autobuild/issues/108#issuecomment-2776420879
try:
remove_junctions(path)
except OSError as e:
print("Removing junctions failed", e)
check_call(["git", "clean", "-xfdf"], cwd=path)
check_call(["git", "reset", "--hard", "HEAD"], cwd=path)
made_writable = False
for i in range(10):
try:
clean()
except subprocess.CalledProcessError:
try:
if not made_writable:
print("Trying to make files writable")
make_tree_writable(path)
remove_junctions(path)
made_writable = True
except OSError as e:
print("Making files writable failed", e)
print(f"git clean/reset failed, sleeping for {i} seconds")
time.sleep(i)
else:
break
else:
# run it one more time to raise
clean()
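The retry shape used here, try N times with a growing back-off and then one final uncaught attempt so the real error propagates, generalizes to a small helper (names are illustrative):

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def retry(action: Callable[[], T], attempts: int = 10,
          sleep: Callable[[float], None] = time.sleep) -> T:
    for i in range(attempts):
        try:
            return action()
        except Exception:
            sleep(i)
    # all attempts failed: run once more, letting the exception escape
    return action()
```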
@contextmanager
def fresh_git_repo(url: str, path: PathLike) -> Generator:
if not os.path.exists(path):
check_call(["git", "clone", url, path])
check_call(["git", "config", "core.longpaths", "true"], cwd=path)
else:
reset_git_repo(path)
check_call(["git", "fetch", "origin"], cwd=path)
check_call(["git", "reset", "--hard", "origin/master"], cwd=path)
try:
yield
finally:
assert os.path.exists(path)
reset_git_repo(path)
@contextmanager
def staging_dependencies(
build_type: BuildType, pkg: Package, msys2_root: PathLike,
builddir: PathLike) -> Generator[PathLike, None, None]:
def add_to_repo(repo_root: PathLike, pacman_config: PathLike, repo_name: str,
assets: list[Attachment]) -> None:
repo_dir = Path(repo_root) / repo_name
os.makedirs(repo_dir, exist_ok=True)
todo = []
for asset in assets:
asset_path = os.path.join(repo_dir, get_asset_filename(asset))
todo.append((asset_path, asset))
def fetch_item(item: tuple[str, Attachment]) -> tuple[str, Attachment]:
asset_path, asset = item
download_asset(asset, asset_path)
return item
package_paths = []
with ThreadPoolExecutor(8) as executor:
for i, item in enumerate(executor.map(fetch_item, todo)):
asset_path, asset = item
print(f"[{i + 1}/{len(todo)}] {get_asset_filename(asset)}")
package_paths.append(asset_path)
repo_name = f"autobuild-{repo_name}"
repo_db_path = os.path.join(repo_dir, f"{repo_name}.db.tar.gz")
with open(pacman_config, encoding="utf-8") as h:
text = h.read()
uri = to_pure_posix_path(repo_dir).as_uri()
if uri not in text:
with open(pacman_config, "w", encoding="utf-8") as h2:
h2.write(f"""[{repo_name}]
Server={uri}
SigLevel=Never
""")
h2.write(text)
# repo-add 15 packages at a time so we don't hit the size limit for CLI arguments
ChunkItem = TypeVar("ChunkItem")
def chunks(lst: list[ChunkItem], n: int) -> Generator[list[ChunkItem], None, None]:
for i in range(0, len(lst), n):
yield lst[i:i + n]
base_args: list[PathLike] = ["repo-add", to_pure_posix_path(repo_db_path)]
posix_paths: list[PathLike] = [to_pure_posix_path(p) for p in package_paths]
for chunk in chunks(posix_paths, 15):
args = base_args + chunk
run_cmd(msys2_root, args, cwd=repo_dir)
cached_assets = CachedAssets()
repo_root = os.path.join(builddir, "_REPO")
try:
shutil.rmtree(repo_root, ignore_errors=True)
os.makedirs(repo_root, exist_ok=True)
with temp_pacman_conf(msys2_root) as pacman_config:
to_add: dict[ArchType, list[Attachment]] = {}
for dep_type, deps in pkg.get_depends(build_type).items():
assets = cached_assets.get_assets(dep_type)
for dep in deps:
for pattern in dep.get_build_patterns(dep_type):
for asset in assets:
if fnmatch.fnmatch(get_asset_filename(asset), pattern):
to_add.setdefault(dep_type, []).append(asset)
break
else:
if pkg.is_optional_dep(dep, dep_type):
# If it's there, good, if not we ignore it since it's part of a cycle
pass
else:
raise SystemExit(f"asset for {pattern} in {dep_type} not found")
for dep_type, assets in to_add.items():
add_to_repo(repo_root, pacman_config, dep_type, assets)
with temp_pacman_script(pacman_config) as temp_pacman:
# in case they are already installed we need to upgrade
run_cmd(msys2_root, [to_pure_posix_path(temp_pacman), "--noconfirm", "-Suy"])
run_cmd(msys2_root, [to_pure_posix_path(temp_pacman), "--noconfirm", "-Su"])
yield temp_pacman
finally:
shutil.rmtree(repo_root, ignore_errors=True)
# downgrade again
run_cmd(msys2_root, ["pacman", "--noconfirm", "-Suuy"])
run_cmd(msys2_root, ["pacman", "--noconfirm", "-Suu"])
def build_package(build_type: BuildType, pkg: Package, msys2_root: PathLike, builddir: PathLike) -> None:
assert os.path.isabs(builddir)
assert os.path.isabs(msys2_root)
os.makedirs(builddir, exist_ok=True)
repo_name = {"MINGW-packages": "W", "MSYS2-packages": "S"}.get(pkg['repo'], pkg['repo'])
repo_dir = os.path.join(builddir, repo_name)
to_upload: list[str] = []
repo = get_repo_for_build_type(build_type)
with fresh_git_repo(pkg['repo_url'], repo_dir):
orig_pkg_dir = os.path.join(repo_dir, pkg['repo_path'])
# Rename it to get a shorter overall build path
# https://github.com/msys2/msys2-autobuild/issues/71
pkg_dir = os.path.join(repo_dir, 'B')
assert not os.path.exists(pkg_dir)
os.rename(orig_pkg_dir, pkg_dir)
# Fetch all keys mentioned in the PKGBUILD
validpgpkeys = to_pure_posix_path(os.path.join(SCRIPT_DIR, 'fetch-validpgpkeys.sh'))
run_cmd(msys2_root, ['bash', validpgpkeys], cwd=pkg_dir)
with staging_dependencies(build_type, pkg, msys2_root, builddir) as temp_pacman:
try:
env = get_build_environ(build_type)
# this makes makepkg use our custom pacman script
env['PACMAN'] = str(to_pure_posix_path(temp_pacman))
if build_type == Config.MINGW_SRC_BUILD_TYPE:
with temp_makepkg_confd(msys2_root, "makepkg_mingw.conf") as makepkg_conf:
with open(makepkg_conf, "w", encoding="utf-8") as h:
h.write("COMPRESSZST=(zstd -c -T0 --ultra -22 -)\n")
env['MINGW_ARCH'] = Config.MINGW_SRC_ARCH
run_cmd(msys2_root, [
'makepkg-mingw',
'--noconfirm',
'--noprogressbar',
'--allsource'
], env=env, cwd=pkg_dir)
elif build_type == Config.MSYS_SRC_BUILD_TYPE:
with temp_makepkg_confd(msys2_root, "makepkg.conf") as makepkg_conf:
with open(makepkg_conf, "w", encoding="utf-8") as h:
h.write("COMPRESSZST=(zstd -c -T0 --ultra -22 -)\n")
run_cmd(msys2_root, [
'makepkg',
'--noconfirm',
'--noprogressbar',
'--allsource'
], env=env, cwd=pkg_dir)
elif build_type in Config.MINGW_ARCH_LIST:
with temp_makepkg_confd(msys2_root, "makepkg_mingw.conf") as makepkg_conf:
with open(makepkg_conf, "w", encoding="utf-8") as h:
h.write("COMPRESSZST=(zstd -c -T0 --ultra -20 -)\n")
env['MINGW_ARCH'] = build_type
run_cmd(msys2_root, [
'makepkg-mingw',
'--noconfirm',
'--noprogressbar',
'--nocheck',
'--syncdeps',
'--rmdeps',
'--cleanbuild'
], env=env, cwd=pkg_dir)
elif build_type in Config.MSYS_ARCH_LIST:
with temp_makepkg_confd(msys2_root, "makepkg.conf") as makepkg_conf:
with open(makepkg_conf, "w", encoding="utf-8") as h:
h.write("COMPRESSZST=(zstd -c -T0 --ultra -20 -)\n")
run_cmd(msys2_root, [
'makepkg',
'--noconfirm',
'--noprogressbar',
'--nocheck',
'--syncdeps',
'--rmdeps',
'--cleanbuild'
], env=env, cwd=pkg_dir)
else:
assert 0
entries = os.listdir(pkg_dir)
for pattern in pkg.get_build_patterns(build_type):
found = fnmatch.filter(entries, pattern)
if not found:
raise BuildError(f"{pattern} not found, likely wrong version built")
to_upload.extend([os.path.join(pkg_dir, e) for e in found])
except (subprocess.CalledProcessError, BuildError) as e:
release = get_release(repo, "staging-failed")
failed_data = {}
content = json.dumps(failed_data).encode()
upload_asset(repo, release, pkg.get_failed_name(build_type), text=True, content=content)
raise BuildError(e)
else:
release = get_release(repo, "staging-" + build_type)
for path in to_upload:
upload_asset(repo, release, path)

(new file, 102 lines)
import os
import shutil
import sys
import time
import traceback
from typing import Any, Literal
from .build import BuildError, build_package, run_cmd
from .config import BuildType, Config
from .queue import (Package, PackageStatus, get_buildqueue_with_status,
update_status)
from .utils import apply_optional_deps, gha_group
BuildFrom = Literal["start", "middle", "end"]
def get_package_to_build(
pkgs: list[Package], build_types: list[BuildType] | None,
build_from: BuildFrom) -> tuple[Package, BuildType] | None:
can_build = []
for pkg in pkgs:
for build_type in pkg.get_build_types():
if build_types is not None and build_type not in build_types:
continue
if pkg.get_status(build_type) == PackageStatus.WAITING_FOR_BUILD:
can_build.append((pkg, build_type))
if not can_build:
return None
if build_from == "end":
return can_build[-1]
elif build_from == "middle":
return can_build[len(can_build)//2]
elif build_from == "start":
return can_build[0]
else:
raise Exception("Unknown order:", build_from)
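The start/middle/end selection lets parallel runners drain the queue from different ends, lowering the chance of two jobs picking the same package. The selection rule in isolation:

```python
from typing import Literal, Sequence, TypeVar

T = TypeVar("T")

def pick(items: Sequence[T],
         where: Literal["start", "middle", "end"]) -> T:
    # "middle" uses integer division, so for an even-length queue it
    # picks the element just past the midpoint.
    if where == "end":
        return items[-1]
    if where == "middle":
        return items[len(items) // 2]
    return items[0]
```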
def run_build(args: Any) -> None:
builddir = os.path.abspath(args.builddir)
msys2_root = os.path.abspath(args.msys2_root)
if args.build_types is None:
build_types = None
else:
build_types = [p.strip() for p in args.build_types.split(",")]
apply_optional_deps(args.optional_deps or "")
start_time = time.monotonic()
if not sys.platform == "win32":
raise SystemExit("ERROR: Needs to run under native Python")
if not shutil.which("git"):
raise SystemExit("ERROR: git not in PATH")
if not os.path.isdir(msys2_root):
raise SystemExit("ERROR: msys2_root doesn't exist")
try:
run_cmd(msys2_root, [])
except Exception as e:
raise SystemExit("ERROR: msys2_root not functional", e)
print(f"Building {build_types} starting from {args.build_from}")
while True:
pkgs = get_buildqueue_with_status(full_details=True)
update_status(pkgs)
if (time.monotonic() - start_time) >= Config.SOFT_JOB_TIMEOUT:
print("timeout reached")
break
todo = get_package_to_build(pkgs, build_types, args.build_from)
if not todo:
break
pkg, build_type = todo
try:
with gha_group(f"[{pkg['repo']}] [{build_type}] {pkg['name']}..."):
build_package(build_type, pkg, msys2_root, builddir)
except BuildError:
with gha_group(f"[{pkg['repo']}] [{build_type}] {pkg['name']}: failed"):
traceback.print_exc(file=sys.stdout)
continue
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser("build", help="Build all packages")
sub.add_argument("-t", "--build-types", action="store")
sub.add_argument(
"--build-from", action="store", default="start", help="Start building from start|end|middle")
sub.add_argument("--optional-deps", action="store")
sub.add_argument("msys2_root", help="The MSYS2 install used for building. e.g. C:\\msys64")
sub.add_argument(
"builddir",
help="A directory used for saving temporary build results and the git repos")
sub.set_defaults(func=run_build)

(new file, 90 lines)
import re
import fnmatch
from typing import Any
from gitea import Release, Attachment
from .config import get_all_build_types
from .gh import (get_asset_filename, get_current_repo, get_release,
get_release_assets, get_gitea)
from .queue import get_buildqueue
def get_assets_to_delete() -> tuple[list[Release], list[tuple[Release, Attachment]]]:
print("Fetching packages to build...")
keep_patterns = []
for pkg in get_buildqueue():
for build_type in pkg.get_build_types():
keep_patterns.append(pkg.get_failed_name(build_type))
keep_patterns.extend(pkg.get_build_patterns(build_type))
keep_pattern_regex = re.compile('|'.join(fnmatch.translate(p) for p in keep_patterns))
def should_be_deleted(asset: Attachment) -> bool:
filename = get_asset_filename(asset)
return not keep_pattern_regex.match(filename)
def get_to_delete(release: Release) -> tuple[list[Release], list[Attachment]]:
assets = get_release_assets(release)
to_delete = []
for asset in assets:
if should_be_deleted(asset):
to_delete.append(asset)
# Deleting and re-creating a release requires two write calls, so delete
# the release if all assets should be deleted and there are more than 2.
# min_to_delete = 3
# XXX: re-creating releases causes notifications, so avoid unless possible
# https://github.com/msys2/msys2-autobuild/issues/77#issuecomment-1657231719
min_to_delete = 400*333
if len(to_delete) >= min_to_delete and len(assets) == len(to_delete):
return [release], []
else:
return [], to_delete
def get_all_releases() -> list[Release]:
repo = get_current_repo()
releases = []
for build_type in get_all_build_types():
releases.append(get_release(repo, "staging-" + build_type))
releases.append(get_release(repo, "staging-failed"))
return releases
print("Fetching assets...")
releases = []
assets = []
for release in get_all_releases():
r, a = get_to_delete(release)
releases.extend(r)
assets.extend((release, asset) for asset in a)
return releases, assets
def clean_gha_assets(args: Any) -> None:
repo = get_current_repo()
releases, assets = get_assets_to_delete()
print("Resetting releases...")
for release in releases:
print(f"Resetting {release.tag_name}...")
if not args.dry_run:
release.delete_release()
get_release(repo, release.tag_name)
print("Deleting assets...")
for release, asset in assets:
print(f"Deleting {get_asset_filename(asset)}...")
if not args.dry_run:
gitea = get_gitea()
gitea.repo_delete_release_attachment(repo.owner.login, repo.name, release.id, asset.id)
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser("clean-assets", help="Clean up GHA assets", allow_abbrev=False)
sub.add_argument(
"--dry-run", action="store_true", help="Only show what is going to be deleted")
sub.set_defaults(func=clean_gha_assets)
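get_assets_to_delete above compiles all keep patterns into one regex alternation via fnmatch.translate, so each filename needs a single regex match instead of one fnmatch call per pattern. With hypothetical pattern values:

```python
import fnmatch
import re

patterns = ["mingw-w64-x86_64-zlib-*.pkg.tar.zst", "*.failed.json"]
# fnmatch.translate anchors each glob; join them into one alternation.
keep = re.compile("|".join(fnmatch.translate(p) for p in patterns))
```

A filename is then kept exactly when `keep.match(filename)` succeeds.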

(new file, 49 lines)
from typing import Any
from .gh import (get_asset_filename, get_current_repo, get_release,
get_release_assets, get_gitea)
from .queue import get_buildqueue_with_status
def clear_failed_state(args: Any) -> None:
build_type_filter = args.build_types
build_type_list = build_type_filter.replace(" ", "").split(",") if build_type_filter else []
package_filter = args.packages
package_list = package_filter.replace(" ", "").split(",") if package_filter else []
if build_type_filter is None and package_filter is None:
raise SystemExit("clear-failed: At least one of --build-types or --packages needs to be passed")
repo = get_current_repo()
release = get_release(repo, 'staging-failed')
assets_failed = get_release_assets(release)
failed_map = dict((get_asset_filename(a), a) for a in assets_failed)
for pkg in get_buildqueue_with_status():
if package_filter is not None and pkg["name"] not in package_list:
continue
for build_type in pkg.get_build_types():
if build_type_filter is not None and build_type not in build_type_list:
continue
name = pkg.get_failed_name(build_type)
if name in failed_map:
asset = failed_map[name]
print(f"Deleting {get_asset_filename(asset)}...")
if not args.dry_run:
gitea = get_gitea()
gitea.repo_delete_release_attachment(repo.owner.login, repo.name, release.id, asset.id)
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser(
"clear-failed", help="Clear the failed state for packages", allow_abbrev=False)
sub.add_argument(
"--dry-run", action="store_true", help="Only show what is going to be deleted")
sub.add_argument("--build-types", action="store", help=(
"A comma separated list of build types (e.g. mingw64)"))
sub.add_argument("--packages", action="store", help=(
"A comma separated list of packages to clear (e.g. mingw-w64-qt-creator)"))
sub.set_defaults(func=clear_failed_state)
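The comma-separated filter handling used by clear-failed (strip spaces, then split, with None meaning "no filter") can be sketched standalone; `parse_filter` is a hypothetical name, not part of the module:

```python
def parse_filter(value):
    # Mirrors the --build-types/--packages handling above: drop spaces,
    # split on commas, and treat None or empty input as "no filter".
    return value.replace(" ", "").split(",") if value else []

print(parse_filter("mingw64, ucrt64"))  # ['mingw64', 'ucrt64']
print(parse_filter(None))  # []
```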


@@ -0,0 +1,178 @@
import fnmatch
import os
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path
from typing import Any
import subprocess
from gitea import Attachment
from .config import BuildType, Config
from .gh import (CachedAssets, download_asset, get_asset_filename,
get_asset_mtime_ns)
from .queue import PackageStatus, get_buildqueue_with_status
from .utils import ask_yes_no
def get_repo_subdir(build_type: BuildType) -> Path:
if build_type in Config.MSYS_ARCH_LIST:
return Path("msys") / "x86_64"
elif build_type == Config.MSYS_SRC_BUILD_TYPE:
return Path("msys") / "sources"
elif build_type == Config.MINGW_SRC_BUILD_TYPE:
return Path("mingw") / "sources"
elif build_type in Config.MINGW_ARCH_LIST:
return Path("mingw") / build_type
else:
raise ValueError(f"unknown build type: {build_type}")
def fetch_assets(args: Any) -> None:
target_dir = os.path.abspath(args.targetdir)
fetch_all = args.fetch_all
fetch_complete = args.fetch_complete
all_patterns: dict[BuildType, list[str]] = {}
all_blocked = []
for pkg in get_buildqueue_with_status():
for build_type in pkg.get_build_types():
if args.build_type and build_type not in args.build_type:
continue
status = pkg.get_status(build_type)
pkg_patterns = pkg.get_build_patterns(build_type)
if status == PackageStatus.FINISHED:
all_patterns.setdefault(build_type, []).extend(pkg_patterns)
elif status in [PackageStatus.FINISHED_BUT_BLOCKED,
PackageStatus.FINISHED_BUT_INCOMPLETE]:
if fetch_all or (fetch_complete and status != PackageStatus.FINISHED_BUT_INCOMPLETE):
all_patterns.setdefault(build_type, []).extend(pkg_patterns)
else:
all_blocked.append(
(pkg["name"], build_type, pkg.get_status_details(build_type)))
all_assets = {}
cached_assets = CachedAssets()
assets_to_download: dict[BuildType, list[Attachment]] = {}
for build_type, patterns in all_patterns.items():
if build_type not in all_assets:
all_assets[build_type] = cached_assets.get_assets(build_type)
assets = all_assets[build_type]
assets_mapping: dict[str, list[Attachment]] = {}
for asset in assets:
assets_mapping.setdefault(get_asset_filename(asset), []).append(asset)
for pattern in patterns:
matches = fnmatch.filter(assets_mapping.keys(), pattern)
if matches:
found = assets_mapping[matches[0]]
assets_to_download.setdefault(build_type, []).extend(found)
to_fetch = {}
for build_type, assets in assets_to_download.items():
for asset in assets:
asset_dir = Path(target_dir) / get_repo_subdir(build_type)
asset_path = asset_dir / get_asset_filename(asset)
to_fetch[str(asset_path)] = asset
def file_is_uptodate(path: str, asset: Attachment) -> bool:
asset_path = Path(path)
if not asset_path.exists():
return False
if asset_path.stat().st_size != asset.size:
return False
if get_asset_mtime_ns(asset) != asset_path.stat().st_mtime_ns:
return False
return True
# find files that are either wrong or not what we want
to_delete = []
not_uptodate = []
for root, dirs, files in os.walk(target_dir):
for name in files:
existing = os.path.join(root, name)
if existing in to_fetch:
asset = to_fetch[existing]
if not file_is_uptodate(existing, asset):
to_delete.append(existing)
not_uptodate.append(existing)
else:
to_delete.append(existing)
if args.delete and not args.pretend:
# delete unwanted files
for path in to_delete:
os.remove(path)
# delete empty directories
for root, dirs, files in os.walk(target_dir, topdown=False):
for name in dirs:
path = os.path.join(root, name)
if not os.listdir(path):
os.rmdir(path)
# Finally figure out what to download
todo = {}
done = []
for path, asset in to_fetch.items():
if not os.path.exists(path) or path in not_uptodate:
todo[path] = asset
Path(path).parent.mkdir(parents=True, exist_ok=True)
else:
done.append(path)
if args.verbose and all_blocked:
import pprint
print("Packages that are blocked and why:")
pprint.pprint(all_blocked)
print(f"downloading: {len(todo)}, done: {len(done)}, "
f"blocked: {len(all_blocked)} (related builds missing)")
print("Pass --verbose to see the list of blocked packages.")
print("Pass --fetch-complete to also fetch blocked but complete packages.")
print("Pass --fetch-all to fetch all packages.")
print("Pass --delete to clear the target directory.")
def verify_file(path: str, target: str) -> None:
try:
subprocess.run(["zstd", "--quiet", "--test", path], capture_output=True, check=True, text=True)
except subprocess.CalledProcessError as e:
raise Exception(f"zstd test failed for {target!r}: {e.stderr}") from e
def fetch_item(item: tuple[str, Attachment]) -> tuple[str, Attachment]:
asset_path, asset = item
if not args.pretend:
download_asset(asset, asset_path, verify_file)
return item
with ThreadPoolExecutor(8) as executor:
for i, item in enumerate(executor.map(fetch_item, todo.items())):
print(f"[{i + 1}/{len(todo)}] {get_asset_filename(item[1])}")
print("done")
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser(
"fetch-assets", help="Download all staging packages", allow_abbrev=False)
sub.add_argument("targetdir")
sub.add_argument(
"--delete", action="store_true", help="Clear targetdir of unneeded files")
sub.add_argument(
"--verbose", action="store_true", help="Show why things are blocked")
sub.add_argument(
"--pretend", action="store_true",
help="Don't actually download, just show what would be done")
sub.add_argument(
"--fetch-all", action="store_true", help="Fetch all packages, even blocked ones")
sub.add_argument(
"--fetch-complete", action="store_true",
help="Fetch all packages, even blocked ones, except incomplete ones")
sub.add_argument(
"-t", "--build-type", action="append",
help="Only fetch packages for given build type(s) (may be used more than once)")
sub.add_argument(
"--noconfirm", action="store_true",
help="Don't require user confirmation")
sub.set_defaults(func=fetch_assets)
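fetch-assets maps each package's build patterns onto the uploaded asset filenames with fnmatch; a minimal sketch with hypothetical file names, using the "{package}-{version}-*.pkg.tar.zst" scheme from the queue code:

```python
import fnmatch

# Hypothetical uploaded asset names and one build pattern.
asset_names = [
    "mingw-w64-ucrt-x86_64-foo-1.0-1-any.pkg.tar.zst",
    "mingw-w64-ucrt-x86_64-bar-2.0-1-any.pkg.tar.zst",
    "foo-1.0.src.tar.gz",
]
pattern = "mingw-w64-ucrt-x86_64-foo-1.0-*.pkg.tar.zst"

# fnmatch.filter() keeps only the names matching the glob pattern.
matches = fnmatch.filter(asset_names, pattern)
print(matches)  # ['mingw-w64-ucrt-x86_64-foo-1.0-1-any.pkg.tar.zst']
```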


@@ -0,0 +1,66 @@
from typing import Any
from tabulate import tabulate
from .queue import Package, PackageStatus, get_buildqueue_with_status, get_cycles
from .utils import apply_optional_deps, gha_group
def show_cycles(pkgs: list[Package]) -> None:
cycles = get_cycles(pkgs)
if cycles:
def format_package(p: Package) -> str:
return f"{p['name']} [{p['version_repo']} -> {p['version']}]"
with gha_group(f"Dependency Cycles ({len(cycles)})"):
print(tabulate([
(format_package(a), "<-->", format_package(b)) for (a, b) in cycles],
headers=["Package", "", "Package"]))
def show_build(args: Any) -> None:
todo = []
waiting = []
done = []
failed = []
apply_optional_deps(args.optional_deps or "")
pkgs = get_buildqueue_with_status(full_details=args.details)
show_cycles(pkgs)
for pkg in pkgs:
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
details = pkg.get_status_details(build_type)
details.pop("blocked", None)
if status == PackageStatus.WAITING_FOR_BUILD:
todo.append((pkg, build_type, status, details))
elif status in (PackageStatus.FINISHED, PackageStatus.FINISHED_BUT_BLOCKED,
PackageStatus.FINISHED_BUT_INCOMPLETE):
done.append((pkg, build_type, status, details))
elif status in (PackageStatus.WAITING_FOR_DEPENDENCIES,
PackageStatus.MANUAL_BUILD_REQUIRED):
waiting.append((pkg, build_type, status, details))
else:
failed.append((pkg, build_type, status, details))
def show_table(name: str, items: list) -> None:
with gha_group(f"{name} ({len(items)})"):
print(tabulate([(p["name"], bt, p["version"], str(s), d) for (p, bt, s, d) in items],
headers=["Package", "Build", "Version", "Status", "Details"]))
show_table("TODO", todo)
show_table("WAITING", waiting)
show_table("FAILED", failed)
show_table("DONE", done)
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser(
"show", help="Show all packages to be built", allow_abbrev=False)
sub.add_argument(
"--details", action="store_true", help="Show more details such as links to failed build logs (slow)")
sub.add_argument("--optional-deps", action="store")
sub.set_defaults(func=show_build)


@@ -0,0 +1,13 @@
from typing import Any
from .queue import get_buildqueue_with_status, update_status
def run_update_status(args: Any) -> None:
update_status(get_buildqueue_with_status(full_details=True))
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser(
"update-status", help="Update the status file", allow_abbrev=False)
sub.set_defaults(func=run_update_status)


@@ -0,0 +1,65 @@
import glob
import os
from typing import Any
from .gh import get_release, get_repo_for_build_type, upload_asset
from .queue import PackageStatus, get_buildqueue_with_status
def upload_assets(args: Any) -> None:
package_name = args.package
src_dir = args.path
src_dir = os.path.abspath(src_dir)
pkgs = get_buildqueue_with_status()
if package_name is not None:
for pkg in pkgs:
if pkg["name"] == package_name:
break
else:
raise SystemExit(f"Package '{package_name}' not in the queue, check the 'show' command")
pkgs = [pkg]
pattern_entries = []
for pkg in pkgs:
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
# ignore finished packages
if status in (PackageStatus.FINISHED, PackageStatus.FINISHED_BUT_BLOCKED,
PackageStatus.FINISHED_BUT_INCOMPLETE):
continue
pattern_entries.append((build_type, pkg.get_build_patterns(build_type)))
print(f"Looking for the following files in {src_dir}:")
for build_type, patterns in pattern_entries:
for pattern in patterns:
print(" ", pattern)
matches = []
for build_type, patterns in pattern_entries:
for pattern in patterns:
for match in glob.glob(os.path.join(src_dir, pattern)):
matches.append((build_type, match))
print(f"Found {len(matches)} files...")
for build_type, match in matches:
repo = get_repo_for_build_type(build_type)
release = get_release(repo, 'staging-' + build_type)
print(f"Uploading {match}")
if not args.dry_run:
upload_asset(repo, release, match)
print("Done")
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser(
"upload-assets", help="Upload packages", allow_abbrev=False)
sub.add_argument("path", help="Directory to look for packages in")
sub.add_argument(
"--dry-run", action="store_true", help="Only show what is going to be uploaded")
sub.add_argument("-p", "--package", action="store", help=(
"Only upload files belonging to a particular package (pkgbase)"))
sub.set_defaults(func=upload_assets)

msys2_autobuild/config.py Normal file

@@ -0,0 +1,114 @@
from typing import Literal, TypeAlias
from urllib3.util import Retry
ArchType = Literal["mingw32", "mingw64", "ucrt64", "clang64", "clangarm64", "msys"]
SourceType = Literal["mingw-src", "msys-src"]
BuildType: TypeAlias = ArchType | SourceType
REQUESTS_TIMEOUT = (15, 30)
REQUESTS_RETRY = Retry(total=3, backoff_factor=1, status_forcelist=[500, 502])
def get_all_build_types() -> list[BuildType]:
all_build_types: list[BuildType] = []
all_build_types.extend(Config.MSYS_ARCH_LIST)
all_build_types.extend(Config.MINGW_ARCH_LIST)
all_build_types.append(Config.MINGW_SRC_BUILD_TYPE)
all_build_types.append(Config.MSYS_SRC_BUILD_TYPE)
return all_build_types
def build_type_is_src(build_type: BuildType) -> bool:
return build_type in [Config.MINGW_SRC_BUILD_TYPE, Config.MSYS_SRC_BUILD_TYPE]
class Config:
ALLOWED_UPLOADERS = [
"elieux",
"lazka",
"jeremyd2019",
]
"""Users that are allowed to upload assets. This is checked at download time"""
MINGW_ARCH_LIST: list[ArchType] = ["mingw32", "mingw64", "ucrt64", "clang64", "clangarm64"]
"""Arches we try to build"""
MINGW_SRC_ARCH: ArchType = "ucrt64"
"""The arch that is used to build the source package (any mingw one should work)"""
MINGW_SRC_BUILD_TYPE: BuildType = "mingw-src"
MSYS_ARCH_LIST: list[ArchType] = ["msys"]
MSYS_SRC_ARCH: ArchType = "msys"
MSYS_SRC_BUILD_TYPE: BuildType = "msys-src"
RUNNER_CONFIG: dict[BuildType, dict] = {
"msys-src": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
"max_jobs": 1,
},
"msys": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
},
"mingw-src": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
"max_jobs": 1,
},
"mingw32": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
},
"mingw64": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
},
"ucrt64": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
},
"clang64": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
},
"clangarm64": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-11-arm"],
"hosted": True,
},
}
"""Runner config to use for each build type."""
SOFT_JOB_TIMEOUT = 60 * 60 * 3
"""Runtime after which we shouldn't start a new build"""
MAXIMUM_JOB_COUNT = 15
"""Maximum number of jobs to spawn"""
MANUAL_BUILD: list[tuple[str, list[BuildType]]] = [
]
"""Packages that take too long to build, or can't be built, and should be handled manually"""
IGNORE_RDEP_PACKAGES: list[str] = [
]
"""XXX: These would in theory block rdeps, but no one fixed them, so we ignore them"""
OPTIONAL_DEPS: dict[str, list[str]] = {
"mingw-w64-headers-git": ["mingw-w64-winpthreads", "mingw-w64-tools-git"],
"mingw-w64-crt-git": ["mingw-w64-winpthreads"],
"mingw-w64-llvm": ["mingw-w64-libc++"],
}
"""XXX: In case of cycles we mark these deps as optional"""
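Given the arch lists in Config above, get_all_build_types() composes the msys arch, the five mingw arches, and the two source build types; a standalone sketch with the literal values:

```python
# Literal values from the Config class above.
MSYS_ARCH_LIST = ["msys"]
MINGW_ARCH_LIST = ["mingw32", "mingw64", "ucrt64", "clang64", "clangarm64"]

# Same composition order as get_all_build_types(): msys arches, mingw
# arches, then the two source build types.
all_build_types = MSYS_ARCH_LIST + MINGW_ARCH_LIST + ["mingw-src", "msys-src"]
print(all_build_types)
```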


@@ -0,0 +1,17 @@
#!/bin/bash
. PKGBUILD
set -e
_keyserver=(
"keyserver.ubuntu.com"
"keys.gnupg.net"
"pgp.mit.edu"
"keys.openpgp.org"
)
for key in "${validpgpkeys[@]}"; do
for server in "${_keyserver[@]}"; do
timeout 20 /usr/bin/gpg --keyserver "${server}" --recv "${key}" && break || true
done
done

msys2_autobuild/gh.py Normal file

@@ -0,0 +1,183 @@
import io
import os
import shutil
import sys
import tempfile
import time
import hashlib
from contextlib import contextmanager
from datetime import datetime, UTC
from functools import cache
from pathlib import Path
from typing import Any
from collections.abc import Generator, Callable
import requests
from gitea import Configuration, ApiClient, RepositoryApi, CreateReleaseOption
from gitea import Repository, Release, Attachment
from gitea.rest import ApiException
from .config import REQUESTS_TIMEOUT, BuildType, Config
from .utils import PathLike, get_requests_session
@cache
def _get_repo(name: str) -> Repository:
gitea = get_gitea()
split = name.split("/")
return gitea.repo_get(split[0], split[1])
def get_current_repo() -> Repository:
repo_full_name = os.environ.get("GITHUB_REPOSITORY", "Befator-Inc-Firmen-Netzwerk/msys2-autobuild")
return _get_repo(repo_full_name)
def get_repo_for_build_type(build_type: BuildType) -> Repository:
return _get_repo(Config.RUNNER_CONFIG[build_type]["repo"])
@cache
def get_gitea() -> RepositoryApi:
configuration = Configuration()
configuration.host = "https://git.befatorinc.de/api/v1"
configuration.api_key["Authorization"] = "token 91f6f2e72e6d64fbd0b34133efae4a6c838d0e58"
gitea = RepositoryApi(ApiClient(configuration))
return gitea
def download_text_asset(asset: Attachment, cache: bool = False) -> str:
session = get_requests_session(nocache=not cache)
with session.get(asset.browser_download_url, timeout=REQUESTS_TIMEOUT) as r:
r.raise_for_status()
return r.text
def get_asset_mtime_ns(asset: Attachment) -> int:
"""Returns the mtime of an asset in nanoseconds"""
return int(asset.created_at.timestamp() * (1000 ** 3))
def download_asset(asset: Attachment, target_path: str,
onverify: Callable[[str, str], None] | None = None) -> None:
session = get_requests_session(nocache=True)
with session.get(asset.browser_download_url, stream=True, timeout=REQUESTS_TIMEOUT) as r:
r.raise_for_status()
fd, temppath = tempfile.mkstemp()
try:
os.chmod(temppath, 0o644)
with os.fdopen(fd, "wb") as h:
for chunk in r.iter_content(256 * 1024):
h.write(chunk)
mtime_ns = get_asset_mtime_ns(asset)
os.utime(temppath, ns=(mtime_ns, mtime_ns))
if onverify is not None:
onverify(temppath, target_path)
shutil.move(temppath, target_path)
finally:
try:
os.remove(temppath)
except OSError:
pass
def get_gh_asset_name(basename: PathLike, text: bool = False) -> str:
# GitHub will throw out characters like '~' or '='. It also doesn't like
# when there is no file extension and will try to add one
return hashlib.sha256(str(basename).encode("utf-8")).hexdigest() + (".bin" if not text else ".txt")
def get_asset_filename(asset: Attachment) -> str:
return asset.name
def get_release_assets(release: Release) -> list[Attachment]:
# The per-uploader allow-list check from the GitHub version was dropped in the Gitea port
return list(release.assets)
def upload_asset(repo: Repository, release: Release, path: PathLike, replace: bool = False,
text: bool = False, content: bytes | None = None) -> None:
gitea = get_gitea()
path = Path(path)
basename = os.path.basename(str(path))
asset_name = get_gh_asset_name(basename, text)
asset_label = basename
def can_try_upload_again() -> bool:
for asset in get_release_assets(release):
if asset_name == asset.name:
# We want to treat incomplete assets as if they weren't there
# so replace them always
if replace:
gitea.repo_delete_release_attachment(repo.owner.login, repo.name, release.id, asset.id)
break
else:
print(f"Skipping upload for {asset_name} as {asset_label}, already exists")
return False
return True
def upload() -> None:
if content is None:
gitea.repo_create_release_attachment(repo.owner.login, repo.name, release.id, name=asset_label, attachment=str(path))
else:
tmp_path = None
try:
with tempfile.NamedTemporaryFile(delete=False) as tf:
tf.write(content)
tf.flush()
tmp_path = tf.name
new_asset = gitea.repo_create_release_attachment(repo.owner.login, repo.name, release.id, name=asset_label, attachment=tmp_path)
finally:
if tmp_path and os.path.exists(tmp_path):
os.remove(tmp_path)
try:
upload()
except (ApiException, requests.RequestException):
if can_try_upload_again():
upload()
print(f"Uploaded {asset_name} as {asset_label}")
def get_release(repo: Repository, name: str, create: bool = True) -> Release:
"""Like Repository.get_release() but creates the referenced release if needed"""
gitea = get_gitea()
try:
return gitea.repo_get_release_by_tag(repo.owner.login, repo.name, name)
except ApiException:
if not create:
raise
return gitea.repo_create_release(repo.owner.login, repo.name, body=CreateReleaseOption(tag_name=name, prerelease=True))
class CachedAssets:
def __init__(self) -> None:
self._assets: dict[BuildType, list[Attachment]] = {}
self._failed: dict[str, list[Attachment]] = {}
def get_assets(self, build_type: BuildType) -> list[Attachment]:
if build_type not in self._assets:
repo = get_repo_for_build_type(build_type)
release = get_release(repo, 'staging-' + build_type)
self._assets[build_type] = get_release_assets(release)
return self._assets[build_type]
def get_failed_assets(self, build_type: BuildType) -> list[Attachment]:
repo = get_repo_for_build_type(build_type)
key = repo.full_name
if key not in self._failed:
release = get_release(repo, 'staging-failed')
self._failed[key] = get_release_assets(release)
assets = self._failed[key]
# XXX: This depends on the format of the filename
return [a for a in assets if get_asset_filename(a).startswith(build_type + "-")]
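get_gh_asset_name above sidesteps forge filename restrictions by hashing the original name; the scheme can be exercised standalone (`gh_asset_name` is a local stand-in for the function above):

```python
import hashlib

def gh_asset_name(basename, text=False):
    # Same scheme as get_gh_asset_name above: a sha256 hex digest of the
    # original name plus a fixed extension, so characters the forge would
    # reject (e.g. '~', '=') never appear in the uploaded name.
    digest = hashlib.sha256(str(basename).encode("utf-8")).hexdigest()
    return digest + (".txt" if text else ".bin")

name = gh_asset_name("mingw-w64-foo-1.0-1~rc1-any.pkg.tar.zst")
print(len(name), name.endswith(".bin"))  # 68 True
```

The 64-character hex digest plus the 4-character extension gives a fixed 68-character name, which also keeps the uploads free of collisions between different original filenames.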

msys2_autobuild/main.py Normal file

@@ -0,0 +1,41 @@
import argparse
import sys
import logging
from . import (cmd_build, cmd_clean_assets, cmd_clear_failed, cmd_fetch_assets,
cmd_show_build, cmd_update_status, cmd_upload_assets)
from .utils import install_requests_cache
def main(argv: list[str]) -> None:
parser = argparse.ArgumentParser(description="Build packages", allow_abbrev=False)
parser.add_argument(
'-v', '--verbose',
action='count',
default=0,
help='Increase verbosity (can be used multiple times)'
)
parser.set_defaults(func=lambda *x: parser.print_help())
subparsers = parser.add_subparsers(title="subcommands")
cmd_build.add_parser(subparsers)
cmd_show_build.add_parser(subparsers)
cmd_update_status.add_parser(subparsers)
cmd_fetch_assets.add_parser(subparsers)
cmd_upload_assets.add_parser(subparsers)
cmd_clear_failed.add_parser(subparsers)
cmd_clean_assets.add_parser(subparsers)
args = parser.parse_args(argv[1:])
level_map = {0: logging.WARNING, 1: logging.INFO, 2: logging.DEBUG}
logging.basicConfig(
level=level_map.get(args.verbose, logging.DEBUG),
handlers=[logging.StreamHandler(sys.stderr)],
format='[%(asctime)s] [%(levelname)8s] [%(name)s:%(module)s:%(lineno)d] %(message)s',
datefmt='%Y-%m-%d %H:%M:%S')
with install_requests_cache():
args.func(args)
def run() -> None:
return main(sys.argv)

msys2_autobuild/queue.py Normal file

@@ -0,0 +1,464 @@
import fnmatch
import io
import json
import tempfile
import os
from concurrent.futures import ThreadPoolExecutor
from enum import Enum
from typing import Any, cast
import requests
from gitea.rest import ApiException
from .config import (REQUESTS_TIMEOUT, ArchType, BuildType, Config,
build_type_is_src, get_all_build_types)
from .gh import (CachedAssets, download_text_asset, get_asset_filename,
get_current_repo, get_release,
get_gitea)
from .utils import get_requests_session, queue_website_update
class PackageStatus(Enum):
FINISHED = 'finished'
FINISHED_BUT_BLOCKED = 'finished-but-blocked'
FINISHED_BUT_INCOMPLETE = 'finished-but-incomplete'
FAILED_TO_BUILD = 'failed-to-build'
WAITING_FOR_BUILD = 'waiting-for-build'
WAITING_FOR_DEPENDENCIES = 'waiting-for-dependencies'
MANUAL_BUILD_REQUIRED = 'manual-build-required'
UNKNOWN = 'unknown'
def __str__(self) -> str:
return self.value
class Package(dict):
def __repr__(self) -> str:
return "Package({!r})".format(self["name"])
def __hash__(self) -> int: # type: ignore
return id(self)
def __eq__(self, other: object) -> bool:
return self is other
@property
def _active_builds(self) -> dict:
return {
k: v for k, v in self["builds"].items() if k in (Config.MINGW_ARCH_LIST + Config.MSYS_ARCH_LIST)}
def _get_build(self, build_type: BuildType) -> dict:
return self["builds"].get(build_type, {})
def get_status(self, build_type: BuildType) -> PackageStatus:
build = self._get_build(build_type)
return build.get("status", PackageStatus.UNKNOWN)
def get_status_details(self, build_type: BuildType) -> dict[str, Any]:
build = self._get_build(build_type)
return dict(build.get("status_details", {}))
def set_status(self, build_type: BuildType, status: PackageStatus,
description: str | None = None,
urls: dict[str, str] | None = None) -> None:
build = self["builds"].setdefault(build_type, {})
build["status"] = status
meta: dict[str, Any] = {}
meta["desc"] = description
if urls is None:
urls = {}
meta["urls"] = urls
build["status_details"] = meta
def set_blocked(
self, build_type: BuildType, status: PackageStatus,
dep: "Package", dep_type: BuildType) -> None:
dep_details = dep.get_status_details(dep_type)
dep_blocked = dep_details.get("blocked", {})
details = self.get_status_details(build_type)
blocked = details.get("blocked", {})
if dep_blocked:
blocked = dict(dep_blocked)
else:
blocked.setdefault(dep, set()).add(dep_type)
descs = []
for pkg, types in blocked.items():
descs.append("{} ({})".format(pkg["name"], "/".join(types)))
self.set_status(build_type, status, "Blocked by: " + ", ".join(descs))
build = self._get_build(build_type)
build.setdefault("status_details", {})["blocked"] = blocked
def is_new(self, build_type: BuildType) -> bool:
build = self._get_build(build_type)
return build.get("new", False)
def get_build_patterns(self, build_type: BuildType) -> list[str]:
patterns = []
if build_type_is_src(build_type):
patterns.append(f"{self['name']}-{self['version']}.src.tar.[!s]*")
elif build_type in (Config.MINGW_ARCH_LIST + Config.MSYS_ARCH_LIST):
for item in self._get_build(build_type).get('packages', []):
patterns.append(f"{item}-{self['version']}-*.pkg.tar.zst")
else:
raise ValueError(f"unknown build type: {build_type}")
return patterns
def get_failed_name(self, build_type: BuildType) -> str:
return f"{build_type}-{self['name']}-{self['version']}.failed"
def get_build_types(self) -> list[BuildType]:
build_types = list(self._active_builds)
if self["source"]:
if any((k in Config.MINGW_ARCH_LIST) for k in build_types):
build_types.append(Config.MINGW_SRC_BUILD_TYPE)
if any((k in Config.MSYS_ARCH_LIST) for k in build_types):
build_types.append(Config.MSYS_SRC_BUILD_TYPE)
return build_types
def _get_dep_build(self, build_type: BuildType) -> dict:
if build_type == Config.MINGW_SRC_BUILD_TYPE:
build_type = Config.MINGW_SRC_ARCH
elif build_type == Config.MSYS_SRC_BUILD_TYPE:
build_type = Config.MSYS_SRC_ARCH
return self._get_build(build_type)
def is_optional_dep(self, dep: "Package", dep_type: BuildType) -> bool:
# Some deps are manually marked as optional to break cycles.
# This requires them to be in the main repo though, otherwise the cycle has to
# be fixed manually.
return dep["name"] in Config.OPTIONAL_DEPS.get(self["name"], []) and not dep.is_new(dep_type)
def get_depends(self, build_type: BuildType) -> "dict[ArchType, set[Package]]":
build = self._get_dep_build(build_type)
return build.get('ext-depends', {})
def get_rdepends(self, build_type: BuildType) -> "dict[ArchType, set[Package]]":
build = self._get_dep_build(build_type)
return build.get('ext-rdepends', {})
def get_buildqueue() -> list[Package]:
session = get_requests_session()
r = session.get("http://localhost:8160/api/buildqueue2", timeout=REQUESTS_TIMEOUT)
r.raise_for_status()
return parse_buildqueue(r.text)
def parse_buildqueue(payload: str) -> list[Package]:
pkgs = []
for received in json.loads(payload):
pkg = Package(received)
pkg['repo'] = pkg['repo_url'].split('/')[-1]
pkgs.append(pkg)
# extract the package mapping
dep_mapping = {}
for pkg in pkgs:
for build in pkg._active_builds.values():
for name in build['packages']:
dep_mapping[name] = pkg
# link up dependencies with the real package in the queue
for pkg in pkgs:
for build in pkg._active_builds.values():
ver_depends: dict[str, set[Package]] = {}
for repo, deps in build['depends'].items():
for dep in deps:
ver_depends.setdefault(repo, set()).add(dep_mapping[dep])
build['ext-depends'] = ver_depends
# reverse dependencies
for pkg in pkgs:
for build in pkg._active_builds.values():
r_depends: dict[str, set[Package]] = {}
for pkg2 in pkgs:
for r_repo, build2 in pkg2._active_builds.items():
for repo, deps in build2['ext-depends'].items():
if pkg in deps:
r_depends.setdefault(r_repo, set()).add(pkg2)
build['ext-rdepends'] = r_depends
return pkgs
def get_cycles(pkgs: list[Package]) -> set[tuple[Package, Package]]:
cycles: set[tuple[Package, Package]] = set()
# In case the package is already built it doesn't matter if it is part of a cycle
def pkg_is_finished(pkg: Package, build_type: BuildType) -> bool:
return pkg.get_status(build_type) in [
PackageStatus.FINISHED,
PackageStatus.FINISHED_BUT_BLOCKED,
PackageStatus.FINISHED_BUT_INCOMPLETE,
]
# Transitive dependencies of a package. Excluding branches where a root is finished
def get_buildqueue_deps(pkg: Package, build_type: ArchType) -> "dict[ArchType, set[Package]]":
start = (build_type, pkg)
todo = set([start])
done = set()
result = set()
while todo:
build_type, pkg = todo.pop()
item = (build_type, pkg)
done.add(item)
if pkg_is_finished(pkg, build_type):
continue
result.add(item)
for dep_build_type, deps in pkg.get_depends(build_type).items():
for dep in deps:
dep_item = (dep_build_type, dep)
if dep_item not in done:
todo.add(dep_item)
result.discard(start)
d: dict[ArchType, set[Package]] = {}
for build_type, pkg in result:
d.setdefault(build_type, set()).add(pkg)
return d
for pkg in pkgs:
for build_type in pkg.get_build_types():
if build_type_is_src(build_type):
continue
build_type = cast(ArchType, build_type)
for dep_build_type, deps in get_buildqueue_deps(pkg, build_type).items():
for dep in deps:
# manually broken cycle
if pkg.is_optional_dep(dep, dep_build_type) or dep.is_optional_dep(pkg, build_type):
continue
dep_deps = get_buildqueue_deps(dep, dep_build_type)
if pkg in dep_deps.get(build_type, set()):
cycles.add(tuple(sorted([pkg, dep], key=lambda p: p["name"]))) # type: ignore
return cycles
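The pair detection in get_cycles boils down to: if a package's dependency closure reaches a dependency whose own closure reaches back, record the sorted pair once. A minimal sketch over a plain name-to-direct-deps mapping (hypothetical data, direct deps only, unlike the transitive walk above):

```python
def find_two_cycles(deps):
    # deps: name -> set of direct dependency names. Record each mutually
    # dependent pair once, as a sorted tuple, mirroring the sorted-tuple
    # dedup in get_cycles above.
    cycles = set()
    for pkg, pkg_deps in deps.items():
        for dep in pkg_deps:
            if pkg in deps.get(dep, set()):
                cycles.add(tuple(sorted((pkg, dep))))
    return cycles

deps = {
    "mingw-w64-crt-git": {"mingw-w64-winpthreads"},
    "mingw-w64-winpthreads": {"mingw-w64-crt-git"},  # mutual: a cycle
    "mingw-w64-foo": {"mingw-w64-crt-git"},
}
print(find_two_cycles(deps))  # {('mingw-w64-crt-git', 'mingw-w64-winpthreads')}
```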
def get_buildqueue_with_status(full_details: bool = False) -> list[Package]:
cached_assets = CachedAssets()
assets_failed = []
for build_type in get_all_build_types():
assets_failed.extend(cached_assets.get_failed_assets(build_type))
failed_urls = {}
if full_details:
# This might take a while, so only in full mode
with ThreadPoolExecutor(8) as executor:
for i, (asset, content) in enumerate(
zip(assets_failed, executor.map(download_text_asset, assets_failed))):
result = json.loads(content)
# No more GitHub Actions URLs
#if result["urls"]:
# failed_urls[get_asset_filename(asset)] = result["urls"]
def pkg_is_done(build_type: BuildType, pkg: Package) -> bool:
done_names = [get_asset_filename(a) for a in cached_assets.get_assets(build_type)]
for pattern in pkg.get_build_patterns(build_type):
if not fnmatch.filter(done_names, pattern):
return False
return True
def get_failed_urls(build_type: BuildType, pkg: Package) -> dict[str, str] | None:
failed_names = [get_asset_filename(a) for a in assets_failed]
name = pkg.get_failed_name(build_type)
if name in failed_names:
return failed_urls.get(name)
return None
def pkg_has_failed(build_type: BuildType, pkg: Package) -> bool:
failed_names = [get_asset_filename(a) for a in assets_failed]
name = pkg.get_failed_name(build_type)
return name in failed_names
def pkg_is_manual(build_type: BuildType, pkg: Package) -> bool:
if build_type_is_src(build_type):
return False
for pattern, types in Config.MANUAL_BUILD:
type_matches = not types or build_type in types
if type_matches and fnmatch.fnmatchcase(pkg['name'], pattern):
return True
return False
pkgs = get_buildqueue()
# basic state
for pkg in pkgs:
for build_type in pkg.get_build_types():
if pkg_is_done(build_type, pkg):
pkg.set_status(build_type, PackageStatus.FINISHED)
elif pkg_has_failed(build_type, pkg):
urls = get_failed_urls(build_type, pkg)
pkg.set_status(build_type, PackageStatus.FAILED_TO_BUILD, urls=urls)
elif pkg_is_manual(build_type, pkg):
pkg.set_status(build_type, PackageStatus.MANUAL_BUILD_REQUIRED)
else:
pkg.set_status(build_type, PackageStatus.WAITING_FOR_BUILD)
# wait for dependencies to be finished before starting a build
for pkg in pkgs:
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
if status == PackageStatus.WAITING_FOR_BUILD:
for dep_type, deps in pkg.get_depends(build_type).items():
for dep in deps:
dep_status = dep.get_status(dep_type)
if dep_status != PackageStatus.FINISHED:
if pkg.is_optional_dep(dep, dep_type):
continue
pkg.set_blocked(
build_type, PackageStatus.WAITING_FOR_DEPENDENCIES, dep, dep_type)
# Block packages where not all deps/rdeps/related are finished
changed = True
while changed:
changed = False
for pkg in pkgs:
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
if status == PackageStatus.FINISHED:
# src builds are independent
if build_type_is_src(build_type):
continue
for dep_type, deps in pkg.get_depends(build_type).items():
for dep in deps:
dep_status = dep.get_status(dep_type)
if dep_status != PackageStatus.FINISHED:
pkg.set_blocked(
build_type, PackageStatus.FINISHED_BUT_BLOCKED, dep, dep_type)
changed = True
for dep_type, deps in pkg.get_rdepends(build_type).items():
for dep in deps:
if dep["name"] in Config.IGNORE_RDEP_PACKAGES:
continue
dep_status = dep.get_status(dep_type)
dep_new = dep.is_new(dep_type)
# if the rdep isn't in the repo we can't break it by uploading
if dep_status != PackageStatus.FINISHED and not dep_new:
pkg.set_blocked(
build_type, PackageStatus.FINISHED_BUT_BLOCKED, dep, dep_type)
changed = True
# Block packages where not every build type is finished
for pkg in pkgs:
unfinished = []
blocked = []
finished = []
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
if status != PackageStatus.FINISHED:
if status == PackageStatus.FINISHED_BUT_BLOCKED:
blocked.append(build_type)
# if the package isn't in the repo better not block on it
elif not pkg.is_new(build_type):
unfinished.append(build_type)
else:
finished.append(build_type)
# We track source packages by assuming they are in the repo if there is
# at least one binary package in the repo. Uploading lone source
# packages will not change anything, so block them.
if not blocked and not unfinished and finished and \
all(build_type_is_src(bt) for bt in finished):
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
if status in (PackageStatus.FINISHED, PackageStatus.FINISHED_BUT_BLOCKED):
changed = True
pkg.set_status(build_type, PackageStatus.FINISHED_BUT_INCOMPLETE)
elif unfinished:
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
if status in (PackageStatus.FINISHED, PackageStatus.FINISHED_BUT_BLOCKED):
changed = True
for bt in unfinished:
pkg.set_blocked(build_type, PackageStatus.FINISHED_BUT_INCOMPLETE, pkg, bt)
elif blocked:
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
if status == PackageStatus.FINISHED:
changed = True
for bt in blocked:
pkg.set_blocked(build_type, PackageStatus.FINISHED_BUT_BLOCKED, pkg, bt)
return pkgs
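The `while changed` loop above keeps propagating blocked states until a fixed point is reached. The same idea in a minimal standalone sketch, using plain dicts and hypothetical package names instead of the real `Package` objects:

```python
# A package is blocked if any dependency is unfinished or itself blocked;
# iterate until no new package gets blocked (a fixed point).
deps = {
    "app": ["libfoo"],
    "libfoo": ["libbar"],
    "libbar": [],
}
finished = {"app": True, "libfoo": True, "libbar": False}

blocked: set[str] = set()
changed = True
while changed:
    changed = False
    for pkg, pkg_deps in deps.items():
        if pkg in blocked:
            continue
        if any(not finished[d] or d in blocked for d in pkg_deps):
            blocked.add(pkg)
            changed = True

# "libbar" is unfinished, so "libfoo" and, transitively, "app" end up blocked.
```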
def update_status(pkgs: list[Package]) -> None:
repo = get_current_repo()
release = get_release(repo, "status")
status_object: dict[str, Any] = {}
packages = []
for pkg in pkgs:
pkg_result = {}
pkg_result["name"] = pkg["name"]
pkg_result["version"] = pkg["version"]
builds = {}
for build_type in pkg.get_build_types():
details = pkg.get_status_details(build_type)
details.pop("blocked", None)
details["status"] = pkg.get_status(build_type).value
builds[build_type] = details
pkg_result["builds"] = builds
packages.append(pkg_result)
status_object["packages"] = packages
cycles = []
for a, b in get_cycles(pkgs):
cycles.append([a["name"], b["name"]])
status_object["cycles"] = sorted(cycles)
content = json.dumps(status_object, indent=2).encode()
# If multiple jobs update this at the same time things can fail;
# assume the other one went through and just ignore all errors.
try:
asset_name = "status.json"
# next() with a default avoids the old bug where a non-matching loop
# left `asset` pointing at the last release asset instead of None
asset = next((a for a in release.assets if a.name == asset_name), None)
do_replace = True
# Avoid uploading the same file twice, to reduce API write calls
if asset is not None and asset.size == len(content):
try:
old_content = download_text_asset(asset, cache=True)
if old_content == content.decode():
do_replace = False
except requests.RequestException:
# github sometimes returns 404 for a short time after uploading
pass
if do_replace:
if asset is not None:
gitea = get_gitea()
gitea.repo_delete_release_attachment(repo.owner.login, repo.name, release.id, asset.id)
tmp_path = None
try:
with tempfile.NamedTemporaryFile(delete=False) as tf:
tf.write(content)
tf.flush()
tmp_path = tf.name
gitea = get_gitea()
new_asset = gitea.repo_create_release_attachment(repo.owner.login, repo.name, release.id, name=asset_name, attachment=tmp_path)
finally:
if tmp_path and os.path.exists(tmp_path):
os.remove(tmp_path)
print(f"Uploaded status file for {len(packages)} packages: {new_asset.browser_download_url}")
queue_website_update()
else:
print("Status unchanged")
except (ApiException, requests.RequestException) as e:
print(e)

msys2_autobuild/utils.py Normal file

@ -0,0 +1,122 @@
import os
from contextlib import contextmanager
from datetime import timedelta
from functools import cache
from typing import Any, AnyStr, TypeAlias
from collections.abc import Generator
import requests
from requests.adapters import HTTPAdapter
from .config import REQUESTS_RETRY, REQUESTS_TIMEOUT, Config
PathLike: TypeAlias = os.PathLike | AnyStr
SCRIPT_DIR = os.path.dirname(os.path.realpath(__file__))
def requests_cache_disabled() -> Any:
import requests_cache
return requests_cache.disabled()
@cache
def get_requests_session(nocache: bool = False) -> requests.Session:
adapter = HTTPAdapter(max_retries=REQUESTS_RETRY)
if nocache:
with requests_cache_disabled():
http = requests.Session()
else:
http = requests.Session()
http.mount("https://", adapter)
http.mount("http://", adapter)
return http
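`REQUESTS_RETRY` is imported from `.config`; a typical urllib3 `Retry` policy mounted the same way looks like this (the values below are illustrative assumptions, not the project's actual configuration):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry a few times with exponential backoff on common transient
# status codes (all values here are illustrative).
retry = Retry(total=3, backoff_factor=1,
              status_forcelist=[429, 500, 502, 503, 504])
session = requests.Session()
adapter = HTTPAdapter(max_retries=retry)
session.mount("https://", adapter)
session.mount("http://", adapter)
```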
@contextmanager
def install_requests_cache() -> Generator:
# This adds basic ETag-based caching, to avoid hitting API rate limits
import requests_cache
from requests_cache.backends.sqlite import SQLiteCache
# Monkey patch globally, so PyGithub uses it as well.
# Only revalidate with ETag/Date etc., ignoring the Cache-Control headers
# that GitHub sends by default (max-age=60).
cache_dir = os.path.join(os.getcwd(), '.autobuild_cache')
os.makedirs(cache_dir, exist_ok=True)
cache_file = f'http_cache_{requests_cache.__version__}.sqlite'
# delete other versions
for f in os.listdir(cache_dir):
if f.startswith('http_cache') and f != cache_file:
os.remove(os.path.join(cache_dir, f))
requests_cache.install_cache(
always_revalidate=True,
cache_control=False,
expire_after=requests_cache.EXPIRE_IMMEDIATELY,
backend=SQLiteCache(os.path.join(cache_dir, cache_file)))
# Call this once, so it gets cached from the main thread and can be used in a thread pool
get_requests_session(nocache=True)
try:
yield
finally:
# Delete old cache entries, so this doesn't grow indefinitely
cache = requests_cache.get_cache()
assert cache is not None
cache.delete(older_than=timedelta(hours=3))
# un-monkey-patch again
requests_cache.uninstall_cache()
@contextmanager
def gha_group(title: str) -> Generator:
print(f'\n::group::{title}')
try:
yield
finally:
print('::endgroup::')
def queue_website_update() -> None:
session = get_requests_session()
r = session.post('https://packages.msys2.org/api/trigger_update', timeout=REQUESTS_TIMEOUT)
try:
# it's not worth stopping the build if this fails, so just log it
r.raise_for_status()
except requests.RequestException as e:
print(e)
def parse_optional_deps(optional_deps: str) -> dict[str, list[str]]:
res: dict[str, list[str]] = {}
optional_deps = optional_deps.replace(" ", "")
if not optional_deps:
return res
for entry in optional_deps.split(","):
assert ":" in entry
first, second = entry.split(":", 1)
res.setdefault(first, []).append(second)
return res
def apply_optional_deps(optional_deps: str) -> None:
for dep, ignored in parse_optional_deps(optional_deps).items():
Config.OPTIONAL_DEPS.setdefault(dep, []).extend(ignored)
def ask_yes_no(prompt: str, default_no: bool = True) -> bool:
"""Ask a yes/no question via input() and return their answer."""
if default_no:
prompt += " [y/N] "
else:
prompt += " [Y/n] "
user_input = input(prompt).strip().lower()
if not user_input:
return not default_no
return user_input == 'y'

poetry.lock generated
File diff suppressed because it is too large

pyproject.toml

@ -1,19 +1,32 @@
[tool.poetry]
[project]
name = "msys2-autobuild"
version = "0.1.0"
description = ""
authors = ["Christoph Reiter <reiter.christoph@gmail.com>"]
license = "MIT"
authors = [
{ name = "Christoph Reiter", email = "reiter.christoph@gmail.com" }
]
requires-python = ">=3.12.0,<4.0"
dependencies = [
"PyGithub>=2.8.1,<3",
"tabulate>=0.9.0,<0.10",
"requests>=2.28.1,<3",
"requests-cache>=1.0.0,<2",
"urllib3>=2.2.1,<3",
]
[tool.poetry.dependencies]
python = "^3.7"
PyGithub = "^1.54.1"
tabulate = "^0.8.7"
requests = "^2.25.1"
[project.scripts]
msys2-autobuild = "msys2_autobuild.main:run"
[tool.poetry.dev-dependencies]
mypy = "^0.790"
flake8 = "^3.8.4"
[dependency-groups]
dev = [
"pytest>=8.0.0,<9",
"mypy==1.18.1",
"flake8>=7.0.0,<8",
"types-tabulate>=0.9.0.0,<0.10",
"types-requests>=2.25.0,<3",
]
[build-system]
requires = ["poetry-core>=1.0.0"]
requires = ["poetry-core>=2.2.0"]
build-backend = "poetry.core.masonry.api"
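The `[dependency-groups]` table is PEP 735's standard replacement for Poetry's `[tool.poetry.dev-dependencies]`. Groups can also compose via `include-group`, e.g. (hypothetical `lint` group, not part of this change):

```toml
[dependency-groups]
lint = ["flake8>=7.0.0,<8"]
dev = [
    "pytest>=8.0.0,<9",
    { include-group = "lint" },
]
```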

requirements.txt

@ -1,29 +1,18 @@
certifi==2020.12.5; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.5.0" and python_version >= "3.6" \
--hash=sha256:719a74fb9e33b9bd44cc7f3a8d94bc35e4049deebe19ba7d8e108280cfd59830 \
--hash=sha256:1a4995114262bffbc2413b159f2a1a480c969de6e6eb13ee966d470af86af59c
chardet==4.0.0; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.5.0" and python_version >= "3.6" \
--hash=sha256:f864054d66fd9118f2e67044ac8981a54775ec5b67aed0441892edb553d21da5 \
--hash=sha256:0d6f53a15db4120f2b08c94f11e7d93d2c911ee118b6b30a04ec3ee8310179fa
deprecated==1.2.11; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6" \
--hash=sha256:924b6921f822b64ec54f49be6700a126bab0640cfafca78f22c9d429ed590560 \
--hash=sha256:471ec32b2755172046e28102cd46c481f21c6036a0ec027521eba8521aa4ef35
idna==2.10; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.5.0" and python_version >= "3.6" \
--hash=sha256:b97d804b1e9b523befed77c48dacec60e6dcb0b5391d57af6a65a312a90648c0 \
--hash=sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6
pygithub==1.54.1; python_version >= "3.6" \
--hash=sha256:87afd6a67ea582aa7533afdbf41635725f13d12581faed7e3e04b1579c0c0627 \
--hash=sha256:300bc16e62886ca6537b0830e8f516ea4bc3ef12d308e0c5aff8bdbd099173d4
pyjwt==1.7.1; python_version >= "3.6" \
--hash=sha256:5c6eca3c2940464d106b99ba83b00c6add741c9becaec087fb7ccdefea71350e \
--hash=sha256:8d59a976fb773f3e6a39c85636357c4f0e242707394cadadd9814f5cbaa20e96
requests==2.25.1; (python_version >= "2.7" and python_full_version < "3.0.0") or (python_full_version >= "3.5.0") \
--hash=sha256:c210084e36a42ae6b9219e00e48287def368a26d03a048ddad7bfee44f75871e \
--hash=sha256:27973dd4a904a4f13b263a19c866c13b92a39ed1c964655f025f3f8d3d75b804
tabulate==0.8.7 \
--hash=sha256:ac64cb76d53b1231d364babcd72abbb16855adac7de6665122f97b593f1eb2ba \
--hash=sha256:db2723a20d04bcda8522165c73eea7c300eda74e0ce852d9022e0159d7895007
urllib3==1.26.2; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.5.0" and python_version < "4" and python_version >= "3.6" \
--hash=sha256:d8ff90d979214d7b4f8ce956e80f4028fc6860e4431f731ea4a8c08f23f99473 \
--hash=sha256:19188f96923873c92ccb987120ec4acaa12f0461fa9ce5d3d0772bc965a39e08
wrapt==1.12.1; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6" \
--hash=sha256:b62ffa81fb85f4332a4f609cab4ac40709470da05643a082ec1eb88e6d9b97d7
attrs==25.3.0 ; python_version >= "3.12" and python_version < "4.0"
cattrs==25.2.0 ; python_version >= "3.12" and python_version < "4.0"
certifi==2025.8.3 ; python_version >= "3.12" and python_version < "4.0"
cffi==2.0.0 ; python_version >= "3.12" and python_version < "4.0" and platform_python_implementation != "PyPy"
charset-normalizer==3.4.3 ; python_version >= "3.12" and python_version < "4.0"
cryptography==46.0.1 ; python_version >= "3.12" and python_version < "4.0"
idna==3.10 ; python_version >= "3.12" and python_version < "4.0"
platformdirs==4.4.0 ; python_version >= "3.12" and python_version < "4.0"
pycparser==2.23 ; python_version >= "3.12" and python_version < "4.0" and platform_python_implementation != "PyPy" and implementation_name != "PyPy"
pygithub==2.8.1 ; python_version >= "3.12" and python_version < "4.0"
pyjwt==2.10.1 ; python_version >= "3.12" and python_version < "4.0"
pynacl==1.6.0 ; python_version >= "3.12" and python_version < "4.0"
requests-cache==1.2.1 ; python_version >= "3.12" and python_version < "4.0"
requests==2.32.5 ; python_version >= "3.12" and python_version < "4.0"
tabulate==0.9.0 ; python_version >= "3.12" and python_version < "4.0"
typing-extensions==4.15.0 ; python_version >= "3.12" and python_version < "4.0"
url-normalize==2.2.1 ; python_version >= "3.12" and python_version < "4.0"
urllib3==2.5.0 ; python_version >= "3.12" and python_version < "4.0"


@ -1,2 +0,0 @@
[flake8]
max-line-length = 95

tests/__init__.py Normal file

tests/main_test.py Normal file

@ -0,0 +1,140 @@
# type: ignore
import os
import stat
import tempfile
from pathlib import Path
from msys2_autobuild.utils import parse_optional_deps
from msys2_autobuild.queue import parse_buildqueue, get_cycles
from msys2_autobuild.build import make_tree_writable, remove_junctions
def test_make_tree_writable():
with tempfile.TemporaryDirectory() as tempdir:
nested_dir = Path(tempdir) / "nested"
nested_junction = nested_dir / "junction"
nested_dir.mkdir()
file_path = nested_dir / "test_file.txt"
file_path.write_text("content")
# Create a junction loop if possible, to make sure we ignore it
if os.name == 'nt':
import _winapi
_winapi.CreateJunction(str(nested_dir), str(nested_junction))
else:
nested_junction.mkdir()
# Remove permissions
for p in [tempdir, nested_dir, file_path, nested_junction]:
os.chmod(p, os.stat(p).st_mode & ~stat.S_IWRITE & ~stat.S_IREAD)
make_tree_writable(tempdir)
assert os.access(tempdir, os.W_OK) and os.access(tempdir, os.R_OK)
assert os.access(nested_dir, os.W_OK) and os.access(nested_dir, os.R_OK)
assert os.access(file_path, os.W_OK) and os.access(file_path, os.R_OK)
assert os.access(nested_junction, os.W_OK) and os.access(nested_junction, os.R_OK)
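A hedged sketch of the behaviour this test exercises (the real implementation lives in `msys2_autobuild.build` and additionally skips junctions): restore the read/write bits top-down, so each directory is listable before `os.walk` descends into it.

```python
import os
import stat

def make_tree_writable_sketch(root: str) -> None:
    # Restore owner read/write on every entry; the root is fixed first so
    # os.walk can list it, and children are fixed before walk descends.
    def add_rw(path: str) -> None:
        os.chmod(path, os.stat(path).st_mode | stat.S_IREAD | stat.S_IWRITE)

    add_rw(root)
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            add_rw(os.path.join(dirpath, name))
```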
def test_remove_junctions():
with tempfile.TemporaryDirectory() as tempdir:
nested_dir = Path(tempdir) / "nested"
nested_junction = nested_dir / "junction"
nested_dir.mkdir()
# Create a junction loop if possible, to make sure we ignore it
if os.name == 'nt':
import _winapi
_winapi.CreateJunction(str(nested_dir), str(nested_junction))
assert nested_junction.exists()
assert os.path.isjunction(nested_junction)
remove_junctions(tempdir)
assert not nested_junction.exists()
def test_parse_optional_deps():
assert parse_optional_deps("a:b,c:d,a:x") == {'a': ['b', 'x'], 'c': ['d']}
def test_get_cycles():
buildqueue = """
[
{
"name": "c-ares",
"version": "1.34.2-1",
"version_repo": "1.33.1-1",
"repo_url": "https://github.com/msys2/MSYS2-packages",
"repo_path": "c-ares",
"source": true,
"builds": {
"msys": {
"packages": [
"libcares",
"libcares-devel"
],
"depends": {
"msys": [
"libnghttp2",
"libuv"
]
},
"new": false
}
}
},
{
"name": "nghttp2",
"version": "1.64.0-1",
"version_repo": "1.63.0-1",
"repo_url": "https://github.com/msys2/MSYS2-packages",
"repo_path": "nghttp2",
"source": true,
"builds": {
"msys": {
"packages": [
"libnghttp2",
"libnghttp2-devel",
"nghttp2"
],
"depends": {
"msys": [
"libcares",
"libcares-devel"
]
},
"new": false
}
}
},
{
"name": "libuv",
"version": "1.49.2-1",
"version_repo": "1.49.1-1",
"repo_url": "https://github.com/msys2/MSYS2-packages",
"repo_path": "libuv",
"source": true,
"builds": {
"msys": {
"packages": [
"libuv",
"libuv-devel"
],
"depends": {
"msys": [
"libnghttp2"
]
},
"new": false
}
}
}
]"""
pkgs = parse_buildqueue(buildqueue)
cycles = get_cycles(pkgs)
assert len(cycles) == 3
assert (pkgs[0], pkgs[2]) in cycles
assert (pkgs[0], pkgs[1]) in cycles
assert (pkgs[2], pkgs[1]) in cycles
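The three packages above each (transitively) depend on products of the others, so every pair is reported. A hypothetical reachability-based sketch of why the assertions expect exactly three cycles (not the real `get_cycles` implementation):

```python
from itertools import combinations

# Package-level dependency edges derived from the queue above:
# c-ares depends on libnghttp2 (nghttp2) and libuv, nghttp2 on libcares
# (c-ares), and libuv on libnghttp2 (nghttp2).
edges = {
    "c-ares": {"nghttp2", "libuv"},
    "nghttp2": {"c-ares"},
    "libuv": {"nghttp2"},
}

def reachable(start: str) -> set[str]:
    seen: set[str] = set()
    stack = [start]
    while stack:
        node = stack.pop()
        for nxt in edges[node]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# Two packages share a cycle exactly when each can reach the other.
reach = {n: reachable(n) for n in edges}
cycles = [(a, b) for a, b in combinations(edges, 2)
          if b in reach[a] and a in reach[b]]
```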

update-status.bat Normal file

@ -0,0 +1,2 @@
@echo off
C:\msys64\msys2_shell.cmd -here -mingw64 -no-start -defterm -c "pacman --needed --noconfirm -S mingw-w64-x86_64-python-tabulate mingw-w64-x86_64-python-requests-cache && python -m msys2_autobuild update-status"