Compare commits

...

301 Commits

Author SHA1 Message Date
Befator
50276fb9a2 More fixes
2025-10-18 19:19:29 +02:00
Befator
84c03f504e Giteafication
2025-10-17 19:45:13 +02:00
Christoph Reiter
553846537b add some debug logs for JOB_CHECK_RUN_ID
seems like it's not there in some cases (?)
2025-10-05 09:49:19 +02:00
Christoph Reiter
c6213b4d1a Partly remove hack to fetch the current job ID
GH runner now exposes a "job.check_run_id" in the template
language, which we can use directly for the API to fetch
information about the job currently in progress.

Previously we looked through all active jobs and matched them
by name.

There is no env var for job.check_run_id, so we have to set the
env var in the yaml file still.
2025-10-03 22:16:09 +02:00
Christoph Reiter
ecd1d51f4d Use native arm64 Python 2025-09-17 11:11:28 +02:00
Christoph Reiter
fd1952d205 Update mypy 2025-09-17 10:59:55 +02:00
Christoph Reiter
19926ce9c5 Update requirements.txt
missed it
2025-09-17 10:29:34 +02:00
Christoph Reiter
33a052a413 Update deps 2025-09-17 10:18:31 +02:00
Christoph Reiter
59740a3f2e Port to PEP 735
And depend on poetry 2.2.
This allows one to use uv instead of poetry if wanted.
Add .venv to flake8 ignore since that's the uv default.

Also update deps while at it, and specify a license.
2025-09-14 21:43:15 +02:00
Christoph Reiter
4704486d49 Update deps; require PyGithub 2.8.1
move to the new digest property
2025-09-09 09:44:28 +02:00
Christoph Reiter
dc632d9934 build: custom makepkg config for building
During the build create a temporary config file in makepkg.conf.d
which changes some defaults.

For starters this sets the zstd compression, and bumps it for source
builds.

This allows us to make the default zstd config faster, while compressing
with a higher level in autobuild.
2025-08-28 19:00:00 +02:00
Christoph Reiter
3687fa3a0b Fix condition for selecting msys build
this happened to work by accident via substring match, and being
the last condition
2025-08-26 22:21:19 +02:00
Christoph Reiter
42b02362e1 Use ruff to upgrade code 2025-08-26 22:05:51 +02:00
Christoph Reiter
05abf4e953 Assume os.path.isjunction is available
now that we depend on Python 3.12+
2025-08-26 22:05:51 +02:00
Christoph Reiter
a3bae5a40c Drop support for Python 3.11
We kinda depend on os.path.isjunction, so just drop it
2025-08-26 22:05:43 +02:00
Christoph Reiter
456089ba22 Remove old compat code 2025-08-26 21:32:32 +02:00
Christoph Reiter
d15bda6f83 CI: update actions/checkout 2025-08-25 09:49:26 +02:00
Christoph Reiter
de38d16edd Update deps 2025-08-25 09:48:02 +02:00
Christoph Reiter
fd77359a5a Drop support for Python 3.10 2025-08-01 08:26:25 +02:00
Christoph Reiter
3581de3619 fix Generator usage with older Python
it doesn't have defaults there, so pass None
2025-08-01 08:19:56 +02:00
Christoph Reiter
84d3306857 Update mypy 2025-08-01 08:16:45 +02:00
Christoph Reiter
5bbfb7bb18 Verify checksums when downloading assets
In the last few weeks (I think) GH added checksums to the API responses
for release assets. Use them to verify the downloaded files.

Also bump the chunk size a bit while at it; it was quite small.
2025-08-01 08:13:06 +02:00
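The verification this commit describes could be sketched roughly as follows. The `verify_digest` helper, the chunk size, and the `"<algo>:<hexvalue>"` digest format are assumptions based on the commit message, not the actual implementation:

```python
import hashlib

# streaming chunk size, bumped from a smaller default (assumption)
CHUNK_SIZE = 1024 * 1024

def verify_digest(chunks, digest: str) -> bool:
    """Check downloaded chunks against a "<algo>:<hexvalue>" digest string,
    e.g. "sha256:9f86d0..." as exposed for release assets."""
    algo, _, expected = digest.partition(":")
    h = hashlib.new(algo)
    for chunk in chunks:
        h.update(chunk)
    return h.hexdigest() == expected
```

In practice the chunks would come from the streamed HTTP response; a mismatch would abort before the file is moved into place.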
Christoph Reiter
69ce064955 Update pygithub
there are API changes, so bump the minimum
2025-08-01 07:56:41 +02:00
Christoph Reiter
ab3c2437e8 Update deps 2025-07-22 06:38:31 +02:00
Christoph Reiter
70dec0bd33 CI: revert to windows-2022 for now
see https://github.com/msys2/MINGW-packages/pull/24677#issuecomment-3017919467
2025-06-30 08:22:56 +02:00
Christoph Reiter
54197e6af4 Update deps 2025-06-20 20:16:36 +02:00
Christoph Reiter
c237bc163a Update deps 2025-06-13 10:26:44 +02:00
Christoph Reiter
5f5d7aafa2 update default cycle breakers for winpthreads rename 2025-06-08 10:12:46 +02:00
Christoph Reiter
5c2504702e Update deps 2025-06-02 09:22:39 +02:00
Christoph Reiter
776a26e021 Update deps 2025-05-25 15:46:04 +02:00
Christoph Reiter
999e4e9327 Try to match the install paths of the packages CI more closely
Things are failing and the only difference in the logs are paths, so
try to get rid of that difference at least.
2025-05-19 09:32:07 +02:00
Christoph Reiter
3a5fc4c416 Update deps 2025-05-04 19:46:18 +02:00
Christoph Reiter
663b7acdc1 zizmor: allow unpinned setup-msys2
we trust our own code
2025-04-28 06:31:28 +02:00
Christoph Reiter
e8d10d7e9e config: switch to windows-11-arm for clangarm64 2025-04-16 19:59:15 +02:00
Christoph Reiter
caa6a73b53 CI: remove useless condition
as pointed out in https://github.com/msys2/msys2-autobuild/pull/112/files#r2047370653
if release==false then the location is not used anyway
2025-04-16 19:54:04 +02:00
Christoph Reiter
839b8befc3 config: fold MAXIMUM_BUILD_TYPE_JOB_COUNT into RUNNER_CONFIG as well 2025-04-16 07:24:19 +02:00
Christoph Reiter
a2fb8db0e7 config: add more runner specific config
instead of hardcoding them in multiple places
2025-04-16 06:34:39 +02:00
Christoph Reiter
311b4cd295 CI: run tests on windows-11-arm
still force x64 Python, since installing our deps fails
for arm64 there
2025-04-16 06:34:39 +02:00
Christoph Reiter
0d471ea5b7 build: try removing junctions before calling git clean
See https://github.com/msys2/msys2-autobuild/issues/108#issuecomment-2776420879

It looks like git can under some circumstances hang forever when trying
to clean the checkout when there are junction loops. So try to remove
them manually before calling git clean.

Fixes #108
2025-04-11 14:11:50 +02:00
Christoph Reiter
8d9cbcb54c Update deps 2025-04-11 10:52:11 +02:00
Christoph Reiter
23845c53e0 Update deps 2025-03-29 12:37:12 +01:00
Christoph Reiter
e9e823c2e7 build: try to use actions/cache@v4 for pip caching
To work around https://github.com/actions/setup-python/issues/1050
2025-03-12 06:48:14 +01:00
Christoph Reiter
fe4bcd08a9 CI: disable "update-environment" for "setup-python"
setup-python, by default, sets various cmake and pkg-config env
vars, so that packages using cmake can be built. Since this might
interfere with our package builds, disable it.

We only care about the Python executable itself, so use the action
output to create the venv.
2025-03-10 08:38:48 +01:00
Christoph Reiter
47cc05c39f CI: also use a venv for the Windows build job
To be more isolated from the host system
2025-03-10 08:30:20 +01:00
Christoph Reiter
a2ebb72da0 CI: use a venv for the linux jobs
To gain more isolation from the host
2025-03-09 19:26:37 +01:00
Christoph Reiter
4413e41389 Port to PEP 621
only dev deps left, for that we need PEP 735 which isn't in poetry yet
2025-03-07 16:41:28 +01:00
Christoph Reiter
d45f6720f4 CI: move to Python 3.13 2025-03-07 12:06:20 +01:00
Christoph Reiter
e2042058f1 gh: improve repo caching
We were caching based on the build type, but for most build types the repo
is the same, so cache one level below instead.
2025-03-07 12:04:27 +01:00
Christoph Reiter
bb54adc298 Add verbosity option and write logs to stderr by default
default is warning, -v means info, -vv means debug
2025-03-07 12:04:27 +01:00
Christoph Reiter
1ef3f8f5f5 Use new pygithub global lazy feature
Requires 2.6.0. Means data is only fetched if it is accessed,
so fewer API calls for us (hopefully).
2025-03-07 12:04:21 +01:00
Christoph Reiter
ca6dd299ee Update dependencies 2025-03-07 11:02:48 +01:00
Christoph Reiter
5f9bed8409 CI: remove VCPKG_ROOT workaround
This was to avoid breakage from https://github.com/actions/runner-images/pull/6192
But it was reverted in the image long ago: https://github.com/actions/runner-images/issues/6376
2025-03-07 10:07:04 +01:00
Christoph Reiter
625631832e CI: derive the build root from GITHUB_WORKSPACE
On hosted runners this means D:, on self-hosted it
will point to C: if there is only one drive.
2025-03-06 00:27:11 +01:00
Christoph Reiter
7ec5a79b46 CI: build on D: instead of C:
Related to 796ec1c1ba

D: is both faster and has more free space compared to C: with the current runner setup.
2025-03-05 23:19:23 +01:00
Christoph Reiter
a187346d08 Update deps 2025-02-15 15:11:06 +01:00
Christoph Reiter
b442168127 build: delete all junctions before calling "git clean"
git clean can't deal with junctions and in case there is a loop
it follows them forever (or until stack overflow).
https://github.com/git-for-windows/git/issues/5320

To work around this try to delete all junctions in the clean
re-try code path.

Fixes #108
2025-01-31 16:01:13 +01:00
Christoph Reiter
bdd38ec73c Update deps 2025-01-25 07:32:02 +01:00
Christoph Reiter
98f6ea2875 CI: set default permissions to make newer zizmor happy 2025-01-19 09:27:23 +01:00
Christoph Reiter
a977f9deb9 remove leftover debug print 2025-01-11 08:58:12 +01:00
Christoph Reiter
4f60392b3e make_tree_writable: handle junctions and add tests
As found out here, os.walk() by default follows junctions, which we don't
want and can even lead to loops:
https://github.com/msys2/msys2-autobuild/issues/101#issuecomment-2583121845

Integrate the workaround mentioned in the CPython bug report:
https://github.com/python/cpython/issues/67596#issuecomment-1918112817
Since this is Python 3.12+ only and we still support 3.10 make
it optional though.

This also adds tests, which uncovered some other minor issues:
It was not chmoding top-down, which meant that os.walk would
skip things if there were no read permissions. So chmod before
os.walk() lists the dir.
2025-01-10 21:32:14 +01:00
Christoph Reiter
35ff0b71b6 Update deps 2024-12-20 11:24:43 +01:00
Christoph Reiter
1575848e81 Move to windows-2025 2024-12-20 11:23:33 +01:00
مهدي شينون (Mehdi Chinoune)
657fd89531 remove clang32 2024-12-19 08:19:08 +01:00
Christoph Reiter
0f20d6bfa8 Reapply "CI: remove code scanning again"
This reverts commit c553f33cf05394be3733705adc4c3ad86e1a044d.

I still can't get it to work and I give up
2024-12-13 22:16:39 +01:00
Christoph Reiter
c553f33cf0 Revert "CI: remove code scanning again"
This reverts commit c5b593a34c9d51fac31f3c3e158db7b15a004804.

try the suggestion from
https://github.com/woodruffw/zizmor/discussions/291
2024-12-13 21:51:40 +01:00
Christoph Reiter
c5b593a34c CI: remove code scanning again
And just fail normally in the job if anything is found.
I can't get the code scanning to fail a check somehow.
2024-12-13 21:34:31 +01:00
Christoph Reiter
1bc0a28e35 CI: run zizmor 2024-12-13 20:55:43 +01:00
Christoph Reiter
0f71ee73cf Update deps 2024-12-06 17:43:59 +01:00
Christoph Reiter
4deb3111d3 CI: move to ubuntu-24.04
from ubuntu-22.04
2024-12-06 14:41:07 +01:00
Christoph Reiter
5bf958fd1b CI-hardening: move permissions to the job level
Instead of giving all jobs write permissions, default to no permissions
and enable them on a per-job basis.

This does not change anything for us, but avoids accidental write
permissions if a new job gets added without considering that it inherits
the top level permissions, even if it doesn't need them.

See https://woodruffw.github.io/zizmor/audits/#excessive-permissions
2024-12-06 14:17:15 +01:00
Christoph Reiter
7eed3d8bc1 CI-hardening: escape the msys2-location output
While that comes from our own action, so we can in theory trust it,
escape it for good measure. Can't hurt and silences a warning.
2024-12-06 13:59:38 +01:00
Christoph Reiter
7c78444174 CI-hardening: set persist-credentials=false for all actions/checkout
To avoid writing the token to disk. It still gets exposed via env vars
to various steps, but this removes the access from any steps before that.

As recommended by the zizmor scanner
2024-12-06 13:59:25 +01:00
Christoph Reiter
19c8f00aba CI: update to Python 3.12; also test with 3.13 2024-12-06 13:59:06 +01:00
jeremyd2019
a6b3079ae3 update package name in OPTIONAL_DEPS
The pkgbase was renamed from mingw-w64-clang to mingw-w64-llvm, but this still had the old name, so the cycle break with libc++ had to be specified manually
2024-11-12 23:28:24 +01:00
Christoph Reiter
acafab9b5f queue: fix missing cycles with build-only deps of deps
We only looked at the dependencies of a package that are needed for building,
but for detecting build cycles we also have to look at all transitive deps.

Unless the dependency is already finished, then we can ignore its build deps,
even if they are not finished yet.

The test shows such a case where things indirectly create a cycle via cmake.

Fixes #91
2024-10-26 20:06:33 +02:00
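The rule this commit describes — walk transitive deps, but stop expanding below a dependency that is already finished — might look roughly like this. All names and data shapes here are assumptions for illustration, not the real queue code:

```python
def deps_for_cycle_check(pkg, build_deps, finished):
    """Collect pkg's build deps plus, transitively, the build deps of any
    dependency that is not finished yet; finished deps end the recursion."""
    seen = set()
    todo = set(build_deps.get(pkg, ()))
    while todo:
        dep = todo.pop()
        if dep in seen:
            continue
        seen.add(dep)
        if dep not in finished:
            todo |= set(build_deps.get(dep, ()))
    return seen
```

If `pkg` shows up in its own closure (e.g. indirectly via cmake, as in the test case the commit mentions), a cycle has been found.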
Christoph Reiter
ef67d84096 Update deps 2024-10-26 14:27:05 +02:00
Christoph Reiter
7c56a1d764 Update deps 2024-10-07 07:44:41 +02:00
Christoph Reiter
cfdccd0a03 Update deps 2024-09-21 11:15:17 +02:00
Jeremy Drake
22f1e5ad0b Revert "Partially revert "CI: Update actions/setup-python""
The upstream issue has (finally) been fixed.

This reverts commit 3e617554bbe6fc206a4032e86b0cc79aedad42e6.
2024-08-29 20:52:12 +02:00
Christoph Reiter
05a051162d Update deps 2024-08-28 08:30:18 +02:00
Christoph Reiter
f968d2f0ca Update deps 2024-08-09 11:49:18 +02:00
Christoph Reiter
67d510ec4b CI: use the new setup-msys2 output for finding the install location 2024-08-03 13:50:05 +02:00
Christoph Reiter
f44d95e7c2 Update deps 2024-08-02 09:40:41 +02:00
Christoph Reiter
00495cb263 Update deps 2024-06-23 09:56:16 +02:00
Christoph Reiter
40ab937954 Update deps 2024-06-07 17:56:03 +02:00
Christoph Reiter
59bb7f6f18 fetch-assets: test all downloaded files with zstd
Test them before moving them to the final location.
This makes the download fail if there is some file corruption etc.

This adds a dependency on the zstd executable for the fetch-assets
command.

Motivated by https://github.com/msys2/msys2-main-server/issues/42
2024-05-25 14:03:54 +02:00
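The integrity check could be as simple as shelling out to `zstd --test`, which fully decodes the archive without writing output. A minimal sketch (the helper name is an assumption):

```python
import shutil
import subprocess

def zstd_test_file(path: str) -> bool:
    """Run "zstd --test" on a downloaded file: returns True only if the
    archive decodes cleanly, catching corruption before the file is
    moved to its final location. Requires zstd on PATH."""
    zstd = shutil.which("zstd")
    if zstd is None:
        raise RuntimeError("zstd executable not found")
    return subprocess.run([zstd, "--test", path],
                          capture_output=True).returncode == 0
```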
Christoph Reiter
bf3cf80161 Update deps 2024-05-25 13:10:28 +02:00
Christoph Reiter
63ea6585cd config: clean up manual build and ignore rdep lists
those packages either no longer exist, or should probably build now
that the CI runners are faster.
2024-05-20 10:20:22 +02:00
Christoph Reiter
ea149103be Update deps 2024-05-20 10:13:34 +02:00
Christoph Reiter
9c7e8d3135 Update deps 2024-04-17 08:13:54 +02:00
Christoph Reiter
8d08599c2e Update deps 2024-04-02 22:20:33 +02:00
Jeremy Drake
3e617554bb Partially revert "CI: Update actions/setup-python"
Due to actions/setup-python#819, it fails to install python on a Windows
11 (or presumably Server 2022) self-hosted runner, when a suitable
version of python was not already installed.

Closes #85

This partially reverts commit d5779cd65dbe2e5dceb418c040b7d0d505372294.
2024-03-18 05:42:33 +01:00
Christoph Reiter
9a0b6a31c9 Update deps 2024-03-17 16:22:03 +01:00
Christoph Reiter
8d7df1587a add missing urllib3 dep 2024-03-17 16:21:14 +01:00
Christoph Reiter
dad6671556 cache: old files didn't contain _ 2024-03-03 07:40:34 +01:00
Christoph Reiter
bf9a4e2862 cache: version the file using the library version instead
so we don't have to care about this in the future.
2024-03-03 07:37:29 +01:00
Christoph Reiter
719254cb89 bump the cache file
looks like it's not compatible anymore
2024-03-03 06:59:02 +01:00
Christoph Reiter
281ad3e16e Update deps 2024-03-02 21:22:33 +01:00
Christoph Reiter
d4515ba2fe CI: Update al-cheb/configure-pagefile-action 2024-02-02 19:13:29 +01:00
Christoph Reiter
b78070c653 Update deps 2024-02-01 21:08:42 +01:00
Christoph Reiter
aa0637d87b CI: Update actions/cache 2024-02-01 20:25:36 +01:00
Christoph Reiter
d5779cd65d CI: Update actions/setup-python 2024-01-30 07:24:44 +01:00
Christoph Reiter
1c45f2ab2e Update deps 2024-01-10 08:22:56 +01:00
Christoph Reiter
0eca067dd7 Update deps 2023-12-07 11:32:22 +01:00
Christoph Reiter
1ed7c15c97 flake8 fixes 2023-10-22 16:19:57 +02:00
Christoph Reiter
dae5e305db CI: Update to actions/checkout@v4 2023-10-22 16:16:45 +02:00
Christoph Reiter
1d8af300c4 Move flake8 config from setup.cfg to .flake8
We don't use setuptools, so this makes things clearer
2023-10-22 16:07:49 +02:00
Christoph Reiter
1f4971c293 Drop support for Python 3.8/9 2023-10-22 16:03:57 +02:00
Christoph Reiter
fd1d5cc9ef Update deps 2023-10-22 15:54:52 +02:00
Christoph Reiter
e6700d2089 Disable the new pygithub read throttling
It seems a bit excessive and doesn't take into account that
lots of our requests hit the cache via etags.
2023-10-16 20:28:22 +02:00
Christoph Reiter
d1048413f8 Update pygithub to v2
It now has its own default retry logic that fits the GH API,
so no longer pass our own and assume it handles things better.

The datetimes are now timezone aware, so we no longer have to fix
them.
2023-10-16 20:21:47 +02:00
Christoph Reiter
3e0391eb26 Update mypy 2023-10-16 19:44:28 +02:00
Christoph Reiter
049635cd1a Handle disabling certain build types
Even if the API returns them, if they are not in the active list they
will be ignored.
2023-10-16 18:58:08 +02:00
Christoph Reiter
ca30448b74 Update deps 2023-10-16 18:28:03 +02:00
Christoph Reiter
a79a8c4c7a Update deps 2023-09-25 21:39:59 +02:00
Christoph Reiter
3f5f60aa62 Remove Alexpux from uploaders
He hasn't required this in a while, so remove.

Feel free to ask to be re-added.
2023-09-16 14:07:08 +02:00
Christoph Reiter
79a45bf6c7 Require a user confirmation for manual uploads
We currently allow some users to manually upload packages (in case
they take too long for CI, or to bootstrap things).

In case of an account takeover this would allow an attacker to upload/replace
files in staging. To reduce the risk a bit ask for confirmation when downloading
the manually uploaded files.

Also add a "--noconfirm" option so we can avoid the questions in the staging
download script.

Ideally we would require users to sign their files, but this helps a bit at least.
2023-09-16 14:07:08 +02:00
Christoph Reiter
0852421d17 Update deps 2023-08-29 07:38:58 +02:00
Christoph Reiter
f368fb4951 update_status: handle github returning 404 for assets returned by the API
some time after replacing the file it randomly returns 200 and 404 a few times
until it settles on 200.
2023-08-19 23:11:54 +02:00
Christoph Reiter
0af6deb998 CI: allow the "Configure Pagefile" step to fail
it's only a requirement for some packages (flang), and there is a much higher
chance that it fails for a job that doesn't need it currently.
2023-08-16 20:55:46 +02:00
Christoph Reiter
c9fb5c61ab Update dependencies 2023-08-16 20:53:29 +02:00
Christoph Reiter
a3a5c1da40 update-status: only replace the status file if something has changed
Before uploading the status file we make a cached request for the old status
content and if there is no difference we don't upload anything.

This reduces the amount of write API calls and the amount of useless
packages.msys2.org refreshes a bit.
2023-08-16 20:40:22 +02:00
Christoph Reiter
1f1fabade2 clean-assets: only re-create releases if there are many assets
re-creating causes notifications for users. While users can disable them
let's just limit it to larger rebuilds, like the Python rebuilds

Fixes #77
2023-08-01 08:07:23 +02:00
Christoph Reiter
4db4e22d09 clean-assets: delete release in case all assets need to be deleted
In case a release has hundreds of files that need to be deleted this
requires quite a bit of time and also works against the API rate limiting.

In case we want to delete all assets of an release just delete and
re-create the whole release instead.

Fixes #77
2023-07-30 15:00:23 +02:00
Christoph Reiter
5b61a937a1 Update dependencies 2023-07-29 21:10:41 +02:00
Christoph Reiter
edc9089808 clean-assets: fewer parallel deletes
we are still hitting the secondary rate limit
2023-07-29 21:05:00 +02:00
Christoph Reiter
a1540964f5 Remove winjitdebug again
things should be fixed with Python 3.11
2023-07-24 18:22:19 +02:00
Christoph Reiter
95ab14dfe7 Update requirements.txt 2023-06-30 22:04:57 +02:00
Christoph Reiter
d68ad18de2 Port to new pygithub auth API 2023-06-30 22:00:01 +02:00
Christoph Reiter
305e7b4c68 Update dependencies 2023-06-30 21:57:33 +02:00
Christoph Reiter
79096b753c build: disable setup-msys2 caching for the arm64 runner
it's not really helping in case of self-hosted runners,
so just disable it there.
2023-05-28 21:02:12 +02:00
Christoph Reiter
6d6d83ea3e CI: Python 3.10 -> 3.11
looks like all dependencies have wheels for 3.11 now
2023-05-27 08:54:06 +02:00
Christoph Reiter
f78c47f441 Update dependencies 2023-05-26 22:49:17 +02:00
مهدي شينون (Mehdi Chinoune)
13b6b27fea Enable autobuild for qt5-static 2023-05-20 19:10:29 +02:00
Christoph Reiter
dfc132af9d download_text_asset: don't use the cache
This gets called in a threadpool and something in requests_cache
deadlocks.
2023-05-08 09:15:06 +02:00
Christoph Reiter
1a8a881082 build: try to make all files writable if git clean fails
I'm again not sure if this helps, but let's see..
2023-04-13 18:03:47 +02:00
Christoph Reiter
aa61bfdedd Update dependencies 2023-04-07 19:29:24 +02:00
Christoph Reiter
b51cfd02af Avoid upload_asset()
git fails to delete files we have uploaded, and I'm wondering if
upload_asset() is somehow keeping a handle open. While I can't find
anything suspicious in pygithub let's make the file handling explicit
and open/close ourselves.
2023-04-07 19:21:42 +02:00
Christoph Reiter
76a815c145 build: enable core.longpaths for the git repo
so "git clean" can potentially remove overly long paths created
during build time.
2023-04-07 19:12:47 +02:00
Christoph Reiter
236220ef8e typo 2023-04-07 10:43:40 +02:00
Christoph Reiter
60a287290d build: try to run git clean multiple times before giving up
For example it failed with:

warning: failed to remove B/mingw-w64-clang-i686-seacas-2023.02.03-2-any.pkg.tar.zst: Invalid argument

We now always use the same build directory, so if files can't be removed
we fail. Retry git clean/reset a few times before giving up and also
try before we start so in case it is fixed while the job isn't running on
a self-hosted runner we can continue automatically.
2023-04-07 10:41:49 +02:00
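The retry behavior described here can be sketched with a generic retry helper; the flags and function names below are assumptions, not the real invocation:

```python
import subprocess
import time

def retry(func, attempts: int = 3, delay: float = 0.0):
    """Call func() up to `attempts` times, sleeping between tries and
    re-raising the last failure; transient file locks on Windows often
    clear up by a later attempt."""
    for attempt in range(attempts):
        try:
            return func()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(delay)

def git_clean(repo: str) -> None:
    # hypothetical flags; the real clean/reset invocation may differ
    retry(lambda: subprocess.run(["git", "clean", "-xdff"],
                                 cwd=repo, check=True))
```

Running the same check before the build starts means a self-hosted runner can recover automatically once the underlying problem is gone.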
Christoph Reiter
cc301e1e62 build: shorter build paths
see https://github.com/msys2/msys2-autobuild/issues/71
2023-04-06 08:53:11 +02:00
Christoph Reiter
f3bf1b80b0 Revert "CI: run every 2 hours instead of 3"
This reverts commit 3116e844bee9bb9515ea892b37238e54cf2fcb98.
2023-04-05 16:58:30 +02:00
Christoph Reiter
3ef72c5eed CI: try per-job concurrency
so that we start jobs even if other jobs from a previous workflow are still running
2023-04-05 16:52:08 +02:00
Christoph Reiter
ccaad93b62 Don't ignore rdeps for mingw-w64-qt6-static
let's try
2023-04-05 07:43:58 +02:00
Christoph Reiter
fb16cedabf config: re-enable automatic builds for mingw-w64-qt6-static
see https://github.com/msys2/MINGW-packages/pull/16637#issuecomment-1496360513
2023-04-05 07:42:33 +02:00
Christoph Reiter
3116e844be CI: run every 2 hours instead of 3 2023-03-24 17:18:21 +01:00
Christoph Reiter
e3bb36afac more type annotations 2023-03-24 14:09:24 +01:00
Christoph Reiter
30fbfffb96 CI: installing wheel shouldn't be needed anymore
pip pulls it in now if needed
2023-03-24 13:44:21 +01:00
Christoph Reiter
8cb3c65f55 turns out matrix in needs is broken
https://github.com/orgs/community/discussions/25364
2023-03-24 13:27:50 +01:00
Christoph Reiter
7417496d9e write_build_plan: rework + build src last
* don't show the cycles when generating the build plan
  (we have other places that show it now)
* interleave the different build types when generating jobs
* make the src jobs depend on the non-src jobs, as src builds
  depend on either msys or ucrt64 build results and otherwise
  will just stop due to missing deps. Could be improved by only
  depending on msys/ucrt64, but this is still an improvement.
2023-03-24 13:19:12 +01:00
Christoph Reiter
c27f9a7c40 we can only clean assets for the current repo 2023-03-23 12:32:59 +01:00
Christoph Reiter
19857e3fa0 looks like fromJson() can't handle newlines 2023-03-23 12:07:49 +01:00
Christoph Reiter
956ac59246 write_build_plan: remove the check for running workflows
This was required to avoid running multiple builds at the same time.
But GHA now has concurrency groups which solves the same problem,
so drop that code
2023-03-23 11:59:37 +01:00
Christoph Reiter
ba632451ef README: document the env vars 2023-03-23 11:59:37 +01:00
Christoph Reiter
606b782bb0 config: add option to limit the job count for specific build types
Limit src builds because they are quite fast anyway, and clangarm64
because the self hosted runner can only do one job at a time.
2023-03-23 11:58:17 +01:00
Christoph Reiter
e2ca121180 Replace readonly with write everywhere
less confusing, at least for me
2023-03-23 11:58:17 +01:00
Christoph Reiter
b453032363 Get rid of MAIN_REPO
in most cases at least. either derive from the current
build type, or via get_current_repo() which reads the
GITHUB_REPOSITORY env var.
2023-03-23 11:58:17 +01:00
Christoph Reiter
98697683a5 main: remove --repo option again
this was meant for the arm runner, but it was never used.
2023-03-23 11:58:17 +01:00
Christoph Reiter
6f93057f83 make the tests a package
to make pytest happy
2023-03-23 11:58:13 +01:00
Christoph Reiter
88871c4cb0 Rename _PathLike to PathLike
it's no longer internal
2023-03-23 11:17:10 +01:00
Christoph Reiter
ad34ca14b6 Move some hard coded IDs to the config 2023-03-23 11:17:10 +01:00
Christoph Reiter
e0e19de2c1 Add some unit tests
just one to get things started
2023-03-22 12:47:27 +01:00
Christoph Reiter
5085f864b3 Missed one command 2023-03-22 11:13:33 +01:00
Christoph Reiter
6f40845ba3 README: add a short description and remove the process info
the process info is now moved to the main MSYS2 documentation
2023-03-22 10:42:42 +01:00
Christoph Reiter
6788467670 README: update the CLI help output 2023-03-22 10:09:36 +01:00
Christoph Reiter
87f0603c87 Split the code up into separate modules
with minimal code changes
2023-03-22 09:59:05 +01:00
Christoph Reiter
0d25d51a04 Convert the script to a Python package
It can now be invoked via `python -m msys2_autobuild` or
by installing it, which adds a "msys2-autobuild" script.

This is a first step towards splitting up the code.

The HTTP cache is now stored in the working directory
instead of the source directory.
2023-03-21 11:34:39 +01:00
Christoph Reiter
d0ddf60737 Update dependencies 2023-03-18 10:40:06 +01:00
Christoph Reiter
91ab34350f cache: clean up at the end and limit to 3 hours
it's unlikely there will be many hits after some hours, so better
keep the upload size low. Also clean at the end to make
the upload smaller.
2023-02-19 17:02:55 +01:00
Christoph Reiter
38e6bc6e47 requests_cache: port to new cache cleanup function
I find the API still confusing, but it's better than before.
2023-02-19 16:43:35 +01:00
Christoph Reiter
6ccea00bba Bump the max number of jobs again
Since the last commit we should need fewer API calls
2023-02-19 16:10:29 +01:00
Christoph Reiter
c152a6dbbf Depend on the new pygithub assets API
This exposes the assets inline from a release, so this
should save us lots of requests. Available since v1.58.0
2023-02-19 16:08:51 +01:00
Christoph Reiter
b7df29ff56 CI: skip installing wheel
This was for packages without wheels to build them initially.
In theory newer pip should handle this automatically, let's see
2023-02-19 16:07:19 +01:00
Christoph Reiter
77c2d02a4d Update dependencies 2023-02-19 16:03:41 +01:00
Jeremy Drake
aea263ec2c CI: Remove enabling of clangarm64 in pacman.conf
It is now enabled by default so this is a no-op.
2023-01-29 00:33:11 +01:00
Jeremy Drake
1666f6d3b0 Allow building qt6-static on clangarm64.
The timeout on a self-hosted runner is much larger (72 hours, though
there seemed to be a different limit related to the token hit before
reaching that).
2023-01-18 19:44:42 +01:00
Christoph Reiter
63a1b6020e Extend manual build for mingw-w64-qt5-static to clang32/64
seems the clang build got slower, it now hits the 6h limit always
2023-01-18 18:36:19 +01:00
Christoph Reiter
77a21114a8 CI: set "MSYS" env var later
so the cache action doesn't override it
See https://github.com/actions/toolkit/issues/1312
2023-01-14 12:33:26 +01:00
Christoph Reiter
6e2c5b47d4 Revert "Unset MSYS env everywhere"
This reverts commit e2f4f874a20304bc94047ddf92ca63a9ee9aa5e5.

We are depending on it being set in CI, so this isn't the right approach
2023-01-14 12:32:40 +01:00
Christoph Reiter
e2f4f874a2 Unset MSYS env everywhere
See https://github.com/actions/toolkit/issues/1311
2023-01-14 07:58:13 +01:00
Christoph Reiter
63f65d30bc Delete old assets in a thread pool
To speed things up a bit
2023-01-01 11:34:17 +01:00
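Since each asset deletion is an independent HTTP call, a small thread pool hides most of the latency. A sketch, assuming PyGithub `GitReleaseAsset` objects (which expose `delete_asset()`):

```python
from concurrent.futures import ThreadPoolExecutor

def delete_assets(assets, workers: int = 8) -> None:
    """Delete release assets concurrently; the worker count is an
    assumption, tuned against API rate limits in practice."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # list() forces iteration so worker exceptions propagate here
        list(pool.map(lambda asset: asset.delete_asset(), assets))
```

Note the later "clean-assets: fewer parallel deletes" commit: too many workers can trip GitHub's secondary rate limit.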
Christoph Reiter
307799fd27 Update the status file format and include cycles
This moves it closer to the buildqueue format, and also includes cycles,
and allows future additions.
2022-12-27 16:16:47 +01:00
Christoph Reiter
bf82f9fff2 Don't include broken cycles in the cycle output 2022-12-27 16:16:08 +01:00
Christoph Reiter
a9862b27c1 Missed one left over src build-type 2022-12-24 00:03:29 +01:00
Christoph Reiter
2ae439cd00 Build all source packages in a separate build job
See https://github.com/msys2/msys2-autobuild/issues/69

Building source packages requires git etc to be installed, but
ideally we wouldn't pollute the builder with extra packages that
it doesn't explicitly require.

To avoid this build msys and mingw source packages in a separate job.
2022-12-23 23:53:08 +01:00
Christoph Reiter
21a84297d8 Update deps 2022-12-21 12:09:05 +01:00
Christoph Reiter
e22cc1cc17 Update dependencies 2022-12-10 21:55:46 +01:00
Christoph Reiter
eee25ec33f CI: run on ubuntu-22.04 2022-12-10 21:21:01 +01:00
Christoph Reiter
59e8e1af5d CI: create a larger pagefile
so we can build flang in CI, same as https://github.com/msys2/MINGW-packages/pull/13791
2022-10-29 21:18:54 +02:00
Christoph Reiter
1fd41adbfa CI: test with 3.11 2022-10-27 08:05:20 +02:00
Christoph Reiter
e94b92f73e Update deps 2022-10-27 08:04:11 +02:00
Christoph Reiter
5d06444a57 CI: port away from ::set-output 2022-10-21 13:24:22 +02:00
Christoph Reiter
9d582e19b1 Build src packages in an ucrt64 env
It will be the new default
2022-10-10 18:39:24 +02:00
Christoph Reiter
bf34129d62 Update dependencies 2022-10-09 20:57:44 +02:00
Christoph Reiter
c9dd9afe5e Unset VCPKG_ROOT during build
see https://github.com/msys2/MINGW-packages/pull/13368
2022-10-02 12:35:50 +02:00
Christoph Reiter
b40229daa6 Drop BUILD_TYPES_WIP
This wasn't complete as it would only ignore broken builds
for direct deps and not indirect ones, but kinda worked in ignoring
some arm64 errors.

But it also causes problems if an error is ignored and the other arches
get uploaded. Then it's hard to roll back the update because lots of
packages with the new version are already in the repo.

With the new autobuild controller we can also restart flaky builds instead
of ignoring them and waiting for jeremy to fix them later.

Let's try removing that special case.
2022-09-20 08:02:56 +02:00
Jeremy Drake
253f8b8c4c GHA: accept extra 'context' input
This is meant for the invoker (ie, msys2-autobuild-controller) to
provide additional information to be logged with the job (specifically,
what user requested it).
2022-09-06 21:58:16 +02:00
Jeremy Drake
c03c642719 GHA: log workflow_dispatch inputs in job 2022-09-06 21:58:16 +02:00
Christoph Reiter
f581199930 try running the real pacman with exec
it seems like the pacman wrapper doesn't survive a runtime update.
try exec to avoid returning control to bash
2022-09-06 21:11:16 +02:00
Christoph Reiter
3637fea711 Update dependencies 2022-09-04 10:33:47 +02:00
Christoph Reiter
e23492ee15 also retry on 502
We just got "502 {"message": "Server Error"}" on a DELETE
2022-09-02 21:53:46 +02:00
Christoph Reiter
9f4f288d00 retry HTTP requests which return 500
We are getting "500 null" randomly recently, maybe this helps.
2022-08-27 13:51:32 +02:00
Christoph Reiter
b36a4da1da Use a temporary pacman.conf during building
Up until now we created a backup of pacman.conf and restored it after
the build was done. This can leave the environment in an undefined state
if something crashes in between.

Instead create a temporary pacman.conf and use that during building.
In theory pacman allows setting a custom config via "--config", but
makepkg doesn't expose this, so that's not working. Luckily makepkg
allows overriding the pacman path via the PACMAN env var, so we create
a temporary script which just forwards everything to pacman and always
sets the temporary config.
2022-08-26 15:09:05 +02:00
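The PACMAN env var trick described in this commit can be sketched as follows. All paths here are hypothetical, and a stub script stands in for the real `/usr/bin/pacman` so the forwarding can be demonstrated anywhere:

```shell
set -e
tmpdir=$(mktemp -d)

# stub standing in for the real pacman binary
printf '#!/bin/sh\necho "pacman got: $*"\n' > "$tmpdir/pacman.real"
chmod +x "$tmpdir/pacman.real"

# the temporary wrapper: always inject --config, forward all other arguments
cat > "$tmpdir/pacman" <<EOF
#!/bin/sh
exec "$tmpdir/pacman.real" --config "$tmpdir/pacman.autobuild.conf" "\$@"
EOF
chmod +x "$tmpdir/pacman"

# makepkg would then be invoked as: PACMAN="$tmpdir/pacman" makepkg ...
"$tmpdir/pacman" -Syu --noconfirm
```

Because makepkg only lets you override the pacman *executable* (via `PACMAN`), not its config path, the config pinning has to live inside the wrapper itself.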
Christoph Reiter
5ecdbc97a7 thinko
we are in bash here, not powershell...
2022-08-21 21:52:59 +02:00
Christoph Reiter
0d4680c01f Don't use workflow inputs directly in scripts
They could inject commands that way. Instead assign them
to an env var and then use that env var in the powershell scripts.

We want to open those controls up to more people, so we need to make
sure they can only change the values and not extract tokens etc.

Fixes #60
2022-08-21 20:37:07 +02:00
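The injection risk this commit closes comes from interpolating `${{ }}` expressions straight into a script. An illustrative (not verbatim) before/after:

```yaml
steps:
  # BAD: the input is expanded into the script text before the shell parses
  # it, so shell metacharacters in the input become shell code
  - run: echo "${{ github.event.inputs.optional_deps }}"

  # GOOD: the runner passes the value through the environment; the shell only
  # ever sees an env var expansion, never attacker-controlled syntax
  - env:
      OPTIONAL_DEPS: ${{ github.event.inputs.optional_deps }}
    run: echo "$OPTIONAL_DEPS"
```

The workflow diff further down uses exactly this `env:` indirection for `OPTIONAL_DEPS` and the clear-failed inputs.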
Christoph Reiter
9374b1d9b4 main: run update-status at the end 2022-08-15 14:03:50 +02:00
Jeremy Drake
22ea970beb CI: uncomment clangarm64 from pacman.conf.
instead of adding from scratch.  Once the commented-out section was
added, the grep would match that and no longer run the sed to add it.

Also remove line adding clang32 section because that was added to
default pacman.conf (and was thus a no-op).
2022-08-14 22:39:52 +02:00
Christoph Reiter
45c6b89ec7 Update the build status before stopping due to timeout
In case the job stops because it has reached the time limit it would
not update the build status and just quit. Move the timeout check
a bit later to avoid that.
2022-08-08 18:50:27 +02:00
Christoph Reiter
70c6903191 CI: update python to 3.10 and setup-python to v4 2022-08-04 21:45:15 +02:00
Christoph Reiter
f33be41b0f Update dependencies 2022-08-04 21:39:19 +02:00
Christoph Reiter
5f53dab6de Enable winjitdebug to workaround python crashing issues
Why this helps, I don't know..
2022-07-24 12:31:25 +02:00
Christoph Reiter
4dbd2618fb Update deps 2022-07-21 21:54:07 +02:00
Christoph Reiter
a43bdf9479 Add a comment as to why we split up repo-add calls
This was pointed out here: 7d84a7e086 (r75916830)
2022-06-30 21:07:00 +02:00
Christoph Reiter
9360a8eebe Update dependencies 2022-06-30 21:04:54 +02:00
Christoph Reiter
7d84a7e086 Limit the amount of packages added with repo-add in one go
It errors out if there are too many (maybe memory?)
2022-06-12 18:28:48 +02:00
Christoph Reiter
ea46306e71 Add config key for limiting the max job count
We are hitting the API limit again, so reduce from 15 to 12.
This also allows self hosted runners to limit to 1 if needed.
2022-06-11 18:32:11 +02:00
Christoph Reiter
97faefb5b3 Update deps
requests-cache now provides an option to always revalidate, so use that.
Before it mostly worked by accident.
2022-05-29 10:29:38 +02:00
Christoph Reiter
745e5e2c40 Fetch pgp keys before upgrading packages
To avoid any ABI bump breaking pgp
2022-05-08 20:08:43 +02:00
Christoph Reiter
84315e8e56 cache: include the job ID in the cache key
we just want to store/replace it every time, so we need it to be as unique as possible
2022-05-04 19:53:03 +02:00
Christoph Reiter
0a302154b0 Cache the cache
This should lead to cache hits on the first calls from the spawned
jobs right after the scheduler runs.
2022-05-04 18:55:48 +02:00
Christoph Reiter
4a5355f5dc Another try at fixing the cache race
Turns out disabling the cache just disables the monkey patching,
so we have to disable it when creating the session object and not when
we are using the cache.

Create a session object without cache in the main thread at first,
so that the download code can re-use it as is later on.
2022-04-30 22:19:15 +02:00
Christoph Reiter
88b49f2c6a Avoid disabling the cache in a thread pool
It isn't thread safe, so wrap the outer code instead and just
assert in download_asset() that the caching is disabled so it's
not called with caching by accident.
2022-04-30 22:07:40 +02:00
Christoph Reiter
1684dff8bc Update mypy 2022-04-30 17:28:08 +02:00
Christoph Reiter
8870b3a342 Use requests-cache for adding etag/last-modified based caching
This doesn't speed things up usually, since we still make the same amount
of requests, but it doesn't count against the rate-limit in case there
is a cache hit. Also there is a smaller chance of things going wrong,
since we don't transfer any payload.

The cache is stored in a .autobuild_cache directory using a sqlite DB.
2022-04-30 17:15:18 +02:00
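The rate-limit benefit of etag caching comes from conditional requests: a revalidation that returns 304 carries no payload. A toy model of the mechanism (this is not the requests-cache API; `fetch()` stands in for a real HTTP server):

```python
SERVER_ETAG = '"v1"'
SERVER_BODY = b"package index"

def fetch(headers: dict) -> tuple[int, dict, bytes]:
    # a 304 response carries no body, which is what keeps revalidation cheap
    if headers.get("If-None-Match") == SERVER_ETAG:
        return 304, {"ETag": SERVER_ETAG}, b""
    return 200, {"ETag": SERVER_ETAG}, SERVER_BODY

cache: dict[str, tuple[str, bytes]] = {}

def get(url: str) -> bytes:
    headers = {}
    if url in cache:
        etag, _ = cache[url]
        headers["If-None-Match"] = etag          # ask the server to revalidate
    status, resp_headers, body = fetch(headers)
    if status == 304:
        return cache[url][1]                     # still fresh: reuse cached body
    cache[url] = (resp_headers["ETag"], body)    # new version: store etag + body
    return body
```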
Christoph Reiter
258256e739 use lru_cache for Python 3.8 compat 2022-04-30 16:23:49 +02:00
Christoph Reiter
133ce88284 Cache the main github api instances
this leads to a shared session, and a bit fewer requests
2022-04-30 16:16:54 +02:00
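Caching a constructor with `lru_cache` is what makes every caller share one API instance (and thus one session). A minimal sketch, with a hypothetical `APIClient` standing in for the pygithub object:

```python
from functools import lru_cache

class APIClient:
    """Stand-in for the Github API object the commit actually caches."""
    instances = 0

    def __init__(self) -> None:
        APIClient.instances += 1

@lru_cache(maxsize=None)  # Python 3.8 compatible; 3.9+ could use functools.cache
def get_api_client() -> APIClient:
    return APIClient()
```

Repeated calls return the identical object, so connection pooling and any per-instance caches are shared for free.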
Christoph Reiter
099438dc3f queue_website_update: just log errors instead of failing
this is optional really
2022-04-30 12:32:10 +02:00
Christoph Reiter
94d87dac25 Retry non-pygithub HTTP requests as well
There isn't an easier way to enable retries with requests sadly.
This also shares the session between all non-pygithub requests, so
could make things a bit faster.
2022-04-30 12:27:43 +02:00
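The usual way to get retries out of requests is mounting an `HTTPAdapter` carrying a urllib3 `Retry` onto a shared `Session`. A sketch along the lines of these commits (status codes per the 500/502 commits above; the exact counts and backoff are assumptions):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def make_retrying_session() -> requests.Session:
    # retry transient server errors; 500 and 502 were seen randomly in practice
    retry = Retry(
        total=3,
        backoff_factor=1,
        status_forcelist=[500, 502],
        allowed_methods=["GET", "HEAD", "DELETE"],  # a DELETE also hit a 502
    )
    session = requests.Session()
    adapter = HTTPAdapter(max_retries=retry)
    # mount for both schemes so every request through this session retries
    session.mount("https://", adapter)
    session.mount("http://", adapter)
    return session
```

Reusing one such session everywhere also gives the connection-pooling speedup the commit mentions.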
Jeremy Drake
4384e62d01 Use labels to restrict self-hosted runner selection. 2022-04-19 07:46:26 +02:00
Christoph Reiter
cb4434c72b CI: Move the clear-failed actions into its own workflow
So they can be run independently.
2022-04-18 17:29:31 +02:00
jeremyd2019
892e1a3206 break clang/libc++ cycle
before libc++ was split off from clang package, it was built after clang within the same PKGBUILD, so this order seems reasonably safe.

Also remove a couple of prior cycle breaks from before it was possible to break them manually in a run.  These packages are not related by the same source repo and release, like mingw-w64 and llvm-project are, so are less likely to consistently require a cycle break on every upstream update.
2022-04-05 09:39:39 +02:00
Christoph Reiter
5f5d895cb1 move more common inputs up 2022-04-01 20:11:18 +02:00
Christoph Reiter
e4c2d446d2 Include the input name in the description 2022-04-01 20:09:25 +02:00
Christoph Reiter
cfe519fbb0 clear-failed: Allow clearing the failed state for packages via a workflow input 2022-04-01 20:05:38 +02:00
Christoph Reiter
1b14e2ed4d cycles: skip cycles where one of the packages is already built 2022-04-01 18:36:32 +02:00
Christoph Reiter
8c060f3142 cycles: show the version change of all packages 2022-04-01 15:44:45 +02:00
jeremyd2019
c2f77181d7 work around powershell arg parsing fail
It appears that powershell doesn't properly handle an empty argument, resulting in all the subsequent arguments being shifted left by one.

So, don't specify --optional-deps argument if it is empty.
2022-03-30 23:56:20 +02:00
Christoph Reiter
cd67c3a66a show: also set optional deps 2022-03-30 17:29:59 +02:00
Christoph Reiter
5e037680d6 also add optional deps when checking if we should run 2022-03-30 17:17:04 +02:00
Christoph Reiter
5f628fb63a and now for real 2022-03-30 17:03:37 +02:00
Christoph Reiter
777bbb73af Try to make it possible to pass optional dependencies to the workflow
The idea is that in case of a cycle we explicitly break it on a
case-by-case basis.
2022-03-30 17:00:20 +02:00
Christoph Reiter
c1807c19a7 show the cycles also when writing the build plan
Otherwise we don't see it in CI, since the rest is skipped if there
is nothing to build.
2022-03-30 10:08:04 +02:00
Christoph Reiter
20ba53752d show: default to not fetch the build log URLs
Instead add a --details option.

It's quite slow and rarely needed, so default to off.
2022-03-30 10:00:18 +02:00
Christoph Reiter
81dd6cabad show: include a list of dependency cycles
This gives all cycles in the queue right now, ignoring the build
status of the packages.

If one part of the cycle is already built then it will not matter.
2022-03-30 09:57:02 +02:00
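Listing all cycles in the queue amounts to a DFS over the package dependency graph. A hedged sketch (the `name -> set of dep names` graph shape is an assumption, not the tool's actual data model):

```python
def find_cycles(deps: dict[str, set[str]]) -> list[list[str]]:
    """Return dependency cycles as paths that start and end on the same node."""
    cycles: list[list[str]] = []
    visited: set[str] = set()

    def dfs(node: str, stack: list[str]) -> None:
        if node in stack:
            # close the loop from the first occurrence on the current path
            cycles.append(stack[stack.index(node):] + [node])
            return
        if node in visited:
            return
        visited.add(node)
        for dep in sorted(deps.get(node, ())):
            dfs(dep, stack + [node])

    for name in sorted(deps):
        dfs(name, [])
    return cycles
```

As the later commit notes, a reported cycle only matters if none of its members is already built.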
Christoph Reiter
5c6f39a511 break cycle 2022-03-29 20:26:19 +02:00
Christoph Reiter
e7fdb6dab2 fix yaml types 2022-03-19 10:17:22 +01:00
Christoph Reiter
d423d68901 CI: test with Python 3.10 2022-03-19 10:16:30 +01:00
Christoph Reiter
4cc7908a95 CI: update to Python 3.9 2022-03-19 10:15:12 +01:00
Christoph Reiter
0cf933cc9b CI: run every 3 hours instead of 4
it just takes 20 secs if there is nothing to do
2022-03-18 16:36:49 +01:00
Christoph Reiter
93dd330288 CI: run the schedule job on Ubuntu
We don't need Windows there, so let's give it a try.
2022-03-18 16:09:08 +01:00
Christoph Reiter
548cd95a30 break librsvg/gtk3 cycle 2022-03-18 09:05:46 +01:00
Christoph Reiter
a8d63e2852 Some cleanup; don't break cycles if the dep isn't in the repo
When bootstrapping a cycle we can't fall back to the repo, so
someone has to upload the package manually and we shouldn't try
to build it before that.
2022-03-12 08:53:05 +01:00
Christoph Reiter
1e254ee060 Another optional dep 2022-03-11 15:50:06 +01:00
Christoph Reiter
154402b355 Wrong package name
oops
2022-03-11 15:30:15 +01:00
Christoph Reiter
0ed108506a Add a list of optional dependencies
Since we no longer break cycles in msys2-web we have to do it here.
This adds a list of optional deps for some packages. If they are there
they will be used, if not they will be ignored.

By hardcoding it we should get a more deterministic result, but
not sure if this scales well.
2022-03-11 15:20:47 +01:00
Christoph Reiter
be6f6f2a28 Update deps 2022-03-10 20:12:39 +01:00
Christoph Reiter
8144f50ad5 CI: switch from actions/cache to using the builtin cache feature of actions/setup-python
One thing less to care about, and one less node12 action
2022-03-10 20:04:45 +01:00
Christoph Reiter
51e8ee9f76 Update some actions 2022-03-07 19:33:34 +01:00
Christoph Reiter
7e96898a06 Stop setting GIT_COMMITTER_NAME/EMAIL
We no longer use "git am" in PKGBUILD files
2022-03-05 13:15:22 +01:00
Jeremy Drake
451dca0a27 run upgrade/downgrade twice for updated assets
If one or more of the assets are in the 'core' set (such as bash,
recently), only the 'core' packages will be upgraded/downgraded in the
first run.
2022-02-20 09:01:16 +01:00
Christoph Reiter
a316cb96c2
mermaid: don't set a theme
doesn't play well with dark mode
2022-02-17 18:09:33 +01:00
Christoph Reiter
9ff6282fd6 Use new markdown mermaid support for the process diagram 2022-02-17 17:15:02 +01:00
Christoph Reiter
8b9b746cfa Revert "Revert setting GIT_COMITTER_NAME/EMAIL. It doesn't do anything."
This reverts commit 9b01428dde1d2476f177c2437bdd60063ec8147c.
2022-01-25 21:09:36 +01:00
Christoph Reiter
9b01428dde Revert setting GIT_COMITTER_NAME/EMAIL. It doesn't do anything.
Since https://github.com/msys2/MSYS2-packages/commit/97491f06184abf6
makepkg sets them, so this wasn't really doing anything.
2022-01-25 20:17:16 +01:00
Christoph Reiter
3e28396ab0 Update deps 2022-01-14 16:41:36 +01:00
Christoph Reiter
6c461095e0 CI: don't install git by default
We only needed it to configure the committer name/email which is now
done via env vars.

If a package still needs git we pull it in via makedepends.
2022-01-14 16:15:42 +01:00
Christoph Reiter
3a63bf21e1 Set GIT_COMMITTER_NAME/EMAIL when calling makepkg
Instead of setting it with "git config" early on. This way we don't
have to change some global files/state while still getting the same result.
2022-01-14 16:13:45 +01:00
Christoph Reiter
e93758b39c Set PACKAGER in autobuild directly
Instead of depending on the caller to set it.
2022-01-14 16:09:51 +01:00
Christoph Reiter
7c422261fc Use the same build environment for all makepkg calls
Just the default environ for now
2022-01-14 16:09:23 +01:00
Christoph Reiter
58fac3caaf Don't import environ directly
Use the same style everywhere, to also avoid shadowing locals
2022-01-14 15:53:03 +01:00
Christoph Reiter
6a436ac4e9 No longer install the toolchain groups
They are no longer required. See
https://github.com/msys2/MINGW-packages/discussions/10506
2022-01-13 17:44:52 +01:00
Christoph Reiter
698f9f514f Only start jobs for build types where we own the asset release 2022-01-13 09:36:45 +01:00
Christoph Reiter
f765fe5ea7
Drop clangarm64 from the manual build list 2022-01-12 22:38:17 +01:00
Christoph Reiter
f49b8afb91 Fetch certain build types also from other repos
Allow mapping build types to external repos and make some
read-only operations work with it.

This mainly means downloading assets will now also download clangarm64
and the clangarm64 build status will be included on packages.msys2.org.
2021-12-26 15:19:19 +01:00
Jeremy Drake
456f0a1e57 fetch_assets: add option to limit build_types. 2021-12-21 04:50:24 +01:00
Christoph Reiter
1aaafbed38 msys2-devel is no longer needed 2021-12-14 20:28:14 +01:00
Christoph Reiter
91bb7945cb Update deps 2021-12-09 20:10:55 +01:00
Christoph Reiter
f712bbd622 CI: switch the hosted runner env from windows-2019 to windows-2022
Let's give it a try.
2021-12-09 20:07:59 +01:00
Christoph Reiter
8ecac52817 Update deps 2021-11-27 05:47:09 +01:00
Jeremy Drake
310a1fa4e4 Updates for ARM64 running x64 now.
Use x64 python everywhere.  Otherwise, it will try to find an ARM64
python, which Github doesn't offer in their metadata.

Pass release: false to setup-msys2 on ARM64.  My ARM64 runner has no D:,
and IO is slow enough to make setting up a fresh install on each run
prohibitive anyway.
2021-11-20 06:40:15 +01:00
Christoph Reiter
3ae4835f34 CI: fix switching to the main mirror
This broke when we switched the mirrorlist file defaults
(and also when we added more repos).

Just replace the shared mirrorlist instead.

Fixes #47
2021-11-18 19:24:08 +01:00
Christoph Reiter
a0a0b3f47b
mingw-w64-mlpack is fixed now 2021-11-05 18:38:32 +01:00
Jeremy Drake
46400708d0 Tweaks to workflow.
Use new runner.arch variable instead of checking job name to see if
we're on ARM64.  Add runner.arch to python cache key (fixes #36).

Move output of drive information to a new Runner details step, and add
output of CPU name (from MINGW-packages workflow) to that.
2021-11-04 21:40:52 +01:00
Mehdi Chinoune
51e711deb1 Change MSYS2 default Installation location 2021-11-03 16:48:11 +01:00
Christoph Reiter
6e469e2c56 fetch-assets: add --fetch-complete option
this fetches all packages, as long as they are complete
2021-11-03 08:49:23 +01:00
jeremyd2019
a4ab5bc26b fix BUILD_ROOT
`C:` is the CWD on drive C, `C:\` is the root of drive C.
2021-10-24 23:08:30 +02:00
Mehdi Chinoune
067f5c1ecd Shorten BUILD_ROOT 2021-10-24 09:49:30 +02:00
Christoph Reiter
0cfe547446 Revert "qt5-static: All manual builds"
This reverts commit 87dbe7aebc5cd274457f05e62527029f90f90e2e.
2021-10-20 09:53:39 +02:00
Alexey Pavlov
87dbe7aebc qt5-static: All manual builds 2021-10-19 21:09:52 +03:00
Christoph Reiter
4cc7035246
ignore rdeps: mingw-w64-zig 2021-10-17 20:12:52 +02:00
Christoph Reiter
aea50264e2
ignore qt-static rdeps 2021-10-14 18:44:51 +02:00
Christoph Reiter
41742850ce Revert "Revert "Update autobuild.py""
This reverts commit 5f728e1eb22bdcb3c0e97c2d7c6fdd6a025dca64.
2021-10-11 18:25:43 +02:00
Christoph Reiter
5f728e1eb2 Revert "Update autobuild.py"
This reverts commit 8c7ef11f693b7af3ad3292ff6d0ae0b80fcd8ba0.
2021-10-10 17:57:46 +02:00
Christoph Reiter
8c7ef11f69
Update autobuild.py 2021-10-10 15:07:50 +02:00
Christoph Reiter
d74753f0e5 build status: inherit blocking info instead of replacing it
a bit hacky.. but works

Fixes #42
2021-10-09 08:37:22 +02:00
Christoph Reiter
c23ca57bed Update deps 2021-10-09 07:25:03 +02:00
Christoph Reiter
9c67f65b7c make mypy happy 2021-09-15 08:54:29 +02:00
Christoph Reiter
7f3417441c Fix cleaning up failed assets 2021-09-15 08:48:00 +02:00
Christoph Reiter
6add89827b Only have one metadata file for a failed build
We used the resulting package names, but we can just key by build type
and get fewer files that way.
2021-09-15 08:35:53 +02:00
Christoph Reiter
a0713fbf40 move a call out of a loop 2021-09-12 07:02:16 +02:00
34 changed files with 3046 additions and 1959 deletions

4
.flake8 Normal file

@@ -0,0 +1,4 @@
[flake8]
max-line-length = 110
exclude =
.venv/


@@ -2,53 +2,83 @@ name: 'build'
on:
workflow_dispatch:
inputs:
optional_deps:
description: 'optional_deps=pkg-A:optional-dep-B,pkg-C:optional-dep-D'
default: ''
required: false
type: string
context:
description: 'Extra information from invoker'
default: ''
required: false
type: string
schedule:
- cron: '0 0/4 * * *'
- cron: '0 0/3 * * *'
permissions:
contents: write
env:
PYTHONUNBUFFERED: 1
concurrency: nope
permissions: {}
jobs:
schedule:
runs-on: windows-latest
runs-on: ubuntu-24.04
permissions:
contents: write
concurrency: autobuild-maint
outputs:
build-plan: ${{ steps.check.outputs.build-plan }}
steps:
- uses: actions/checkout@v2
- name: Dump inputs
if: ${{ github.event_name == 'workflow_dispatch' }}
env:
CONTEXT: '${{ toJSON(github.event.inputs) }}'
run: |
echo "$CONTEXT"
- uses: actions/setup-python@v2
- uses: actions/checkout@v5
with:
python-version: '3.8'
persist-credentials: false
- uses: actions/cache@v2
- uses: actions/setup-python@v5
with:
path: ~\AppData\Local\pip\Cache
key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements.txt') }}
restore-keys: |
${{ runner.os }}-pip-
python-version: '3.13'
cache: 'pip'
cache-dependency-path: 'requirements.txt'
- name: Install deps
env:
PIP_DISABLE_PIP_VERSION_CHECK: 1
run: |
python -m pip install --user 'wheel==0.36.2'
python -m pip install --user -r requirements.txt
python -m venv .venv
source .venv/bin/activate
python -m pip install -r requirements.txt
echo "$VIRTUAL_ENV/bin" >> $GITHUB_PATH
- name: autobuild cache
uses: actions/cache@v4
with:
path: ${{ github.workspace }}/.autobuild_cache
key: autobuild_cache-${{ github.job }}-${{ github.run_id }}-${{ github.run_attempt }}
restore-keys: autobuild_cache-
- name: Check what we should run
id: check
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN_READONLY: ${{ secrets.GITHUBTOKENREADONLY }}
OPTIONAL_DEPS: ${{ github.event.inputs.optional_deps }}
run: |
python -u autobuild.py write-build-plan build_plan.json
$buildPlan = Get-Content build_plan.json -Raw
echo "::set-output name=build-plan::$buildPlan"
python -m msys2_autobuild write-build-plan --optional-deps "$OPTIONAL_DEPS" build_plan.json
buildPlan="$(cat build_plan.json)"
echo "build-plan=$buildPlan" >> $GITHUB_OUTPUT
- name: Clean up assets
if: steps.check.outputs.build-plan != '[]'
@@ -56,19 +86,25 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN_READONLY: ${{ secrets.GITHUBTOKENREADONLY }}
run: |
python -u autobuild.py clean-assets
python -m msys2_autobuild clean-assets
- name: Show build queue
if: steps.check.outputs.build-plan != '[]'
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN_READONLY: ${{ secrets.GITHUBTOKENREADONLY }}
OPTIONAL_DEPS: ${{ github.event.inputs.optional_deps }}
run: |
python -u autobuild.py show
python -m msys2_autobuild show --optional-deps "$OPTIONAL_DEPS"
build:
needs: schedule
timeout-minutes: 4320
needs: schedule
permissions:
contents: write
concurrency: autobuild-build-${{ matrix.name }}
if: ${{ needs.schedule.outputs.build-plan != '[]' }}
strategy:
@@ -80,42 +116,75 @@
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- name: Configure Pagefile
if: ${{ matrix.hosted }}
# https://github.com/al-cheb/configure-pagefile-action/issues/16
continue-on-error: true
uses: al-cheb/configure-pagefile-action@a3b6ebd6b634da88790d9c58d4b37a7f4a7b8708
with:
python-version: '3.8'
architecture: ${{ startsWith(matrix.name, 'clangarm') && 'x86' || 'x64' }}
minimum-size: 4GB
maximum-size: 16GB
disk-root: "C:"
- uses: actions/cache@v2
- name: Runner details
run: |
Get-PSDrive -PSProvider FileSystem
Get-CIMInstance -Class Win32_Processor | Select-Object -Property Name
- uses: actions/checkout@v5
with:
persist-credentials: false
- uses: actions/setup-python@v5
id: python
with:
python-version: '3.13'
# Avoid it setting CMake/pkg-config variables
# https://github.com/actions/setup-python/blob/main/docs/advanced-usage.md#environment-variables
update-environment: false
# Work around https://github.com/actions/setup-python/issues/1050
- name: Cache pip dependencies
uses: actions/cache@v4
with:
path: ~\AppData\Local\pip\Cache
key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements.txt') }}
key: ${{ runner.os }}-${{ runner.arch }}-pip-${{ hashFiles('requirements.txt') }}
restore-keys: |
${{ runner.os }}-pip-
${{ runner.os }}-${{ runner.arch }}-pip-
- name: Install deps
env:
PIP_DISABLE_PIP_VERSION_CHECK: 1
PYTHON_PATH: ${{ steps.python.outputs.python-path }}
run: |
python -m pip install --user 'wheel==0.36.2'
python -m pip install --user -r requirements.txt
& "$env:PYTHON_PATH" -m venv .venv
.\.venv\Scripts\activate
python -m pip install -r requirements.txt
echo "$env:VIRTUAL_ENV\Scripts" | Out-File -FilePath $env:GITHUB_PATH -Encoding utf8 -Append
# Note that released ARM64 requires x86 msys, but this will install x64
- uses: msys2/setup-msys2@v2
- name: autobuild cache
uses: actions/cache@v4
with:
path: ${{ github.workspace }}/.autobuild_cache
key: autobuild_cache-${{ github.job }}-${{ github.run_id }}-${{ github.run_attempt }}
restore-keys: autobuild_cache-
# Note that ARM64 prior to Win11 requires x86 msys, but this will install x64
- uses: msys2/setup-msys2@v2 # zizmor: ignore[unpinned-uses]
id: msys2
with:
msystem: MSYS
update: true
install: ${{ matrix.packages }}
location: '\M'
release: ${{ matrix.hosted }}
cache: ${{ matrix.hosted }}
- name: Switch to the main mirror
shell: msys2 {0}
run: |
sed -e "s|Include = /etc/pacman.d/mirrorlist.mingw32|Server = http://repo.msys2.org/mingw/i686/|g" -i /etc/pacman.conf
sed -e "s|Include = /etc/pacman.d/mirrorlist.mingw64|Server = http://repo.msys2.org/mingw/x86_64/|g" -i /etc/pacman.conf
sed -e "s|Include = /etc/pacman.d/mirrorlist.msys|Server = http://repo.msys2.org/msys/\$arch/|g" -i /etc/pacman.conf
grep -qF '[clang32]' /etc/pacman.conf || sed -i '1s|^|[clang32]\nServer = http://repo.msys2.org/mingw/clang32/\n|' /etc/pacman.conf
grep -qF '[clangarm64]' /etc/pacman.conf || sed -i '1s|^|[clangarm64]\nServer = http://repo.msys2.org/mingw/clangarm64/\n|' /etc/pacman.conf
echo 'Server = https://repo.msys2.org/mingw/$repo/' > /etc/pacman.d/mirrorlist.mingw
echo 'Server = https://repo.msys2.org/msys/$arch/' > /etc/pacman.d/mirrorlist.msys
pacman-conf.exe
- name: Update using the main mirror & Check install
@@ -124,25 +193,15 @@ jobs:
msys2 -c 'pacman --noconfirm -Suu'
msys2 -c 'pacman -Qkq'
- name: Install extra packages
if: ${{ startsWith(matrix.name, 'clang32') || startsWith(matrix.name, 'clangarm64') }}
run: |
msys2 -c 'pacman --noconfirm -S --needed mingw-w64-clang-${{ startsWith(matrix.name, 'clangarm64') && 'aarch64' || 'i686' }}-toolchain'
- name: Init git
shell: msys2 {0}
run: |
git config --global user.email 'ci@msys2.org'
git config --global user.name 'MSYS2 Continuous Integration'
- name: Process build queue
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN_READONLY: ${{ secrets.GITHUBTOKENREADONLY }}
GITHUB_RUN_NAME: ${{ matrix.name }}
# https://github.com/actions/runner/issues/324#issuecomment-3324382354
# https://github.com/actions/runner/pull/4053
JOB_CHECK_RUN_ID: ${{ job.check_run_id }}
MSYS2_ROOT: ${{ steps.msys2.outputs.msys2-location }}
run: |
$env:PACKAGER='CI (msys2-autobuild/' + $env:GITHUB_SHA.Substring(0, 8) + '/' + $env:GITHUB_RUN_ID + ')'
$BUILD_ROOT='C:\_'
$MSYS2_ROOT=(msys2 -c 'cygpath -w /')
Get-PSDrive -PSProvider FileSystem
python -u autobuild.py build ${{ matrix.build-args }} "$MSYS2_ROOT" "$BUILD_ROOT"
echo "JOB_CHECK_RUN_ID=$env:JOB_CHECK_RUN_ID"
$BUILD_ROOT=Join-Path (Split-Path $env:GITHUB_WORKSPACE -Qualifier) "\"
python -m msys2_autobuild build ${{ matrix.build-args }} "$env:MSYS2_ROOT" "$BUILD_ROOT"

80
.github/workflows/maint.yml vendored Normal file

@@ -0,0 +1,80 @@
name: 'maint'
on:
workflow_dispatch:
inputs:
clear_failed_packages:
description: 'clear_failed_packages=mingw-w64-foo,mingw-w64-bar'
default: ''
required: false
type: string
clear_failed_build_types:
description: 'clear_failed_build_types=mingw64,clang64'
default: ''
required: false
type: string
context:
description: 'Extra information from invoker'
default: ''
required: false
type: string
permissions: {}
concurrency: autobuild-maint
jobs:
schedule:
runs-on: ubuntu-24.04
permissions:
contents: write
steps:
- name: Dump inputs
if: ${{ github.event_name == 'workflow_dispatch' }}
env:
CONTEXT: '${{ toJSON(github.event.inputs) }}'
run: |
echo "$CONTEXT"
- uses: actions/checkout@v5
with:
persist-credentials: false
- uses: actions/setup-python@v5
with:
python-version: '3.13'
cache: 'pip'
cache-dependency-path: 'requirements.txt'
- name: Install deps
env:
PIP_DISABLE_PIP_VERSION_CHECK: 1
run: |
python -m venv .venv
source .venv/bin/activate
python -m pip install -r requirements.txt
echo "$VIRTUAL_ENV/bin" >> $GITHUB_PATH
- name: Clear failed build types
if: ${{ github.event.inputs.clear_failed_build_types != '' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN_READONLY: ${{ secrets.GITHUBTOKENREADONLY }}
CLEAR_FAILED_BUILD_TYPES: ${{ github.event.inputs.clear_failed_build_types }}
run: |
python -m msys2_autobuild clear-failed --build-types "$CLEAR_FAILED_BUILD_TYPES"
python -m msys2_autobuild update-status
- name: Clear failed packages
if: ${{ github.event.inputs.clear_failed_packages != '' }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN_READONLY: ${{ secrets.GITHUBTOKENREADONLY }}
CLEAR_FAILED_PACKAGES: ${{ github.event.inputs.clear_failed_packages }}
run: |
python -m msys2_autobuild clear-failed --packages "$CLEAR_FAILED_PACKAGES"
python -m msys2_autobuild update-status


@@ -2,6 +2,9 @@ name: test
on: [push, pull_request]
permissions:
contents: read
jobs:
test:
@@ -9,13 +12,15 @@ jobs:
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest,windows-latest]
python-version: [3.8, 3.9]
os: [ubuntu-24.04, windows-2022, windows-11-arm]
python-version: ['3.12', '3.13']
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v5
with:
persist-credentials: false
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
@@ -31,4 +36,26 @@ jobs:
- name: Run flake8
run: |
python -m poetry run flake8 .
python -m poetry run flake8 .
- name: Run tests
run: |
python -m poetry run pytest
zizmor:
runs-on: ubuntu-24.04
permissions:
contents: read
security-events: write
steps:
- name: Checkout repository
uses: actions/checkout@v5
with:
persist-credentials: false
- name: Run zizmor
run: pipx run zizmor .
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}

3
.gitignore vendored

@@ -1,3 +1,4 @@
*.pyc
.vscode/
.mypy_cache/
.mypy_cache/
.autobuild_cache/


@@ -1,6 +1,12 @@
# msys2-autobuild
## autobuild.py
msys2-autobuild is a Python tool for
* automatically building MSYS2 packages in GitHub Actions
* manually uploading packages, or retrying builds
* retrieving the built packages for upload to the pacman repo
## Installation
```console
$ pacman -S mingw-w64-x86_64-python-tabulate mingw-w64-x86_64-python-pygithub mingw-w64-x86_64-python-requests
@@ -8,56 +14,37 @@ $ pacman -S mingw-w64-x86_64-python-tabulate mingw-w64-x86_64-python-pygithub mi
$ poetry install
# or
$ python -m pip install --user -r requirements.txt
# or
$ pipx install git+https://github.com/msys2/msys2-autobuild
```
## Usage
```console
$ ./autobuild.py --help
usage: autobuild.py [-h]
{build,show,should-run,update-status,fetch-assets,upload-assets,clean-assets}
...
$ msys2-autobuild --help
usage: msys2-autobuild [-h]
{build,show,write-build-plan,update-status,fetch-assets,upload-assets,clear-failed,clean-assets}
...
Build packages
optional arguments:
options:
-h, --help show this help message and exit
subcommands:
{build,show,should-run,update-status,fetch-assets,upload-assets,clean-assets}
{build,show,write-build-plan,update-status,fetch-assets,upload-assets,clear-failed,clean-assets}
build Build all packages
show Show all packages to be built
write-build-plan Write a GHA build matrix setup
update-status Update the status file
fetch-assets Download all staging packages
upload-assets Upload packages
clear-failed Clear the failed state for packages
clean-assets Clean up GHA assets
```
## Automated Build Process
## Configuration
The following graph shows what happens between a PKGBUILD getting changed in git
and the built package being available in the pacman repo.
![sequence](./docs/sequence.svg)
### Security Considerations
Assuming changes to PKGBUILDs are properly reviewed, the pacman signature
checking works, the upstream source is OK and all MSYS2 organization members are
trusted we need to consider a bad actor controlling some part of the building
process between the PKGBUILD getting changed and the package ending up signed in
the pacman repo.
A bad actor would need to get a package on the machine of the developer signing
the package and adding it to the pacman repo. We take the following precautions:
* We only build packages automatically with GitHub Actions without third party
actions, excluding the official GitHub ones. We assume the GHA images and
official actions are safe.
* The download tool used by the person signing the package checks that the
binaries where uploaded by a restricted set of GitHub users or GHA.
We assume the bad actor doesn't have git push rights.
* Packages too large for GHA get built/signed by MSYS2 developers on their
machines. We assume the developer machines are safe.
* We enforce 2FA for the MSYS2 organization to make account takeovers of
existing MSYS2 developers harder.
Feedback and ideas on how to improve this welcome.
* `GITHUB_TOKEN` (required) - a GitHub token with write access to the current repo.
* `GITHUB_TOKEN_READONLY` (optional) - a GitHub token with read access to the current repo. This is used for read operations to not get limited by the API access limits.
* `GITHUB_REPOSITORY` (optional) - the path to the GitHub repo this is uploading to. Used for deciding which things can be built and where to upload them to. Defaults to `msys2/msys2-autobuild`.

File diff suppressed because it is too large

2
build.bat Normal file

@ -0,0 +1,2 @@
@echo off
C:\msys64\msys2_shell.cmd -here -msys -no-start -defterm -c "./build.sh"

5
build.sh Normal file

@@ -0,0 +1,5 @@
pacman --needed --noconfirm -S mingw-w64-x86_64-python-tabulate mingw-w64-x86_64-python-requests-cache
OLD_ACLOCAL_PATH="${ACLOCAL_PATH}"
unset ACLOCAL_PATH
python -m msys2_autobuild build / ~/build-temp -t msys,msys-src,mingw64,mingw32,mingw-src
ACLOCAL_PATH="${OLD_ACLOCAL_PATH}"


@@ -1,43 +0,0 @@
https://mermaid-js.github.io
```
sequenceDiagram
participant GIT as MSYS2/MINGW-packages
participant API as packages.msys2.org
participant GHA as GitHub Actions
participant DT as msys2-autobuild
participant DEV as Developer
participant REPO as Pacman Repo
GIT->>GHA: GIT push trigger
GHA->>GHA: parse PKGBUILDs
GHA-->>GIT: upload parsed PKGBUILDs
loop Every 5 minutes
API->>GIT: fetch parsed PKGBUILDs
GIT-->>API:
end
loop Every 2 hours
DT->>GHA: cron trigger
GHA->>API: fetch TODO list
API-->>GHA:
GHA->>GIT: fetch PKGBUILDs
GIT-->>GHA:
GHA->>DT: fetch staging
DT-->>GHA:
GHA->>GHA: build packages
GHA-->>DT: upload packages
end
DEV->>DT: fetch packages
DT-->>DEV:
DEV->>DEV: sign packages
DEV->>REPO: push to repo
```
```
{
"theme": "forest"
}
```

File diff suppressed because one or more lines are too long

Before: image, 24 KiB

0
msys2_autobuild/__init__.py Executable file


@@ -0,0 +1,3 @@
from .main import run
run()

421
msys2_autobuild/build.py Normal file

@@ -0,0 +1,421 @@
import fnmatch
import json
import os
import time
import shlex
import shutil
import stat
import subprocess
import tempfile
from concurrent.futures import ThreadPoolExecutor
from contextlib import contextmanager
from pathlib import Path, PurePath, PurePosixPath
from subprocess import check_call
from typing import Any, TypeVar
from collections.abc import Generator, Sequence
from gitea import Attachment
from .config import ArchType, BuildType, Config
from .gh import (CachedAssets, download_asset, get_asset_filename,
get_release, get_repo_for_build_type, upload_asset)
from .queue import Package
from .utils import SCRIPT_DIR, PathLike
class BuildError(Exception):
pass
def get_python_path(msys2_root: PathLike, msys2_path: PathLike) -> Path:
return Path(os.path.normpath(str(msys2_root) + str(msys2_path)))
def to_pure_posix_path(path: PathLike) -> PurePath:
return PurePosixPath("/" + str(path).replace(":", "", 1).replace("\\", "/"))
def get_build_environ(build_type: BuildType) -> dict[str, str]:
environ = os.environ.copy()
# Set PACKAGER for makepkg
packager_ref = Config.RUNNER_CONFIG[build_type]["repo"]
if "GITHUB_SHA" in environ and "GITHUB_RUN_ID" in environ:
packager_ref += "/" + environ["GITHUB_SHA"][:8] + "/" + environ["GITHUB_RUN_ID"]
environ["PACKAGER"] = f"CI ({packager_ref})"
return environ
@contextmanager
def temp_pacman_script(pacman_config: PathLike) -> Generator[PathLike, None, None]:
"""Gives a temporary pacman script which uses the passed in pacman config
without having to pass --config to it. Required because makepkg doesn't allow
setting the pacman conf path, but it allows setting the pacman executable path
via the 'PACMAN' env var.
"""
fd, filename = tempfile.mkstemp("pacman")
os.close(fd)
try:
with open(filename, "w", encoding="utf-8") as h:
cli = shlex.join(['/usr/bin/pacman', '--config', str(to_pure_posix_path(pacman_config))])
h.write(f"""\
#!/bin/bash
set -e
exec {cli} "$@"
""")
yield filename
finally:
try:
os.unlink(filename)
except OSError:
pass
@contextmanager
def temp_pacman_conf(msys2_root: PathLike) -> Generator[Path, None, None]:
"""Gives a unix path to a temporary copy of pacman.conf"""
fd, filename = tempfile.mkstemp("pacman.conf")
os.close(fd)
try:
conf = get_python_path(msys2_root, "/etc/pacman.conf")
with open(conf, "rb") as src:
with open(filename, "wb") as dest:
shutil.copyfileobj(src, dest)
yield Path(filename)
finally:
try:
os.unlink(filename)
except OSError:
pass
@contextmanager
def temp_makepkg_confd(msys2_root: PathLike, config_name: str) -> Generator[Path, None, None]:
"""Gives a path to a temporary $config_name.d file"""
conf_dir = get_python_path(msys2_root, f"/etc/{config_name}.d")
os.makedirs(conf_dir, exist_ok=True)
conf_file = conf_dir / "msys2_autobuild.conf"
try:
open(conf_file, "wb").close()
yield conf_file
finally:
try:
os.unlink(conf_file)
except OSError:
pass
try:
os.rmdir(conf_dir)
except OSError:
pass
def clean_environ(environ: dict[str, str]) -> dict[str, str]:
"""Returns an environment without any CI related variables.
This is to avoid leaking secrets to package build scripts we call.
While in theory we trust them, this can't hurt.
"""
new_env = environ.copy()
for key in list(new_env):
if key.startswith(("GITHUB_", "RUNNER_")):
del new_env[key]
return new_env
def run_cmd(msys2_root: PathLike, args: Sequence[PathLike], **kwargs: Any) -> None:
executable = os.path.join(msys2_root, 'usr', 'bin', 'bash.exe')
env = clean_environ(kwargs.pop("env", os.environ.copy()))
env["CHERE_INVOKING"] = "1"
env["MSYSTEM"] = "MSYS"
env["MSYS2_PATH_TYPE"] = "minimal"
check_call([executable, '-lc'] + [shlex.join([str(a) for a in args])], env=env, **kwargs)
def make_tree_writable(topdir: PathLike) -> None:
# Ensure all files and directories under topdir are writable
# (and readable) by owner.
# Taken from meson, and adjusted
def chmod(p: PathLike) -> None:
os.chmod(p, os.stat(p).st_mode | stat.S_IWRITE | stat.S_IREAD)
chmod(topdir)
for root, dirs, files in os.walk(topdir):
for d in dirs:
chmod(os.path.join(root, d))
# Work around Python bug following junctions
# https://github.com/python/cpython/issues/67596#issuecomment-1918112817
dirs[:] = [d for d in dirs if not os.path.isjunction(os.path.join(root, d))]
for fname in files:
fpath = os.path.join(root, fname)
if os.path.isfile(fpath):
chmod(fpath)
def remove_junctions(topdir: PathLike) -> None:
# work around a git issue where it can't handle junctions
# https://github.com/git-for-windows/git/issues/5320
for root, dirs, _ in os.walk(topdir):
no_junctions = []
for d in dirs:
if not os.path.isjunction(os.path.join(root, d)):
no_junctions.append(d)
else:
os.remove(os.path.join(root, d))
dirs[:] = no_junctions
def reset_git_repo(path: PathLike):
def clean():
assert os.path.exists(path)
# Try to avoid git hanging in a junction loop, by removing them
# before running git clean/reset
# https://github.com/msys2/msys2-autobuild/issues/108#issuecomment-2776420879
try:
remove_junctions(path)
except OSError as e:
print("Removing junctions failed", e)
check_call(["git", "clean", "-xfdf"], cwd=path)
check_call(["git", "reset", "--hard", "HEAD"], cwd=path)
made_writable = False
for i in range(10):
try:
clean()
except subprocess.CalledProcessError:
try:
if not made_writable:
print("Trying to make files writable")
make_tree_writable(path)
remove_junctions(path)
made_writable = True
except OSError as e:
print("Making files writable failed", e)
print(f"git clean/reset failed, sleeping for {i} seconds")
time.sleep(i)
else:
break
else:
# run it one more time to raise
clean()
@contextmanager
def fresh_git_repo(url: str, path: PathLike) -> Generator:
if not os.path.exists(path):
check_call(["git", "clone", url, path])
check_call(["git", "config", "core.longpaths", "true"], cwd=path)
else:
reset_git_repo(path)
check_call(["git", "fetch", "origin"], cwd=path)
check_call(["git", "reset", "--hard", "origin/master"], cwd=path)
try:
yield
finally:
assert os.path.exists(path)
reset_git_repo(path)
@contextmanager
def staging_dependencies(
build_type: BuildType, pkg: Package, msys2_root: PathLike,
builddir: PathLike) -> Generator[PathLike, None, None]:
def add_to_repo(repo_root: PathLike, pacman_config: PathLike, repo_name: str,
assets: list[Attachment]) -> None:
repo_dir = Path(repo_root) / repo_name
os.makedirs(repo_dir, exist_ok=True)
todo = []
for asset in assets:
asset_path = os.path.join(repo_dir, get_asset_filename(asset))
todo.append((asset_path, asset))
def fetch_item(item: tuple[str, Attachment]) -> tuple[str, Attachment]:
asset_path, asset = item
download_asset(asset, asset_path)
return item
package_paths = []
with ThreadPoolExecutor(8) as executor:
for i, item in enumerate(executor.map(fetch_item, todo)):
asset_path, asset = item
print(f"[{i + 1}/{len(todo)}] {get_asset_filename(asset)}")
package_paths.append(asset_path)
repo_name = f"autobuild-{repo_name}"
repo_db_path = os.path.join(repo_dir, f"{repo_name}.db.tar.gz")
with open(pacman_config, encoding="utf-8") as h:
text = h.read()
uri = to_pure_posix_path(repo_dir).as_uri()
if uri not in text:
with open(pacman_config, "w", encoding="utf-8") as h2:
h2.write(f"""[{repo_name}]
Server={uri}
SigLevel=Never
""")
h2.write(text)
# repo-add 15 packages at a time so we don't hit the size limit for CLI arguments
ChunkItem = TypeVar("ChunkItem")
def chunks(lst: list[ChunkItem], n: int) -> Generator[list[ChunkItem], None, None]:
for i in range(0, len(lst), n):
yield lst[i:i + n]
base_args: list[PathLike] = ["repo-add", to_pure_posix_path(repo_db_path)]
posix_paths: list[PathLike] = [to_pure_posix_path(p) for p in package_paths]
for chunk in chunks(posix_paths, 15):
args = base_args + chunk
run_cmd(msys2_root, args, cwd=repo_dir)
cached_assets = CachedAssets()
repo_root = os.path.join(builddir, "_REPO")
try:
shutil.rmtree(repo_root, ignore_errors=True)
os.makedirs(repo_root, exist_ok=True)
with temp_pacman_conf(msys2_root) as pacman_config:
to_add: dict[ArchType, list[Attachment]] = {}
for dep_type, deps in pkg.get_depends(build_type).items():
assets = cached_assets.get_assets(dep_type)
for dep in deps:
for pattern in dep.get_build_patterns(dep_type):
for asset in assets:
if fnmatch.fnmatch(get_asset_filename(asset), pattern):
to_add.setdefault(dep_type, []).append(asset)
break
else:
if pkg.is_optional_dep(dep, dep_type):
# If it's there, good, if not we ignore it since it's part of a cycle
pass
else:
raise SystemExit(f"asset for {pattern} in {dep_type} not found")
for dep_type, assets in to_add.items():
add_to_repo(repo_root, pacman_config, dep_type, assets)
with temp_pacman_script(pacman_config) as temp_pacman:
# in case they are already installed we need to upgrade
run_cmd(msys2_root, [to_pure_posix_path(temp_pacman), "--noconfirm", "-Suy"])
run_cmd(msys2_root, [to_pure_posix_path(temp_pacman), "--noconfirm", "-Su"])
yield temp_pacman
finally:
shutil.rmtree(repo_root, ignore_errors=True)
# downgrade again
run_cmd(msys2_root, ["pacman", "--noconfirm", "-Suuy"])
run_cmd(msys2_root, ["pacman", "--noconfirm", "-Suu"])
def build_package(build_type: BuildType, pkg: Package, msys2_root: PathLike, builddir: PathLike) -> None:
assert os.path.isabs(builddir)
assert os.path.isabs(msys2_root)
os.makedirs(builddir, exist_ok=True)
repo_name = {"MINGW-packages": "W", "MSYS2-packages": "S"}.get(pkg['repo'], pkg['repo'])
repo_dir = os.path.join(builddir, repo_name)
to_upload: list[str] = []
repo = get_repo_for_build_type(build_type)
with fresh_git_repo(pkg['repo_url'], repo_dir):
orig_pkg_dir = os.path.join(repo_dir, pkg['repo_path'])
# Rename it to get a shorter overall build path
# https://github.com/msys2/msys2-autobuild/issues/71
pkg_dir = os.path.join(repo_dir, 'B')
assert not os.path.exists(pkg_dir)
os.rename(orig_pkg_dir, pkg_dir)
# Fetch all keys mentioned in the PKGBUILD
validpgpkeys = to_pure_posix_path(os.path.join(SCRIPT_DIR, 'fetch-validpgpkeys.sh'))
run_cmd(msys2_root, ['bash', validpgpkeys], cwd=pkg_dir)
with staging_dependencies(build_type, pkg, msys2_root, builddir) as temp_pacman:
try:
env = get_build_environ(build_type)
# this makes makepkg use our custom pacman script
env['PACMAN'] = str(to_pure_posix_path(temp_pacman))
if build_type == Config.MINGW_SRC_BUILD_TYPE:
with temp_makepkg_confd(msys2_root, "makepkg_mingw.conf") as makepkg_conf:
with open(makepkg_conf, "w", encoding="utf-8") as h:
h.write("COMPRESSZST=(zstd -c -T0 --ultra -22 -)\n")
env['MINGW_ARCH'] = Config.MINGW_SRC_ARCH
run_cmd(msys2_root, [
'makepkg-mingw',
'--noconfirm',
'--noprogressbar',
'--allsource'
], env=env, cwd=pkg_dir)
elif build_type == Config.MSYS_SRC_BUILD_TYPE:
with temp_makepkg_confd(msys2_root, "makepkg.conf") as makepkg_conf:
with open(makepkg_conf, "w", encoding="utf-8") as h:
h.write("COMPRESSZST=(zstd -c -T0 --ultra -22 -)\n")
run_cmd(msys2_root, [
'makepkg',
'--noconfirm',
'--noprogressbar',
'--allsource'
], env=env, cwd=pkg_dir)
elif build_type in Config.MINGW_ARCH_LIST:
with temp_makepkg_confd(msys2_root, "makepkg_mingw.conf") as makepkg_conf:
with open(makepkg_conf, "w", encoding="utf-8") as h:
h.write("COMPRESSZST=(zstd -c -T0 --ultra -20 -)\n")
env['MINGW_ARCH'] = build_type
run_cmd(msys2_root, [
'makepkg-mingw',
'--noconfirm',
'--noprogressbar',
'--nocheck',
'--syncdeps',
'--rmdeps',
'--cleanbuild'
], env=env, cwd=pkg_dir)
elif build_type in Config.MSYS_ARCH_LIST:
with temp_makepkg_confd(msys2_root, "makepkg.conf") as makepkg_conf:
with open(makepkg_conf, "w", encoding="utf-8") as h:
h.write("COMPRESSZST=(zstd -c -T0 --ultra -20 -)\n")
run_cmd(msys2_root, [
'makepkg',
'--noconfirm',
'--noprogressbar',
'--nocheck',
'--syncdeps',
'--rmdeps',
'--cleanbuild'
], env=env, cwd=pkg_dir)
else:
assert 0
entries = os.listdir(pkg_dir)
for pattern in pkg.get_build_patterns(build_type):
found = fnmatch.filter(entries, pattern)
if not found:
raise BuildError(f"{pattern} not found, likely wrong version built")
to_upload.extend([os.path.join(pkg_dir, e) for e in found])
except (subprocess.CalledProcessError, BuildError) as e:
release = get_release(repo, "staging-failed")
failed_data = {}
content = json.dumps(failed_data).encode()
upload_asset(repo, release, pkg.get_failed_name(build_type), text=True, content=content)
raise BuildError(e) from e
else:
release = get_release(repo, "staging-" + build_type)
for path in to_upload:
upload_asset(repo, release, path)


@@ -0,0 +1,102 @@
import os
import shutil
import sys
import time
import traceback
from typing import Any, Literal
from .build import BuildError, build_package, run_cmd
from .config import BuildType, Config
from .queue import (Package, PackageStatus, get_buildqueue_with_status,
update_status)
from .utils import apply_optional_deps, gha_group
BuildFrom = Literal["start", "middle", "end"]
def get_package_to_build(
pkgs: list[Package], build_types: list[BuildType] | None,
build_from: BuildFrom) -> tuple[Package, BuildType] | None:
can_build = []
for pkg in pkgs:
for build_type in pkg.get_build_types():
if build_types is not None and build_type not in build_types:
continue
if pkg.get_status(build_type) == PackageStatus.WAITING_FOR_BUILD:
can_build.append((pkg, build_type))
if not can_build:
return None
if build_from == "end":
return can_build[-1]
elif build_from == "middle":
return can_build[len(can_build)//2]
elif build_from == "start":
return can_build[0]
else:
raise ValueError(f"Unknown order: {build_from}")
def run_build(args: Any) -> None:
builddir = os.path.abspath(args.builddir)
msys2_root = os.path.abspath(args.msys2_root)
if args.build_types is None:
build_types = None
else:
build_types = [p.strip() for p in args.build_types.split(",")]
apply_optional_deps(args.optional_deps or "")
start_time = time.monotonic()
if not sys.platform == "win32":
raise SystemExit("ERROR: Needs to run under native Python")
if not shutil.which("git"):
raise SystemExit("ERROR: git not in PATH")
if not os.path.isdir(msys2_root):
raise SystemExit("ERROR: msys2_root doesn't exist")
try:
run_cmd(msys2_root, [])
except Exception as e:
raise SystemExit("ERROR: msys2_root not functional", e)
print(f"Building {build_types} starting from {args.build_from}")
while True:
pkgs = get_buildqueue_with_status(full_details=True)
update_status(pkgs)
if (time.monotonic() - start_time) >= Config.SOFT_JOB_TIMEOUT:
print("timeout reached")
break
todo = get_package_to_build(pkgs, build_types, args.build_from)
if not todo:
break
pkg, build_type = todo
try:
with gha_group(f"[{pkg['repo']}] [{build_type}] {pkg['name']}..."):
build_package(build_type, pkg, msys2_root, builddir)
except BuildError:
with gha_group(f"[{pkg['repo']}] [{build_type}] {pkg['name']}: failed"):
traceback.print_exc(file=sys.stdout)
continue
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser("build", help="Build all packages")
sub.add_argument("-t", "--build-types", action="store")
sub.add_argument(
"--build-from", action="store", default="start", help="Start building from start|end|middle")
sub.add_argument("--optional-deps", action="store")
sub.add_argument("msys2_root", help="The MSYS2 install used for building. e.g. C:\\msys64")
sub.add_argument(
"builddir",
help="A directory used for saving temporary build results and the git repos")
sub.set_defaults(func=run_build)


@@ -0,0 +1,90 @@
import re
import fnmatch
from typing import Any
from gitea import Release, Attachment
from .config import get_all_build_types
from .gh import (get_asset_filename, get_current_repo, get_release,
get_release_assets, get_gitea)
from .queue import get_buildqueue
def get_assets_to_delete() -> tuple[list[Release], list[tuple[Release, Attachment]]]:
print("Fetching packages to build...")
keep_patterns = []
for pkg in get_buildqueue():
for build_type in pkg.get_build_types():
keep_patterns.append(pkg.get_failed_name(build_type))
keep_patterns.extend(pkg.get_build_patterns(build_type))
keep_pattern_regex = re.compile('|'.join(fnmatch.translate(p) for p in keep_patterns))
def should_be_deleted(asset: Attachment) -> bool:
filename = get_asset_filename(asset)
return not keep_pattern_regex.match(filename)
def get_to_delete(release: Release) -> tuple[list[Release], list[Attachment]]:
assets = get_release_assets(release)
to_delete = []
for asset in assets:
if should_be_deleted(asset):
to_delete.append(asset)
# Deleting and re-creating a release requires two write calls, so delete
# the release if all assets should be deleted and there are more than 2.
# min_to_delete = 3
# XXX: re-creating releases causes notifications, so avoid unless possible
# https://github.com/msys2/msys2-autobuild/issues/77#issuecomment-1657231719
min_to_delete = 400*333
if len(to_delete) >= min_to_delete and len(assets) == len(to_delete):
return [release], []
else:
return [], to_delete
def get_all_releases() -> list[Release]:
repo = get_current_repo()
releases = []
for build_type in get_all_build_types():
releases.append(get_release(repo, "staging-" + build_type))
releases.append(get_release(repo, "staging-failed"))
return releases
print("Fetching assets...")
releases = []
assets = []
for release in get_all_releases():
r, a = get_to_delete(release)
releases.extend(r)
assets.extend((release, asset) for asset in a)
return releases, assets
def clean_gha_assets(args: Any) -> None:
repo = get_current_repo()
releases, assets = get_assets_to_delete()
print("Resetting releases...")
for release in releases:
print(f"Resetting {release.tag_name}...")
if not args.dry_run:
release.delete_release()
get_release(repo, release.tag_name)
print("Deleting assets...")
for release, asset in assets:
print(f"Deleting {get_asset_filename(asset)}...")
if not args.dry_run:
gitea = get_gitea()
gitea.repo_delete_release_attachment(repo.owner.login, repo.name, release.id, asset.id)
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser("clean-assets", help="Clean up GHA assets", allow_abbrev=False)
sub.add_argument(
"--dry-run", action="store_true", help="Only show what is going to be deleted")
sub.set_defaults(func=clean_gha_assets)


@@ -0,0 +1,49 @@
from typing import Any
from .gh import (get_asset_filename, get_current_repo, get_release,
get_release_assets, get_gitea)
from .queue import get_buildqueue_with_status
def clear_failed_state(args: Any) -> None:
build_type_filter = args.build_types
build_type_list = build_type_filter.replace(" ", "").split(",") if build_type_filter else []
package_filter = args.packages
package_list = package_filter.replace(" ", "").split(",") if package_filter else []
if build_type_filter is None and package_filter is None:
raise SystemExit("clear-failed: At least one of --build-types or --packages needs to be passed")
repo = get_current_repo()
release = get_release(repo, 'staging-failed')
assets_failed = get_release_assets(release)
failed_map = dict((get_asset_filename(a), a) for a in assets_failed)
for pkg in get_buildqueue_with_status():
if package_filter is not None and pkg["name"] not in package_list:
continue
for build_type in pkg.get_build_types():
if build_type_filter is not None and build_type not in build_type_list:
continue
name = pkg.get_failed_name(build_type)
if name in failed_map:
asset = failed_map[name]
print(f"Deleting {get_asset_filename(asset)}...")
if not args.dry_run:
gitea = get_gitea()
gitea.repo_delete_release_attachment(repo.owner.login, repo.name, release.id, asset.id)
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser(
"clear-failed", help="Clear the failed state for packages", allow_abbrev=False)
sub.add_argument(
"--dry-run", action="store_true", help="Only show what is going to be deleted")
sub.add_argument("--build-types", action="store", help=(
"A comma separated list of build types (e.g. mingw64)"))
sub.add_argument("--packages", action="store", help=(
"A comma separated list of packages to clear (e.g. mingw-w64-qt-creator)"))
sub.set_defaults(func=clear_failed_state)


@@ -0,0 +1,178 @@
import fnmatch
import os
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path
from typing import Any
import subprocess
from gitea import Attachment
from .config import BuildType, Config
from .gh import (CachedAssets, download_asset, get_asset_filename,
get_asset_mtime_ns)
from .queue import PackageStatus, get_buildqueue_with_status
from .utils import ask_yes_no
def get_repo_subdir(build_type: BuildType) -> Path:
if build_type in Config.MSYS_ARCH_LIST:
return Path("msys") / "x86_64"
elif build_type == Config.MSYS_SRC_BUILD_TYPE:
return Path("msys") / "sources"
elif build_type == Config.MINGW_SRC_BUILD_TYPE:
return Path("mingw") / "sources"
elif build_type in Config.MINGW_ARCH_LIST:
return Path("mingw") / build_type
else:
raise Exception("unknown type")
def fetch_assets(args: Any) -> None:
target_dir = os.path.abspath(args.targetdir)
fetch_all = args.fetch_all
fetch_complete = args.fetch_complete
all_patterns: dict[BuildType, list[str]] = {}
all_blocked = []
for pkg in get_buildqueue_with_status():
for build_type in pkg.get_build_types():
if args.build_type and build_type not in args.build_type:
continue
status = pkg.get_status(build_type)
pkg_patterns = pkg.get_build_patterns(build_type)
if status == PackageStatus.FINISHED:
all_patterns.setdefault(build_type, []).extend(pkg_patterns)
elif status in [PackageStatus.FINISHED_BUT_BLOCKED,
PackageStatus.FINISHED_BUT_INCOMPLETE]:
if fetch_all or (fetch_complete and status != PackageStatus.FINISHED_BUT_INCOMPLETE):
all_patterns.setdefault(build_type, []).extend(pkg_patterns)
else:
all_blocked.append(
(pkg["name"], build_type, pkg.get_status_details(build_type)))
all_assets = {}
cached_assets = CachedAssets()
assets_to_download: dict[BuildType, list[Attachment]] = {}
for build_type, patterns in all_patterns.items():
if build_type not in all_assets:
all_assets[build_type] = cached_assets.get_assets(build_type)
assets = all_assets[build_type]
assets_mapping: dict[str, list[Attachment]] = {}
for asset in assets:
assets_mapping.setdefault(get_asset_filename(asset), []).append(asset)
for pattern in patterns:
matches = fnmatch.filter(assets_mapping.keys(), pattern)
if matches:
found = assets_mapping[matches[0]]
assets_to_download.setdefault(build_type, []).extend(found)
to_fetch = {}
for build_type, assets in assets_to_download.items():
for asset in assets:
asset_dir = Path(target_dir) / get_repo_subdir(build_type)
asset_path = asset_dir / get_asset_filename(asset)
to_fetch[str(asset_path)] = asset
def file_is_uptodate(path: str, asset: Attachment) -> bool:
asset_path = Path(path)
if not asset_path.exists():
return False
if asset_path.stat().st_size != asset.size:
return False
if get_asset_mtime_ns(asset) != asset_path.stat().st_mtime_ns:
return False
return True
# find files that are either wrong or not what we want
to_delete = []
not_uptodate = []
for root, dirs, files in os.walk(target_dir):
for name in files:
existing = os.path.join(root, name)
if existing in to_fetch:
asset = to_fetch[existing]
if not file_is_uptodate(existing, asset):
to_delete.append(existing)
not_uptodate.append(existing)
else:
to_delete.append(existing)
if args.delete and not args.pretend:
# delete unwanted files
for path in to_delete:
os.remove(path)
# delete empty directories
for root, dirs, files in os.walk(target_dir, topdown=False):
for name in dirs:
path = os.path.join(root, name)
if not os.listdir(path):
os.rmdir(path)
# Finally figure out what to download
todo = {}
done = []
for path, asset in to_fetch.items():
if not os.path.exists(path) or path in not_uptodate:
todo[path] = asset
Path(path).parent.mkdir(parents=True, exist_ok=True)
else:
done.append(path)
if args.verbose and all_blocked:
import pprint
print("Packages that are blocked and why:")
pprint.pprint(all_blocked)
print(f"downloading: {len(todo)}, done: {len(done)}, "
f"blocked: {len(all_blocked)} (related builds missing)")
print("Pass --verbose to see the list of blocked packages.")
print("Pass --fetch-complete to also fetch blocked but complete packages.")
print("Pass --fetch-all to fetch all packages.")
print("Pass --delete to clear the target directory.")
def verify_file(path: str, target: str) -> None:
try:
subprocess.run(["zstd", "--quiet", "--test", path], capture_output=True, check=True, text=True)
except subprocess.CalledProcessError as e:
raise Exception(f"zstd test failed for {target!r}: {e.stderr}") from e
def fetch_item(item: tuple[str, Attachment]) -> tuple[str, Attachment]:
asset_path, asset = item
if not args.pretend:
download_asset(asset, asset_path, verify_file)
return item
with ThreadPoolExecutor(8) as executor:
for i, item in enumerate(executor.map(fetch_item, todo.items())):
print(f"[{i + 1}/{len(todo)}] {get_asset_filename(item[1])}")
print("done")
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser(
"fetch-assets", help="Download all staging packages", allow_abbrev=False)
sub.add_argument("targetdir")
sub.add_argument(
"--delete", action="store_true", help="Clear targetdir of unneeded files")
sub.add_argument(
"--verbose", action="store_true", help="Show why things are blocked")
sub.add_argument(
"--pretend", action="store_true",
help="Don't actually download, just show what would be done")
sub.add_argument(
"--fetch-all", action="store_true", help="Fetch all packages, even blocked ones")
sub.add_argument(
"--fetch-complete", action="store_true",
help="Fetch all packages, even blocked ones, except incomplete ones")
sub.add_argument(
"-t", "--build-type", action="append",
help="Only fetch packages for given build type(s) (may be used more than once)")
sub.add_argument(
"--noconfirm", action="store_true",
help="Don't require user confirmation")
sub.set_defaults(func=fetch_assets)


@@ -0,0 +1,66 @@
from typing import Any
from tabulate import tabulate
from .queue import Package, PackageStatus, get_buildqueue_with_status, get_cycles
from .utils import apply_optional_deps, gha_group
def show_cycles(pkgs: list[Package]) -> None:
cycles = get_cycles(pkgs)
if cycles:
def format_package(p: Package) -> str:
return f"{p['name']} [{p['version_repo']} -> {p['version']}]"
with gha_group(f"Dependency Cycles ({len(cycles)})"):
print(tabulate([
(format_package(a), "<-->", format_package(b)) for (a, b) in cycles],
headers=["Package", "", "Package"]))
def show_build(args: Any) -> None:
todo = []
waiting = []
done = []
failed = []
apply_optional_deps(args.optional_deps or "")
pkgs = get_buildqueue_with_status(full_details=args.details)
show_cycles(pkgs)
for pkg in pkgs:
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
details = pkg.get_status_details(build_type)
details.pop("blocked", None)
if status == PackageStatus.WAITING_FOR_BUILD:
todo.append((pkg, build_type, status, details))
elif status in (PackageStatus.FINISHED, PackageStatus.FINISHED_BUT_BLOCKED,
PackageStatus.FINISHED_BUT_INCOMPLETE):
done.append((pkg, build_type, status, details))
elif status in (PackageStatus.WAITING_FOR_DEPENDENCIES,
PackageStatus.MANUAL_BUILD_REQUIRED):
waiting.append((pkg, build_type, status, details))
else:
failed.append((pkg, build_type, status, details))
def show_table(name: str, items: list) -> None:
with gha_group(f"{name} ({len(items)})"):
print(tabulate([(p["name"], bt, p["version"], str(s), d) for (p, bt, s, d) in items],
headers=["Package", "Build", "Version", "Status", "Details"]))
show_table("TODO", todo)
show_table("WAITING", waiting)
show_table("FAILED", failed)
show_table("DONE", done)
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser(
"show", help="Show all packages to be built", allow_abbrev=False)
sub.add_argument(
"--details", action="store_true", help="Show more details such as links to failed build logs (slow)")
sub.add_argument("--optional-deps", action="store")
sub.set_defaults(func=show_build)


@@ -0,0 +1,13 @@
from typing import Any
from .queue import get_buildqueue_with_status, update_status
def run_update_status(args: Any) -> None:
update_status(get_buildqueue_with_status(full_details=True))
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser(
"update-status", help="Update the status file", allow_abbrev=False)
sub.set_defaults(func=run_update_status)


@@ -0,0 +1,65 @@
import glob
import os
from typing import Any
from .gh import get_release, get_repo_for_build_type, upload_asset
from .queue import PackageStatus, get_buildqueue_with_status
def upload_assets(args: Any) -> None:
package_name = args.package
src_dir = args.path
src_dir = os.path.abspath(src_dir)
pkgs = get_buildqueue_with_status()
if package_name is not None:
for pkg in pkgs:
if pkg["name"] == package_name:
break
else:
raise SystemExit(f"Package '{package_name}' not in the queue, check the 'show' command")
pkgs = [pkg]
pattern_entries = []
for pkg in pkgs:
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
# ignore finished packages
if status in (PackageStatus.FINISHED, PackageStatus.FINISHED_BUT_BLOCKED,
PackageStatus.FINISHED_BUT_INCOMPLETE):
continue
pattern_entries.append((build_type, pkg.get_build_patterns(build_type)))
print(f"Looking for the following files in {src_dir}:")
for build_type, patterns in pattern_entries:
for pattern in patterns:
print(" ", pattern)
matches = []
for build_type, patterns in pattern_entries:
for pattern in patterns:
for match in glob.glob(os.path.join(src_dir, pattern)):
matches.append((build_type, match))
print(f"Found {len(matches)} files...")
for build_type, match in matches:
repo = get_repo_for_build_type(build_type)
release = get_release(repo, 'staging-' + build_type)
print(f"Uploading {match}")
if not args.dry_run:
upload_asset(release, match)
print("Done")
def add_parser(subparsers: Any) -> None:
sub = subparsers.add_parser(
"upload-assets", help="Upload packages", allow_abbrev=False)
sub.add_argument("path", help="Directory to look for packages in")
sub.add_argument(
"--dry-run", action="store_true", help="Only show what is going to be uploaded")
sub.add_argument("-p", "--package", action="store", help=(
"Only upload files belonging to a particular package (pkgbase)"))
sub.set_defaults(func=upload_assets)

114
msys2_autobuild/config.py Normal file

@@ -0,0 +1,114 @@
from typing import Literal, TypeAlias
from urllib3.util import Retry
ArchType = Literal["mingw32", "mingw64", "ucrt64", "clang64", "clangarm64", "msys"]
SourceType = Literal["mingw-src", "msys-src"]
BuildType: TypeAlias = ArchType | SourceType
REQUESTS_TIMEOUT = (15, 30)
REQUESTS_RETRY = Retry(total=3, backoff_factor=1, status_forcelist=[500, 502])
def get_all_build_types() -> list[BuildType]:
all_build_types: list[BuildType] = []
all_build_types.extend(Config.MSYS_ARCH_LIST)
all_build_types.extend(Config.MINGW_ARCH_LIST)
all_build_types.append(Config.MINGW_SRC_BUILD_TYPE)
all_build_types.append(Config.MSYS_SRC_BUILD_TYPE)
return all_build_types
def build_type_is_src(build_type: BuildType) -> bool:
return build_type in [Config.MINGW_SRC_BUILD_TYPE, Config.MSYS_SRC_BUILD_TYPE]
class Config:
ALLOWED_UPLOADERS = [
"elieux",
"lazka",
"jeremyd2019",
]
"""Users that are allowed to upload assets. This is checked at download time"""
MINGW_ARCH_LIST: list[ArchType] = ["mingw32", "mingw64", "ucrt64", "clang64", "clangarm64"]
"""Arches we try to build"""
MINGW_SRC_ARCH: ArchType = "ucrt64"
"""The arch that is used to build the source package (any mingw one should work)"""
MINGW_SRC_BUILD_TYPE: BuildType = "mingw-src"
MSYS_ARCH_LIST: list[ArchType] = ["msys"]
MSYS_SRC_ARCH: ArchType = "msys"
MSYS_SRC_BUILD_TYPE: BuildType = "msys-src"
RUNNER_CONFIG: dict[BuildType, dict] = {
"msys-src": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
"max_jobs": 1,
},
"msys": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
},
"mingw-src": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
"max_jobs": 1,
},
"mingw32": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
},
"mingw64": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
},
"ucrt64": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
},
"clang64": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-2022"],
"hosted": True,
},
"clangarm64": {
"repo": "Befator-Inc-Firmen-Netzwerk/msys2-autobuild",
"labels": ["windows-11-arm"],
"hosted": True,
},
}
"""Runner config to use for each build type."""
SOFT_JOB_TIMEOUT = 60 * 60 * 3
"""Runtime after which we shouldn't start a new build"""
MAXIMUM_JOB_COUNT = 15
"""Maximum number of jobs to spawn"""
MANUAL_BUILD: list[tuple[str, list[BuildType]]] = [
]
"""Packages that take too long to build, or can't be build and should be handled manually"""
IGNORE_RDEP_PACKAGES: list[str] = [
]
"""XXX: These would in theory block rdeps, but no one fixed them, so we ignore them"""
OPTIONAL_DEPS: dict[str, list[str]] = {
"mingw-w64-headers-git": ["mingw-w64-winpthreads", "mingw-w64-tools-git"],
"mingw-w64-crt-git": ["mingw-w64-winpthreads"],
"mingw-w64-llvm": ["mingw-w64-libc++"],
}
"""XXX: In case of cycles we mark these deps as optional"""

msys2_autobuild/gh.py Normal file
@@ -0,0 +1,183 @@
import io
import os
import shutil
import sys
import tempfile
import time
import hashlib
from contextlib import contextmanager
from datetime import datetime, UTC
from functools import cache
from pathlib import Path
from typing import Any
from collections.abc import Generator, Callable
import requests
from gitea import Configuration, ApiClient, RepositoryApi, CreateReleaseOption
from gitea import Repository, Release, Attachment
from gitea.rest import ApiException
from .config import REQUESTS_TIMEOUT, BuildType, Config
from .utils import PathLike, get_requests_session
@cache
def _get_repo(name: str) -> Repository:
gitea = get_gitea()
split = name.split("/")
return gitea.repo_get(split[0], split[1])
def get_current_repo() -> Repository:
repo_full_name = os.environ.get("GITHUB_REPOSITORY", "Befator-Inc-Firmen-Netzwerk/msys2-autobuild")
return _get_repo(repo_full_name)
def get_repo_for_build_type(build_type: BuildType) -> Repository:
return _get_repo(Config.RUNNER_CONFIG[build_type]["repo"])
@cache
def get_gitea() -> RepositoryApi:
configuration = Configuration()
configuration.host = "https://git.befatorinc.de/api/v1"
configuration.api_key["Authorization"] = "token 91f6f2e72e6d64fbd0b34133efae4a6c838d0e58"
gitea = RepositoryApi(ApiClient(configuration))
return gitea
def download_text_asset(asset: Attachment, cache: bool = False) -> str:
session = get_requests_session(nocache=not cache)
with session.get(asset.browser_download_url, timeout=REQUESTS_TIMEOUT) as r:
r.raise_for_status()
return r.text
def get_asset_mtime_ns(asset: Attachment) -> int:
"""Returns the mtime of an asset in nanoseconds"""
return int(asset.created_at.timestamp() * (1000 ** 3))
def download_asset(asset: Attachment, target_path: str,
onverify: Callable[[str, str], None] | None = None) -> None:
session = get_requests_session(nocache=True)
with session.get(asset.browser_download_url, stream=True, timeout=REQUESTS_TIMEOUT) as r:
r.raise_for_status()
fd, temppath = tempfile.mkstemp()
try:
os.chmod(temppath, 0o644)
with os.fdopen(fd, "wb") as h:
for chunk in r.iter_content(256 * 1024):
h.write(chunk)
mtime_ns = get_asset_mtime_ns(asset)
os.utime(temppath, ns=(mtime_ns, mtime_ns))
if onverify is not None:
onverify(temppath, target_path)
shutil.move(temppath, target_path)
finally:
try:
os.remove(temppath)
except OSError:
pass
def get_gh_asset_name(basename: PathLike, text: bool = False) -> str:
# GitHub will throw out characters like '~' or '='. It also doesn't like
# when there is no file extension and will try to add one
return hashlib.sha256(str(basename).encode("utf-8")).hexdigest() + (".bin" if not text else ".txt")
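For illustration, the naming scheme hashes the basename and appends a fixed extension; the package filename below is made up:

```python
import hashlib

def asset_name_for(basename: str, text: bool = False) -> str:
    # same scheme as get_gh_asset_name(): sha256 hex digest plus a fixed extension
    return hashlib.sha256(basename.encode("utf-8")).hexdigest() + (".txt" if text else ".bin")

name = asset_name_for("mingw-w64-x86_64-example-1.0-1-any.pkg.tar.zst")
```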
def get_asset_filename(asset: Attachment) -> str:
return asset.name
def get_release_assets(release: Release) -> list[Attachment]:
# Note: no uploader filtering (ALLOWED_UPLOADERS) is applied here
return list(release.assets)
def upload_asset(repo: Repository, release: Release, path: PathLike, replace: bool = False,
text: bool = False, content: bytes | None = None) -> None:
gitea = get_gitea()
path = Path(path)
basename = os.path.basename(str(path))
asset_name = get_gh_asset_name(basename, text)
asset_label = basename
def can_try_upload_again() -> bool:
for asset in get_release_assets(release):
if asset_name == asset.name:
# We want to treat incomplete assets as if they weren't there
# so replace them always
if replace:
gitea.repo_delete_release_attachment(repo.owner.login, repo.name, release.id, asset.id)
break
else:
print(f"Skipping upload for {asset_name} as {asset_label}, already exists")
return False
return True
def upload() -> None:
if content is None:
gitea.repo_create_release_attachment(repo.owner.login, repo.name, release.id, name=asset_label, attachment=str(path))
else:
tmp_path = None
try:
with tempfile.NamedTemporaryFile(delete=False) as tf:
tf.write(content)
tf.flush()
tmp_path = tf.name
gitea.repo_create_release_attachment(repo.owner.login, repo.name, release.id, name=asset_label, attachment=tmp_path)
finally:
if tmp_path and os.path.exists(tmp_path):
os.remove(tmp_path)
try:
upload()
except (ApiException, requests.RequestException):
if can_try_upload_again():
upload()
print(f"Uploaded {asset_name} as {asset_label}")
def get_release(repo: Repository, name: str, create: bool = True) -> Release:
"""Like Repository.get_release() but creates the referenced release if needed"""
gitea = get_gitea()
try:
return gitea.repo_get_release_by_tag(repo.owner.login, repo.name, name)
except ApiException:
if not create:
raise
return gitea.repo_create_release(repo.owner.login, repo.name, body=CreateReleaseOption(tag_name=name, prerelease=True))
class CachedAssets:
def __init__(self) -> None:
self._assets: dict[BuildType, list[Attachment]] = {}
self._failed: dict[str, list[Attachment]] = {}
def get_assets(self, build_type: BuildType) -> list[Attachment]:
if build_type not in self._assets:
repo = get_repo_for_build_type(build_type)
release = get_release(repo, 'staging-' + build_type)
self._assets[build_type] = get_release_assets(release)
return self._assets[build_type]
def get_failed_assets(self, build_type: BuildType) -> list[Attachment]:
repo = get_repo_for_build_type(build_type)
key = repo.full_name
if key not in self._failed:
release = get_release(repo, 'staging-failed')
self._failed[key] = get_release_assets(release)
assets = self._failed[key]
# XXX: This depends on the format of the filename
return [a for a in assets if get_asset_filename(a).startswith(build_type + "-")]

msys2_autobuild/main.py Normal file
@@ -0,0 +1,41 @@
import argparse
import sys
import logging
from . import (cmd_build, cmd_clean_assets, cmd_clear_failed, cmd_fetch_assets,
cmd_show_build, cmd_update_status, cmd_upload_assets)
from .utils import install_requests_cache
def main(argv: list[str]) -> None:
parser = argparse.ArgumentParser(description="Build packages", allow_abbrev=False)
parser.add_argument(
'-v', '--verbose',
action='count',
default=0,
help='Increase verbosity (can be used multiple times)'
)
parser.set_defaults(func=lambda *x: parser.print_help())
subparsers = parser.add_subparsers(title="subcommands")
cmd_build.add_parser(subparsers)
cmd_show_build.add_parser(subparsers)
cmd_update_status.add_parser(subparsers)
cmd_fetch_assets.add_parser(subparsers)
cmd_upload_assets.add_parser(subparsers)
cmd_clear_failed.add_parser(subparsers)
cmd_clean_assets.add_parser(subparsers)
args = parser.parse_args(argv[1:])
level_map = {0: logging.WARNING, 1: logging.INFO, 2: logging.DEBUG}
logging.basicConfig(
level=level_map.get(args.verbose, logging.DEBUG),
handlers=[logging.StreamHandler(sys.stderr)],
format='[%(asctime)s] [%(levelname)8s] [%(name)s:%(module)s:%(lineno)d] %(message)s',
datefmt='%Y-%m-%d %H:%M:%S')
with install_requests_cache():
args.func(args)
def run() -> None:
return main(sys.argv)

msys2_autobuild/queue.py Normal file
@@ -0,0 +1,464 @@
import fnmatch
import io
import json
import tempfile
import os
from concurrent.futures import ThreadPoolExecutor
from enum import Enum
from typing import Any, cast
import requests
from gitea.rest import ApiException
from .config import (REQUESTS_TIMEOUT, ArchType, BuildType, Config,
build_type_is_src, get_all_build_types)
from .gh import (CachedAssets, download_text_asset, get_asset_filename,
get_current_repo, get_release,
get_gitea)
from .utils import get_requests_session, queue_website_update
class PackageStatus(Enum):
FINISHED = 'finished'
FINISHED_BUT_BLOCKED = 'finished-but-blocked'
FINISHED_BUT_INCOMPLETE = 'finished-but-incomplete'
FAILED_TO_BUILD = 'failed-to-build'
WAITING_FOR_BUILD = 'waiting-for-build'
WAITING_FOR_DEPENDENCIES = 'waiting-for-dependencies'
MANUAL_BUILD_REQUIRED = 'manual-build-required'
UNKNOWN = 'unknown'
def __str__(self) -> str:
return self.value
class Package(dict):
def __repr__(self) -> str:
return "Package({!r})".format(self["name"])
def __hash__(self) -> int: # type: ignore
return id(self)
def __eq__(self, other: object) -> bool:
return self is other
@property
def _active_builds(self) -> dict:
return {
k: v for k, v in self["builds"].items() if k in (Config.MINGW_ARCH_LIST + Config.MSYS_ARCH_LIST)}
def _get_build(self, build_type: BuildType) -> dict:
return self["builds"].get(build_type, {})
def get_status(self, build_type: BuildType) -> PackageStatus:
build = self._get_build(build_type)
return build.get("status", PackageStatus.UNKNOWN)
def get_status_details(self, build_type: BuildType) -> dict[str, Any]:
build = self._get_build(build_type)
return dict(build.get("status_details", {}))
def set_status(self, build_type: BuildType, status: PackageStatus,
description: str | None = None,
urls: dict[str, str] | None = None) -> None:
build = self["builds"].setdefault(build_type, {})
build["status"] = status
meta: dict[str, Any] = {}
meta["desc"] = description
if urls is None:
urls = {}
meta["urls"] = urls
build["status_details"] = meta
def set_blocked(
self, build_type: BuildType, status: PackageStatus,
dep: "Package", dep_type: BuildType) -> None:
dep_details = dep.get_status_details(dep_type)
dep_blocked = dep_details.get("blocked", {})
details = self.get_status_details(build_type)
blocked = details.get("blocked", {})
if dep_blocked:
blocked = dict(dep_blocked)
else:
blocked.setdefault(dep, set()).add(dep_type)
descs = []
for pkg, types in blocked.items():
descs.append("{} ({})".format(pkg["name"], "/".join(types)))
self.set_status(build_type, status, "Blocked by: " + ", ".join(descs))
build = self._get_build(build_type)
build.setdefault("status_details", {})["blocked"] = blocked
def is_new(self, build_type: BuildType) -> bool:
build = self._get_build(build_type)
return build.get("new", False)
def get_build_patterns(self, build_type: BuildType) -> list[str]:
patterns = []
if build_type_is_src(build_type):
patterns.append(f"{self['name']}-{self['version']}.src.tar.[!s]*")
elif build_type in (Config.MINGW_ARCH_LIST + Config.MSYS_ARCH_LIST):
for item in self._get_build(build_type).get('packages', []):
patterns.append(f"{item}-{self['version']}-*.pkg.tar.zst")
else:
assert 0
return patterns
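The `[!s]*` suffix in the source pattern above presumably keeps compressed tarballs while skipping detached `.sig` files; fnmatch shows the effect (filenames invented for the example):

```python
import fnmatch

names = [
    "pkg-1.0-1.src.tar.zst",   # matches: 'z' is not 's'
    "pkg-1.0-1.src.tar.gz",    # matches: 'g' is not 's'
    "pkg-1.0-1.src.tar.sig",   # skipped: first char after 'tar.' is 's'
]
matched = fnmatch.filter(names, "pkg-1.0-1.src.tar.[!s]*")
```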
def get_failed_name(self, build_type: BuildType) -> str:
return f"{build_type}-{self['name']}-{self['version']}.failed"
def get_build_types(self) -> list[BuildType]:
build_types = list(self._active_builds)
if self["source"]:
if any((k in Config.MINGW_ARCH_LIST) for k in build_types):
build_types.append(Config.MINGW_SRC_BUILD_TYPE)
if any((k in Config.MSYS_ARCH_LIST) for k in build_types):
build_types.append(Config.MSYS_SRC_BUILD_TYPE)
return build_types
def _get_dep_build(self, build_type: BuildType) -> dict:
if build_type == Config.MINGW_SRC_BUILD_TYPE:
build_type = Config.MINGW_SRC_ARCH
elif build_type == Config.MSYS_SRC_BUILD_TYPE:
build_type = Config.MSYS_SRC_ARCH
return self._get_build(build_type)
def is_optional_dep(self, dep: "Package", dep_type: BuildType) -> bool:
# Some deps are manually marked as optional to break cycles.
# This requires them to be in the main repo though, otherwise the cycle has to
# be fixed manually.
return dep["name"] in Config.OPTIONAL_DEPS.get(self["name"], []) and not dep.is_new(dep_type)
def get_depends(self, build_type: BuildType) -> "dict[ArchType, set[Package]]":
build = self._get_dep_build(build_type)
return build.get('ext-depends', {})
def get_rdepends(self, build_type: BuildType) -> "dict[ArchType, set[Package]]":
build = self._get_dep_build(build_type)
return build.get('ext-rdepends', {})
def get_buildqueue() -> list[Package]:
session = get_requests_session()
r = session.get("http://localhost:8160/api/buildqueue2", timeout=REQUESTS_TIMEOUT)
r.raise_for_status()
return parse_buildqueue(r.text)
def parse_buildqueue(payload: str) -> list[Package]:
pkgs = []
for received in json.loads(payload):
pkg = Package(received)
pkg['repo'] = pkg['repo_url'].split('/')[-1]
pkgs.append(pkg)
# extract the package mapping
dep_mapping = {}
for pkg in pkgs:
for build in pkg._active_builds.values():
for name in build['packages']:
dep_mapping[name] = pkg
# link up dependencies with the real package in the queue
for pkg in pkgs:
for build in pkg._active_builds.values():
ver_depends: dict[str, set[Package]] = {}
for repo, deps in build['depends'].items():
for dep in deps:
ver_depends.setdefault(repo, set()).add(dep_mapping[dep])
build['ext-depends'] = ver_depends
# reverse dependencies
for pkg in pkgs:
for build in pkg._active_builds.values():
r_depends: dict[str, set[Package]] = {}
for pkg2 in pkgs:
for r_repo, build2 in pkg2._active_builds.items():
for repo, deps in build2['ext-depends'].items():
if pkg in deps:
r_depends.setdefault(r_repo, set()).add(pkg2)
build['ext-rdepends'] = r_depends
return pkgs
def get_cycles(pkgs: list[Package]) -> set[tuple[Package, Package]]:
cycles: set[tuple[Package, Package]] = set()
# In case the package is already built it doesn't matter if it is part of a cycle
def pkg_is_finished(pkg: Package, build_type: BuildType) -> bool:
return pkg.get_status(build_type) in [
PackageStatus.FINISHED,
PackageStatus.FINISHED_BUT_BLOCKED,
PackageStatus.FINISHED_BUT_INCOMPLETE,
]
# Transitive dependencies of a package. Excluding branches where a root is finished
def get_buildqueue_deps(pkg: Package, build_type: ArchType) -> "dict[ArchType, set[Package]]":
start = (build_type, pkg)
todo = set([start])
done = set()
result = set()
while todo:
build_type, pkg = todo.pop()
item = (build_type, pkg)
done.add(item)
if pkg_is_finished(pkg, build_type):
continue
result.add(item)
for dep_build_type, deps in pkg.get_depends(build_type).items():
for dep in deps:
dep_item = (dep_build_type, dep)
if dep_item not in done:
todo.add(dep_item)
result.discard(start)
d: dict[ArchType, set[Package]] = {}
for build_type, pkg in result:
d.setdefault(build_type, set()).add(pkg)
return d
for pkg in pkgs:
for build_type in pkg.get_build_types():
if build_type_is_src(build_type):
continue
build_type = cast(ArchType, build_type)
for dep_build_type, deps in get_buildqueue_deps(pkg, build_type).items():
for dep in deps:
# manually broken cycle
if pkg.is_optional_dep(dep, dep_build_type) or dep.is_optional_dep(pkg, build_type):
continue
dep_deps = get_buildqueue_deps(dep, dep_build_type)
if pkg in dep_deps.get(build_type, set()):
cycles.add(tuple(sorted([pkg, dep], key=lambda p: p["name"]))) # type: ignore
return cycles
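The mutual-dependency test above reduces to: two packages form a cycle when each appears in the other's transitive (unfinished) dependency set. A toy restatement with invented package names:

```python
# toy dependency graph: a and b depend on each other, c depends on nothing
deps = {"a": {"b"}, "b": {"a"}, "c": set()}

def transitive(pkg: str) -> set[str]:
    # breadth-first walk collecting everything reachable from pkg
    todo, seen = {pkg}, set()
    while todo:
        for dep in deps[todo.pop()]:
            if dep not in seen:
                seen.add(dep)
                todo.add(dep)
    return seen

# a pair is a cycle if p depends on q and p is reachable from q
cycles = {tuple(sorted((p, q))) for p in deps for q in deps[p] if p in transitive(q)}
```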
def get_buildqueue_with_status(full_details: bool = False) -> list[Package]:
cached_assets = CachedAssets()
assets_failed = []
for build_type in get_all_build_types():
assets_failed.extend(cached_assets.get_failed_assets(build_type))
failed_urls = {}
if full_details:
# This might take a while, so only in full mode
with ThreadPoolExecutor(8) as executor:
for i, (asset, content) in enumerate(
zip(assets_failed, executor.map(download_text_asset, assets_failed))):
result = json.loads(content)
# No more GitHub Actions URLs on Gitea
# if result["urls"]:
#     failed_urls[get_asset_filename(asset)] = result["urls"]
def pkg_is_done(build_type: BuildType, pkg: Package) -> bool:
done_names = [get_asset_filename(a) for a in cached_assets.get_assets(build_type)]
for pattern in pkg.get_build_patterns(build_type):
if not fnmatch.filter(done_names, pattern):
return False
return True
def get_failed_urls(build_type: BuildType, pkg: Package) -> dict[str, str] | None:
failed_names = [get_asset_filename(a) for a in assets_failed]
name = pkg.get_failed_name(build_type)
if name in failed_names:
return failed_urls.get(name)
return None
def pkg_has_failed(build_type: BuildType, pkg: Package) -> bool:
failed_names = [get_asset_filename(a) for a in assets_failed]
name = pkg.get_failed_name(build_type)
return name in failed_names
def pkg_is_manual(build_type: BuildType, pkg: Package) -> bool:
if build_type_is_src(build_type):
return False
for pattern, types in Config.MANUAL_BUILD:
type_matches = not types or build_type in types
if type_matches and fnmatch.fnmatchcase(pkg['name'], pattern):
return True
return False
pkgs = get_buildqueue()
# basic state
for pkg in pkgs:
for build_type in pkg.get_build_types():
if pkg_is_done(build_type, pkg):
pkg.set_status(build_type, PackageStatus.FINISHED)
elif pkg_has_failed(build_type, pkg):
urls = get_failed_urls(build_type, pkg)
pkg.set_status(build_type, PackageStatus.FAILED_TO_BUILD, urls=urls)
elif pkg_is_manual(build_type, pkg):
pkg.set_status(build_type, PackageStatus.MANUAL_BUILD_REQUIRED)
else:
pkg.set_status(build_type, PackageStatus.WAITING_FOR_BUILD)
# wait for dependencies to be finished before starting a build
for pkg in pkgs:
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
if status == PackageStatus.WAITING_FOR_BUILD:
for dep_type, deps in pkg.get_depends(build_type).items():
for dep in deps:
dep_status = dep.get_status(dep_type)
if dep_status != PackageStatus.FINISHED:
if pkg.is_optional_dep(dep, dep_type):
continue
pkg.set_blocked(
build_type, PackageStatus.WAITING_FOR_DEPENDENCIES, dep, dep_type)
# Block packages where not all deps/rdeps/related are finished
changed = True
while changed:
changed = False
for pkg in pkgs:
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
if status == PackageStatus.FINISHED:
# src builds are independent
if build_type_is_src(build_type):
continue
for dep_type, deps in pkg.get_depends(build_type).items():
for dep in deps:
dep_status = dep.get_status(dep_type)
if dep_status != PackageStatus.FINISHED:
pkg.set_blocked(
build_type, PackageStatus.FINISHED_BUT_BLOCKED, dep, dep_type)
changed = True
for dep_type, deps in pkg.get_rdepends(build_type).items():
for dep in deps:
if dep["name"] in Config.IGNORE_RDEP_PACKAGES:
continue
dep_status = dep.get_status(dep_type)
dep_new = dep.is_new(dep_type)
# if the rdep isn't in the repo we can't break it by uploading
if dep_status != PackageStatus.FINISHED and not dep_new:
pkg.set_blocked(
build_type, PackageStatus.FINISHED_BUT_BLOCKED, dep, dep_type)
changed = True
# Block packages where not every build type is finished
for pkg in pkgs:
unfinished = []
blocked = []
finished = []
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
if status != PackageStatus.FINISHED:
if status == PackageStatus.FINISHED_BUT_BLOCKED:
blocked.append(build_type)
# if the package isn't in the repo better not block on it
elif not pkg.is_new(build_type):
unfinished.append(build_type)
else:
finished.append(build_type)
# We track source packages by assuming they are in the repo if there is
# at least one binary package in the repo. Uploading lone source
# packages will not change anything, so block them.
if not blocked and not unfinished and finished and \
all(build_type_is_src(bt) for bt in finished):
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
if status in (PackageStatus.FINISHED, PackageStatus.FINISHED_BUT_BLOCKED):
changed = True
pkg.set_status(build_type, PackageStatus.FINISHED_BUT_INCOMPLETE)
elif unfinished:
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
if status in (PackageStatus.FINISHED, PackageStatus.FINISHED_BUT_BLOCKED):
changed = True
for bt in unfinished:
pkg.set_blocked(build_type, PackageStatus.FINISHED_BUT_INCOMPLETE, pkg, bt)
elif blocked:
for build_type in pkg.get_build_types():
status = pkg.get_status(build_type)
if status == PackageStatus.FINISHED:
changed = True
for bt in blocked:
pkg.set_blocked(build_type, PackageStatus.FINISHED_BUT_BLOCKED, pkg, bt)
return pkgs
def update_status(pkgs: list[Package]) -> None:
repo = get_current_repo()
release = get_release(repo, "status")
status_object: dict[str, Any] = {}
packages = []
for pkg in pkgs:
pkg_result = {}
pkg_result["name"] = pkg["name"]
pkg_result["version"] = pkg["version"]
builds = {}
for build_type in pkg.get_build_types():
details = pkg.get_status_details(build_type)
details.pop("blocked", None)
details["status"] = pkg.get_status(build_type).value
builds[build_type] = details
pkg_result["builds"] = builds
packages.append(pkg_result)
status_object["packages"] = packages
cycles = []
for a, b in get_cycles(pkgs):
cycles.append([a["name"], b["name"]])
status_object["cycles"] = sorted(cycles)
content = json.dumps(status_object, indent=2).encode()
# If multiple jobs update this at the same time things can fail,
# assume the other one went through and just ignore all errors
try:
asset = None
asset_name = "status.json"
for a in release.assets:
if a.name == asset_name:
asset = a
break
do_replace = True
# Avoid uploading the same file twice, to reduce API write calls
if asset is not None and asset.size == len(content):
try:
old_content = download_text_asset(asset, cache=True)
if old_content == content.decode():
do_replace = False
except requests.RequestException:
# github sometimes returns 404 for a short time after uploading
pass
if do_replace:
if asset is not None:
gitea = get_gitea()
gitea.repo_delete_release_attachment(repo.owner.login, repo.name, release.id, asset.id)
tmp_path = None
try:
with tempfile.NamedTemporaryFile(delete=False) as tf:
tf.write(content)
tf.flush()
tmp_path = tf.name
gitea = get_gitea()
new_asset = gitea.repo_create_release_attachment(repo.owner.login, repo.name, release.id, name=asset_name, attachment=tmp_path)
finally:
if tmp_path and os.path.exists(tmp_path):
os.remove(tmp_path)
print(f"Uploaded status file for {len(packages)} packages: {new_asset.browser_download_url}")
queue_website_update()
else:
print("Status unchanged")
except (ApiException, requests.RequestException) as e:
print(e)

msys2_autobuild/utils.py Normal file
@@ -0,0 +1,122 @@
import os
from contextlib import contextmanager
from datetime import timedelta
from functools import cache
from typing import Any, AnyStr, TypeAlias
from collections.abc import Generator
import requests
from requests.adapters import HTTPAdapter
from .config import REQUESTS_RETRY, REQUESTS_TIMEOUT, Config
PathLike: TypeAlias = os.PathLike | AnyStr
SCRIPT_DIR = os.path.dirname(os.path.realpath(__file__))
def requests_cache_disabled() -> Any:
import requests_cache
return requests_cache.disabled()
@cache
def get_requests_session(nocache: bool = False) -> requests.Session:
adapter = HTTPAdapter(max_retries=REQUESTS_RETRY)
if nocache:
with requests_cache_disabled():
http = requests.Session()
else:
http = requests.Session()
http.mount("https://", adapter)
http.mount("http://", adapter)
return http
@contextmanager
def install_requests_cache() -> Generator:
# This adds basic etag based caching, to avoid hitting API rate limiting
import requests_cache
from requests_cache.backends.sqlite import SQLiteCache
# Monkey patch globally, so pygithub uses it as well.
# Only do re-validation with etag/date etc and ignore the cache-control headers that
# github sends by default with 60 seconds.
cache_dir = os.path.join(os.getcwd(), '.autobuild_cache')
os.makedirs(cache_dir, exist_ok=True)
cache_file = f'http_cache_{requests_cache.__version__}.sqlite'
# delete other versions
for f in os.listdir(cache_dir):
if f.startswith('http_cache') and f != cache_file:
os.remove(os.path.join(cache_dir, f))
requests_cache.install_cache(
always_revalidate=True,
cache_control=False,
expire_after=requests_cache.EXPIRE_IMMEDIATELY,
backend=SQLiteCache(os.path.join(cache_dir, cache_file)))
# Call this once, so it gets cached from the main thread and can be used in a thread pool
get_requests_session(nocache=True)
try:
yield
finally:
# Delete old cache entries, so this doesn't grow indefinitely
cache = requests_cache.get_cache()
assert cache is not None
cache.delete(older_than=timedelta(hours=3))
# un-monkey-patch again
requests_cache.uninstall_cache()
@contextmanager
def gha_group(title: str) -> Generator:
print(f'\n::group::{title}')
try:
yield
finally:
print('::endgroup::')
def queue_website_update() -> None:
session = get_requests_session()
r = session.post('https://packages.msys2.org/api/trigger_update', timeout=REQUESTS_TIMEOUT)
try:
# it's not worth stopping the build if this fails, so just log it
r.raise_for_status()
except requests.RequestException as e:
print(e)
def parse_optional_deps(optional_deps: str) -> dict[str, list[str]]:
res: dict[str, list[str]] = {}
optional_deps = optional_deps.replace(" ", "")
if not optional_deps:
return res
for entry in optional_deps.split(","):
assert ":" in entry
first, second = entry.split(":", 2)
res.setdefault(first, []).append(second)
return res
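The format is a comma-separated list of `pkg:optional-dep` pairs; spaces are stripped and repeated keys accumulate. A self-contained restatement of the parsing rule:

```python
def parse_pairs(spec: str) -> dict[str, list[str]]:
    # same parsing rule as parse_optional_deps() above
    result: dict[str, list[str]] = {}
    for entry in spec.replace(" ", "").split(","):
        if not entry:
            continue
        pkg, dep = entry.split(":", 1)
        result.setdefault(pkg, []).append(dep)
    return result

parsed = parse_pairs("a:b, c:d, a:x")
```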
def apply_optional_deps(optional_deps: str) -> None:
for dep, ignored in parse_optional_deps(optional_deps).items():
Config.OPTIONAL_DEPS.setdefault(dep, []).extend(ignored)
def ask_yes_no(prompt: str, default_no: bool = True) -> bool:
"""Ask a yes/no question via input() and return the answer as a bool."""
if default_no:
prompt += " [y/N] "
else:
prompt += " [Y/n] "
user_input = input(prompt).strip().lower()
if not user_input:
return False if default_no else True
else:
return user_input == 'y'

poetry.lock generated
File diff suppressed because it is too large
pyproject.toml
@@ -1,21 +1,32 @@
[tool.poetry]
[project]
name = "msys2-autobuild"
version = "0.1.0"
description = ""
authors = ["Christoph Reiter <reiter.christoph@gmail.com>"]
license = "MIT"
authors = [
{ name = "Christoph Reiter", email = "reiter.christoph@gmail.com" }
]
requires-python = ">=3.12.0,<4.0"
dependencies = [
"PyGithub>=2.8.1,<3",
"tabulate>=0.9.0,<0.10",
"requests>=2.28.1,<3",
"requests-cache>=1.0.0,<2",
"urllib3>=2.2.1,<3",
]
[tool.poetry.dependencies]
python = "^3.8"
PyGithub = "^1.54.1"
tabulate = "^0.8.7"
requests = "^2.25.1"
[project.scripts]
msys2-autobuild = "msys2_autobuild.main:run"
[tool.poetry.dev-dependencies]
mypy = "^0.910"
flake8 = "^3.8.4"
types-tabulate = "^0.8.2"
types-requests = "^2.25.0"
[dependency-groups]
dev = [
"pytest>=8.0.0,<9",
"mypy==1.18.1",
"flake8>=7.0.0,<8",
"types-tabulate>=0.9.0.0,<0.10",
"types-requests>=2.25.0,<3",
]
[build-system]
requires = ["poetry-core>=1.0.0"]
requires = ["poetry-core>=2.2.0"]
build-backend = "poetry.core.masonry.api"

requirements.txt
@@ -1,14 +1,18 @@
certifi==2021.5.30; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version >= "3.6"
cffi==1.14.6; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
charset-normalizer==2.0.4; python_full_version >= "3.6.0" and python_version >= "3.6"
deprecated==1.2.13; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
idna==3.2; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version >= "3.6"
pycparser==2.20; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
pygithub==1.55; python_version >= "3.6"
pyjwt==2.1.0; python_version >= "3.6"
pynacl==1.4.0; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
requests==2.26.0; (python_version >= "2.7" and python_full_version < "3.0.0") or (python_full_version >= "3.6.0")
six==1.16.0; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
tabulate==0.8.9
urllib3==1.26.6; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version < "4" and python_version >= "3.6"
wrapt==1.12.1; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
attrs==25.3.0 ; python_version >= "3.12" and python_version < "4.0"
cattrs==25.2.0 ; python_version >= "3.12" and python_version < "4.0"
certifi==2025.8.3 ; python_version >= "3.12" and python_version < "4.0"
cffi==2.0.0 ; python_version >= "3.12" and python_version < "4.0" and platform_python_implementation != "PyPy"
charset-normalizer==3.4.3 ; python_version >= "3.12" and python_version < "4.0"
cryptography==46.0.1 ; python_version >= "3.12" and python_version < "4.0"
idna==3.10 ; python_version >= "3.12" and python_version < "4.0"
platformdirs==4.4.0 ; python_version >= "3.12" and python_version < "4.0"
pycparser==2.23 ; python_version >= "3.12" and python_version < "4.0" and platform_python_implementation != "PyPy" and implementation_name != "PyPy"
pygithub==2.8.1 ; python_version >= "3.12" and python_version < "4.0"
pyjwt==2.10.1 ; python_version >= "3.12" and python_version < "4.0"
pynacl==1.6.0 ; python_version >= "3.12" and python_version < "4.0"
requests-cache==1.2.1 ; python_version >= "3.12" and python_version < "4.0"
requests==2.32.5 ; python_version >= "3.12" and python_version < "4.0"
tabulate==0.9.0 ; python_version >= "3.12" and python_version < "4.0"
typing-extensions==4.15.0 ; python_version >= "3.12" and python_version < "4.0"
url-normalize==2.2.1 ; python_version >= "3.12" and python_version < "4.0"
urllib3==2.5.0 ; python_version >= "3.12" and python_version < "4.0"

@@ -1,2 +0,0 @@
[flake8]
max-line-length = 110

tests/__init__.py Normal file
tests/main_test.py Normal file
@@ -0,0 +1,140 @@
# type: ignore
import os
import stat
import tempfile
from pathlib import Path
from msys2_autobuild.utils import parse_optional_deps
from msys2_autobuild.queue import parse_buildqueue, get_cycles
from msys2_autobuild.build import make_tree_writable, remove_junctions
def test_make_tree_writable():
with tempfile.TemporaryDirectory() as tempdir:
nested_dir = Path(tempdir) / "nested"
nested_junction = nested_dir / "junction"
nested_dir.mkdir()
file_path = nested_dir / "test_file.txt"
file_path.write_text("content")
# Create a junction loop if possible, to make sure we ignore it
if os.name == 'nt':
import _winapi
_winapi.CreateJunction(str(nested_dir), str(nested_junction))
else:
nested_junction.mkdir()
# Remove permissions
for p in [tempdir, nested_dir, file_path, nested_junction]:
os.chmod(p, os.stat(p).st_mode & ~stat.S_IWRITE & ~stat.S_IREAD)
make_tree_writable(tempdir)
assert os.access(tempdir, os.W_OK) and os.access(tempdir, os.R_OK)
assert os.access(nested_dir, os.W_OK) and os.access(nested_dir, os.R_OK)
assert os.access(file_path, os.W_OK) and os.access(file_path, os.R_OK)
assert os.access(nested_junction, os.W_OK) and os.access(nested_junction, os.R_OK)
def test_remove_junctions():
with tempfile.TemporaryDirectory() as tempdir:
nested_dir = Path(tempdir) / "nested"
nested_junction = nested_dir / "junction"
nested_dir.mkdir()
# Create a junction loop if possible, to make sure we ignore it
if os.name == 'nt':
import _winapi
_winapi.CreateJunction(str(nested_dir), str(nested_junction))
assert nested_junction.exists()
assert os.path.isjunction(nested_junction)
remove_junctions(tempdir)
assert not nested_junction.exists()
def test_parse_optional_deps():
assert parse_optional_deps("a:b,c:d,a:x") == {'a': ['b', 'x'], 'c': ['d']}
def test_get_cycles():
buildqueue = """
[
{
"name": "c-ares",
"version": "1.34.2-1",
"version_repo": "1.33.1-1",
"repo_url": "https://github.com/msys2/MSYS2-packages",
"repo_path": "c-ares",
"source": true,
"builds": {
"msys": {
"packages": [
"libcares",
"libcares-devel"
],
"depends": {
"msys": [
"libnghttp2",
"libuv"
]
},
"new": false
}
}
},
{
"name": "nghttp2",
"version": "1.64.0-1",
"version_repo": "1.63.0-1",
"repo_url": "https://github.com/msys2/MSYS2-packages",
"repo_path": "nghttp2",
"source": true,
"builds": {
"msys": {
"packages": [
"libnghttp2",
"libnghttp2-devel",
"nghttp2"
],
"depends": {
"msys": [
"libcares",
"libcares-devel"
]
},
"new": false
}
}
},
{
"name": "libuv",
"version": "1.49.2-1",
"version_repo": "1.49.1-1",
"repo_url": "https://github.com/msys2/MSYS2-packages",
"repo_path": "libuv",
"source": true,
"builds": {
"msys": {
"packages": [
"libuv",
"libuv-devel"
],
"depends": {
"msys": [
"libnghttp2"
]
},
"new": false
}
}
}
]"""
pkgs = parse_buildqueue(buildqueue)
cycles = get_cycles(pkgs)
assert len(cycles) == 3
assert (pkgs[0], pkgs[2]) in cycles
assert (pkgs[0], pkgs[1]) in cycles
assert (pkgs[2], pkgs[1]) in cycles

update-status.bat Normal file
@@ -0,0 +1,2 @@
@echo off
C:\msys64\msys2_shell.cmd -here -mingw64 -no-start -defterm -c "pacman --needed --noconfirm -S mingw-w64-x86_64-python-tabulate mingw-w64-x86_64-python-requests-cache && python -m msys2_autobuild update-status"