Compare commits

...

79 Commits
v1.0.4 ... r1

Author SHA1 Message Date
Patrick Lehmann
1c42072471 v1.1.0 2024-09-27 22:00:06 +02:00
Patrick Lehmann
bf6ba9ba19 v1.1.0 2024-09-27 21:36:25 +02:00
Patrick Lehmann
93cdeb9cba Break system packages. 2024-09-26 23:19:51 +02:00
Patrick Lehmann
72a8705e6c added pyyaml as a special MSYS2 maintained Python package. 2024-09-26 08:15:31 +02:00
Patrick Lehmann
ea96cce0d1 Allow selection of dorny/test-reporter's report name. 2024-09-23 00:04:32 +02:00
Patrick Lehmann
59ce0fa84a Bumped dependencies. 2024-09-19 23:54:05 +02:00
Patrick Lehmann
c8362d99cc Report errors also to Pipeline message log. 2024-09-11 22:08:08 +02:00
Patrick Lehmann
0e9d878f0e Check downloaded artifacts for XML files. 2024-09-09 23:38:24 +02:00
Patrick Lehmann
74afc5a42a Merge branch 'main' into r1 2024-08-06 10:33:22 +02:00
Patrick Lehmann
5d67896606 Fixed next expected parameter set.
(cherry picked from commit 188feb556b)
2024-08-06 10:30:50 +02:00
Patrick Lehmann
4b058faf3e Improved actual vs. expected prints.
(cherry picked from commit d58db55086)
2024-08-06 10:30:50 +02:00
Patrick Lehmann
474a8024d1 Remove macOS with Python 3.8, 3.9 from expected list.
(cherry picked from commit ee9a3fbdcd)
2024-08-06 10:30:49 +02:00
Patrick Lehmann
5dc19a5d65 Merge branch 'cherry-picking' 2024-08-06 10:12:18 +02:00
Patrick Lehmann
188feb556b Fixed next expected parameter set. 2024-08-06 10:11:02 +02:00
Patrick Lehmann
d58db55086 Improved actual vs. expected prints. 2024-08-06 10:05:14 +02:00
Patrick Lehmann
ee9a3fbdcd Remove macOS with Python 3.8, 3.9 from expected list. 2024-08-06 09:54:41 +02:00
Patrick Lehmann
8dfc484c42 Merge branch 'cherry-picking' 2024-08-06 09:44:19 +02:00
Patrick Lehmann
960b7089e7 Fixed import problem.
(cherry picked from commit 33b99a3b4e)
2024-08-06 09:44:05 +02:00
Patrick Lehmann
706ef39595 Merge branch 'fix-releaser' 2024-08-06 09:34:17 +02:00
Patrick Lehmann
04881fc4ca fix(releaser): Use '--break-system-packages' only for Ubuntu 24.04.
(cherry picked from commit e444e57112)
2024-08-06 09:33:12 +02:00
Patrick Lehmann
e444e57112 fix(releaser): Use '--break-system-packages' only for Ubuntu 24.04. 2024-08-06 09:29:43 +02:00
Patrick Lehmann
cea83bc2ae Write GitHub errors for documentation checks. 2024-08-05 23:10:38 +02:00
Patrick Lehmann
440553e7fb Added before scripts for macOS (ARM). 2024-08-03 07:29:29 +02:00
Patrick Lehmann
26461822b5 Remove hotfix for Homebrew and GHDL. 2024-08-02 20:54:57 +02:00
Patrick Lehmann
7a341dbe8f Allow system to break. 2024-08-02 08:33:06 +02:00
Patrick Lehmann
33b99a3b4e Fixed import problem. Fixed pytest rewrite rules. 2024-08-02 08:29:03 +02:00
Patrick Lehmann
5e0aa52e5d Enhanced PR template. 2024-08-02 07:47:00 +02:00
Patrick Lehmann
2862238ee5 Allow extended exclude and disable patterns. 2024-08-02 07:40:55 +02:00
Patrick Lehmann
ebd20f5aea Disabled macOS x86-64 (macOS Intel) images, because it's not part of the free plan at GitHub. 2024-08-02 07:38:22 +02:00
Patrick Lehmann
2004711d48 Support Intel and ARM platforms for macOS. 2024-08-01 11:19:58 +02:00
Patrick Lehmann
02d386a9e1 Workaround for Ubuntu 2024.04 2024-08-01 10:57:44 +02:00
Patrick Lehmann
e0af5055a8 Debugging DYLD_LIBRARY_PATH on macOS 2024-07-31 00:40:42 +02:00
Patrick Lehmann
cc1dade947 Added MinGW64 2024-07-30 11:08:48 +02:00
Patrick Lehmann
b87d11502b Added UCRT64 before scripts. 2024-07-30 10:42:47 +02:00
Silverlan
fa96ee9197 fix(releaser): fix failure to install PyGithub
This fixes the error message:
× This environment is externally managed

(cherry picked from commit 7879c05ab7)
2024-07-30 07:37:43 +02:00
Patrick Lehmann
2e5a79e0c2 Merge remote-tracking branch 'github/main' into r1 2024-07-30 07:35:49 +02:00
Patrick Lehmann
0495bfb18c fix(releaser): fix failure to install PyGithub (#81) 2024-07-30 07:32:15 +02:00
Patrick Lehmann
f62d5d93ea Fixed typo. 2024-07-30 02:12:23 +02:00
Patrick Lehmann
13c1a56f92 Upgrade to Ubuntu 2024.04 as GitHub is stuck with Ubuntu-latest at 2022.04. 2024-07-30 02:06:42 +02:00
Patrick Lehmann
da3cdbe96a Allow installing packages using brew. 2024-07-30 01:54:05 +02:00
Patrick Lehmann
5fe793e3fa Allow installing additional packaged via apt and allow running before scripts for ubuntu and macos. 2024-07-30 01:39:20 +02:00
Patrick Lehmann
c38ff2af3c Added embedded Python code as standalone files for debugging. 2024-07-29 23:32:45 +02:00
umarcor
98f0fffaf6 with-post-step: use Node.js 20 instead of Node.js 16
(cherry picked from commit 0c1e72cfd6)
2024-07-24 07:14:54 +02:00
Patrick Lehmann
0fef6f8a4d Bumped dependencies. 2024-07-24 06:56:22 +02:00
Patrick Lehmann
92ce834303 Added input parameter 'fail_under' for CheckDocumentation. 2024-06-19 00:26:24 +02:00
Patrick Lehmann
607637b278 Limit interrogate to a directory. 2024-06-18 22:51:29 +02:00
Patrick Lehmann
dfc9221529 Also upload unit test results in case of errors. 2024-06-18 07:28:50 +02:00
Patrick Lehmann
d4afc820ab Adjusted filter expression in find for PublishTestResults. 2024-06-16 23:35:06 +02:00
Patrick Lehmann
ae13aa2dff Updated usage of pyedaa-reports. 2024-06-04 01:14:51 +02:00
Silverlan
7879c05ab7 fix(releaser): fix failure to install PyGithub
This fixes the error message:
× This environment is externally managed
2024-05-28 17:39:29 +02:00
Patrick Lehmann
df4815f666 Added pytest_cleanup variable. 2024-05-07 00:02:02 +02:00
Patrick Lehmann
8b7a8009a6 Cleanup pytest results. 2024-05-06 00:20:26 +02:00
Patrick Lehmann
6b4af68fa4 Bumped dependencies. 2024-05-05 20:27:37 +02:00
Patrick Lehmann
0db1821658 Revert "Added a minimum Python version field, so unsupported Python versions (e.g. for macOS) can be disabled."
This reverts commit 461931099a.
2024-04-25 22:11:15 +02:00
Patrick Lehmann
6d84311338 Revert "Also handle pypy versions."
This reverts commit be27e58d8c.
2024-04-25 22:09:29 +02:00
Patrick Lehmann
4406abe788 Merge unit test results using pyEDAA.Reports. 2024-04-25 07:33:36 +02:00
Patrick Lehmann
db99e35dec v1.0.5 2024-04-24 23:07:34 +02:00
Patrick Lehmann
e9d0dc3dba v1.0.5 2024-04-24 23:06:13 +02:00
Patrick Lehmann
f9a74102d9 Avoid warning from geekyeggo/delete-artifact@v5. 2024-04-24 23:03:40 +02:00
Patrick Lehmann
b33e0f2782 Bumped dependencies. 2024-04-24 23:03:08 +02:00
Patrick Lehmann
6cfc6e0f8f Merge branch 'main' into r1 2024-04-24 00:42:05 +02:00
Patrick Lehmann
5adddda1a1 Merge remote-tracking branch 'github/main' into r1 2024-04-24 00:32:50 +02:00
Patrick Lehmann
91289c4257 v1.0.1 2024-02-01 00:24:24 +01:00
Patrick Lehmann
527e94b245 v1.0.0 2024-01-19 01:18:27 +01:00
umarcor
f11c335674 v0.4.6 2023-02-26 17:46:26 +01:00
umarcor
5bed864443 v0.4.5 2022-11-08 03:26:27 +01:00
Unai Martinez-Corral
37ec436eb4 v0.4.4 2022-11-08 00:06:47 +00:00
umarcor
6a7a4212c3 v0.4.3 2022-03-02 23:51:16 +01:00
umarcor
f5b6f17d4e v0.4.2 2022-02-22 21:48:31 +01:00
umarcor
883238547a v0.4.1 2022-01-17 01:30:55 +01:00
umarcor
7cd852db58 v0.4.0 2022-01-09 20:58:38 +01:00
umarcor
ce0d30fe3f v0.3.0 2021-12-26 01:45:35 +01:00
umarcor
34dacf7bcf v0.2.3 2021-12-21 01:03:20 +01:00
umarcor
48090e113d v0.2.2 2021-12-20 20:47:43 +01:00
umarcor
e082d77e7a v0.2.1 2021-12-16 07:40:36 +01:00
umarcor
181035b0ba v0.2.0 2021-12-16 07:18:39 +01:00
Patrick Lehmann
643f95bbb6 v0.1.0 2021-12-07 20:50:03 +01:00
umarcor
424b75ca96 v0.0.1 2021-12-07 03:39:33 +01:00
umarcor
f0610331b9 v0.0.0 2021-12-01 00:03:09 +01:00
39 changed files with 701 additions and 143 deletions


@@ -1,16 +1,30 @@
 # New Features
 * tbd
 * tbd
 # Changes
 * tbd
 * tbd
 # Bug Fixes
 * tbd
 * tbd
+# Documentation
+* tbd
+* tbd
+# Unit Tests
+* tbd
+* tbd
 ----------
-# Related PRs:
+# Related Issues and Pull-Requests
 * tbd
 * tbd


@@ -124,26 +124,34 @@ jobs:
requirements = "${{ inputs.requirements }}" requirements = "${{ inputs.requirements }}"
if requirements.startswith("-r"): if requirements.startswith("-r"):
requirementsFile = Path(requirements[2:].lstrip()) requirementsFile = Path(requirements[2:].lstrip())
dependencies = loadRequirementsFile(requirementsFile) try:
dependencies = loadRequirementsFile(requirementsFile)
except FileNotFoundError as ex:
print(f"::error title=FileNotFoundError::{ex}")
exit(1)
else: else:
dependencies = [req.strip() for req in requirements.split(" ")] dependencies = [req.strip() for req in requirements.split(" ")]
packages = { packages = {
"coverage": "python-coverage:p", "coverage": "python-coverage:p",
"igraph": "igraph:p", "docstr_coverage": "python-pyyaml:p",
"jinja2": "python-markupsafe:p", "igraph": "igraph:p",
"lxml": "python-lxml:p", "jinja2": "python-markupsafe:p",
"numpy": "python-numpy:p", "lxml": "python-lxml:p",
"markupsafe": "python-markupsafe:p", "numpy": "python-numpy:p",
"pip": "python-pip:p", "markupsafe": "python-markupsafe:p",
"ruamel.yaml": "python-ruamel-yaml:p python-ruamel.yaml.clib:p", "pip": "python-pip:p",
"sphinx": "python-markupsafe:p", "pyyaml": "python-pyyaml:p",
"tomli": "python-tomli:p", "ruamel.yaml": "python-ruamel-yaml:p python-ruamel.yaml.clib:p",
"wheel": "python-wheel:p", "sphinx": "python-markupsafe:p",
"tomli": "python-tomli:p",
"wheel": "python-wheel:p",
"pyEDAA.ProjectModel": "python-ruamel-yaml:p python-ruamel.yaml.clib:p python-lxml:p",
"pyEDAA.Reports": "python-ruamel-yaml:p python-ruamel.yaml.clib:p python-lxml:p",
} }
subPackages = { subPackages = {
"pytooling": { "pytooling": {
"yaml": "python-ruamel-yaml:p python-ruamel.yaml.clib:p", "yaml": "python-ruamel-yaml:p python-ruamel.yaml.clib:p",
} }
} }
@@ -215,7 +223,7 @@ jobs:
 ls -l install
 python -m pip install --disable-pip-version-check -U install/*.whl
 - name: Run application tests (Ubuntu/macOS)
 if: matrix.system != 'windows'
 run: |
 export ENVIRONMENT_NAME="${{ matrix.envname }}"
@@ -230,7 +238,7 @@ jobs:
 python -m pytest -raP $PYTEST_ARGS --color=yes ${{ inputs.tests_directory || '.' }}/${{ inputs.apptest_directory }}
 fi
 - name: Run application tests (Windows)
 if: matrix.system == 'windows'
 run: |
 $env:ENVIRONMENT_NAME = "${{ matrix.envname }}"


@@ -36,23 +36,19 @@ on:
 type: string
 jobs:
 ArtifactCleanUp:
 name: 🗑️ Artifact Cleanup
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: 🗑️ Delete package Artifacts
 if: ${{ ! startsWith(github.ref, 'refs/tags') }}
 uses: geekyeggo/delete-artifact@v5
 with:
 name: ${{ inputs.package }}
-token: ${{ secrets.GITHUB_TOKEN }}
 - name: 🗑️ Delete remaining Artifacts
 if: ${{ inputs.remaining != '' }}
 uses: geekyeggo/delete-artifact@v5
 with:
 name: ${{ inputs.remaining }}
-token: ${{ secrets.GITHUB_TOKEN }}


@@ -34,7 +34,7 @@ on:
 jobs:
 BuildTheDocs:
 name: 📓 Run BuildTheDocs
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: ⏬ Checkout repository


@@ -33,16 +33,16 @@ on:
 description: 'Source code directory to check.'
 required: true
 type: string
-# fail_below:
-# description: 'Minimum required documentation coverage level'
-# required: false
-# default: 75
-# type: string
+fail_under:
+description: 'Minimum required documentation coverage level'
+required: false
+default: 80
+type: string
 jobs:
 DocCoverage:
 name: 👀 Check documentation coverage
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: ⏬ Checkout repository
 uses: actions/checkout@v4
@@ -59,9 +59,9 @@ jobs:
 - name: Run 'interrogate' Documentation Coverage Check
 continue-on-error: true
 run: |
-interrogate -c pyproject.toml
+interrogate -c pyproject.toml --fail-under=${{ inputs.fail_under }} && echo "::error title=interrogate::Insufficient documentation quality (goal: ${{ inputs.fail_under }})"
 - name: Run 'docstr_coverage' Documentation Coverage Check
 continue-on-error: true
 run: |
-docstr_coverage -v ${{ inputs.directory }}
+docstr-coverage -v 2 --fail-under=${{ inputs.fail_under }} ${{ inputs.directory }} && echo "::error title=docstr-coverage::Insufficient documentation quality (goal: ${{ inputs.fail_under }})"


@@ -63,7 +63,7 @@ jobs:
 Coverage:
 name: 📈 Collect Coverage Data using Python ${{ inputs.python_version }}
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: ⏬ Checkout repository
@@ -102,7 +102,9 @@ jobs:
   htmlDirectory = pyProjectSettings["tool"]["coverage"]["html"]["directory"]
   xmlFile = pyProjectSettings["tool"]["coverage"]["xml"]["output"]
 else:
-  print(f"File '{pyProjectFile}' not found and no '.coveragerc' file specified.")
+  print(f"File '{pyProjectFile}' not found.")
+  print(f"::error title=FileNotFoundError::File '{pyProjectFile}' not found.")
+  exit(1)
 # Read output paths from '.coveragerc' file
 elif len(coverageRC) > 0:
@@ -115,6 +117,8 @@ jobs:
   xmlFile = coverageRCSettings["xml"]["output"]
 else:
   print(f"File '{coverageRCFile}' not found.")
+  print(f"::error title=FileNotFoundError::File '{coverageRCFile}' not found.")
+  exit(1)
 # Write jobs to special file
 github_output = Path(getenv("GITHUB_OUTPUT"))


@@ -36,7 +36,7 @@ on:
 jobs:
 IntermediateCleanUp:
 name: 🗑️ Intermediate Artifact Cleanup
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: 🗑️ Delete SQLite coverage artifacts from matrix jobs
 uses: geekyeggo/delete-artifact@v5
@@ -44,7 +44,6 @@ jobs:
 continue-on-error: true
 with:
 name: ${{ inputs.sqlite_coverage_artifacts_prefix }}*
-token: ${{ secrets.GITHUB_TOKEN }}
 - name: 🗑️ Delete XML coverage artifacts from matrix jobs
 uses: geekyeggo/delete-artifact@v5
@@ -52,4 +51,3 @@ jobs:
 continue-on-error: true
 with:
 name: ${{ inputs.xml_unittest_artifacts_prefix }}*
-token: ${{ secrets.GITHUB_TOKEN }}


@@ -42,7 +42,7 @@ on:
 jobs:
 PDFDocumentation:
 name: 📓 Converting LaTeX Documentation to PDF
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: 📥 Download artifacts '${{ inputs.latex_artifact }}' from 'SphinxDocumentation' job
 uses: actions/download-artifact@v4


@@ -44,7 +44,7 @@ jobs:
 Package:
 name: 📦 Package in Source and Wheel Format
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: ⏬ Checkout repository


@@ -42,7 +42,7 @@ on:
 system_list:
 description: 'Space separated list of systems to run tests on.'
 required: false
-default: 'ubuntu windows macos mingw64 ucrt64'
+default: 'ubuntu windows macos-arm mingw64 ucrt64'
 type: string
 include_list:
 description: 'Space separated list of system:python items to be included into the list of test.'
@@ -59,6 +59,26 @@ on:
 required: false
 default: ''
 type: string
+ubuntu_image:
+description: 'The used GitHub Action image for Ubuntu based jobs.'
+required: false
+default: 'ubuntu-24.04'
+type: string
+windows_image:
+description: 'The used GitHub Action image for Windows based jobs.'
+required: false
+default: 'windows-latest'
+type: string
+macos_intel_image:
+description: 'The used GitHub Action image for macOS (Intel x86-64) based jobs.'
+required: false
+default: 'macos-latest-large'
+type: string
+macos_arm_image:
+description: 'The used GitHub Action image for macOS (ARM arm64) based jobs.'
+required: false
+default: 'macos-latest'
+type: string
 outputs:
 python_version:
@@ -76,7 +96,7 @@ on:
 jobs:
 Parameters:
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 outputs:
 python_version: ${{ steps.params.outputs.python_version }}
 python_jobs: ${{ steps.params.outputs.python_jobs }}
@@ -91,8 +111,8 @@ jobs:
 from json import dumps as json_dumps
 from os import getenv
 from pathlib import Path
-from pprint import pprint
 from textwrap import dedent
+from typing import Iterable
 name = "${{ inputs.name }}".strip()
 python_version = "${{ inputs.python_version }}".strip()
@@ -138,7 +158,7 @@ jobs:
 if currentAlphaVersion in versions:
   print(f"::notice title=Experimental::Python {currentAlphaVersion} ({currentAlphaRelease}) is a pre-release.")
 for disable in disabled:
-  print(f"::warning title=Disabled Python Job::System '{disable}' temporary disabled.")
+  print(f"::warning title=Disabled Python Job::System '{disable}' temporarily disabled.")
 # see https://raw.githubusercontent.com/actions/python-versions/main/versions-manifest.json
 data = {
@@ -158,9 +178,10 @@ jobs:
 },
 # Runner systems (runner images) supported by GitHub Actions
 "sys": {
-"ubuntu": { "icon": "🐧", "runs-on": "ubuntu-latest", "shell": "bash", "name": "Linux (x86-64)", "minPy": (3, 7)},
-"windows": { "icon": "🪟", "runs-on": "windows-latest", "shell": "pwsh", "name": "Windows (x86-64)", "minPy": (3, 7)},
-"macos": { "icon": "🍎", "runs-on": "macos-latest", "shell": "bash", "name": "MacOS (x86-64)", "minPy": (3, 10)},
+"ubuntu": { "icon": "🐧", "runs-on": "${{ inputs.ubuntu_image }}", "shell": "bash", "name": "Linux (x86-64)" },
+"windows": { "icon": "🪟", "runs-on": "${{ inputs.windows_image }}", "shell": "pwsh", "name": "Windows (x86-64)" },
+"macos": { "icon": "🍎", "runs-on": "${{ inputs.macos_intel_image }}", "shell": "bash", "name": "macOS (x86-64)" },
+"macos-arm": { "icon": "🍏", "runs-on": "${{ inputs.macos_arm_image }}", "shell": "bash", "name": "macOS (arm64)" },
 },
 # Runtimes provided by MSYS2
 "runtime": {
@@ -183,9 +204,23 @@ jobs:
 for disable in disabled:
   print(f"- {disable}")
-def toVersion(value):
-  major, minor = value.split(".")
-  return int(major[-1]), int(minor)
+def match(combination: str, pattern: str) -> bool:
+  system, version = combination.split(":")
+  sys, ver = pattern.split(":")
+  if sys == "*":
+    return (ver == "*") or (version == ver)
+  elif system == sys:
+    return (ver == "*") or (version == ver)
+  else:
+    return False
+def notIn(combination: str, patterns: Iterable[str]) -> bool:
+  for pattern in patterns:
+    if match(combination, pattern):
+      return False
+  return True
 combinations = [
   (system, version)
@@ -193,22 +228,20 @@ jobs:
   if system in data["sys"]
   for version in versions
   if version in data["python"]
-  and toVersion(version) >= data["sys"][system]["minPy"]
-  and f"{system}:{version}" not in excludes
-  and f"{system}:{version}" not in disabled
+  and notIn(f"{system}:{version}", excludes)
+  and notIn(f"{system}:{version}", disabled)
 ] + [
   (system, currentMSYS2Version)
   for system in systems
   if system in data["runtime"]
-  and f"{system}:{currentMSYS2Version}" not in excludes
-  and f"{system}:{currentMSYS2Version}" not in disabled
+  and notIn(f"{system}:{currentMSYS2Version}", excludes)
+  and notIn(f"{system}:{currentMSYS2Version}", disabled)
 ] + [
   (system, version)
   for system, version in includes
   if system in data["sys"]
   and version in data["python"]
-  and toVersion(version) >= data["sys"][system]["minPy"]
-  and f"{system}:{version}" not in disabled
+  and notIn(f"{system}:{version}", disabled)
 ]
 print(f"Combinations ({len(combinations)}):")
 for system, version in combinations:
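For orientation, the following minimal standalone sketch mirrors the `match`/`notIn` helpers added above and shows how `system:version` exclude patterns such as `macos:3.8` or `mingw64:*` are interpreted; the sample exclude list is illustrative only and not taken from this repository.

from typing import Iterable

def match(combination: str, pattern: str) -> bool:
    # A pattern is "<system>:<version>"; either part may be "*" as a wildcard.
    system, version = combination.split(":")
    sys, ver = pattern.split(":")
    if sys == "*":
        return (ver == "*") or (version == ver)
    elif system == sys:
        return (ver == "*") or (version == ver)
    else:
        return False

def notIn(combination: str, patterns: Iterable[str]) -> bool:
    # True if the combination is matched by none of the patterns.
    for pattern in patterns:
        if match(combination, pattern):
            return False
    return True

# Illustrative exclude list (not from the repository).
excludes = ["macos:3.8", "macos:3.9", "mingw64:*"]

print(notIn("macos:3.8", excludes))     # False - excluded explicitly
print(notIn("macos:3.12", excludes))    # True  - kept
print(notIn("mingw64:3.11", excludes))  # False - excluded by the wildcard pattern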


@@ -57,7 +57,7 @@ on:
 jobs:
 PublishCoverageResults:
 name: 📊 Publish Code Coverage Results
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 if: always()
 steps:
@@ -71,7 +71,7 @@ jobs:
 - name: 🔧 Install coverage and tomli
 run: |
-python -m pip install --disable-pip-version-check -U coverage[toml] tomli
+python -m pip install -U --disable-pip-version-check --break-system-packages coverage[toml] tomli
 - name: 🔁 Extract configurations from pyproject.toml
 id: getVariables
@@ -102,7 +102,9 @@ jobs:
   xmlFile = Path(pyProjectSettings["tool"]["coverage"]["xml"]["output"])
   jsonFile = Path(pyProjectSettings["tool"]["coverage"]["json"]["output"])
 else:
-  print(f"File '{pyProjectFile}' not found and no '.coveragerc' file specified.")
+  print(f"File '{pyProjectFile}' not found.")
+  print(f"::error title=FileNotFoundError::File '{pyProjectFile}' not found.")
+  exit(1)
 # Read output paths from '.coveragerc' file
 elif len(coverageRC) > 0:
@@ -116,6 +118,8 @@ jobs:
   jsonFile = Path(coverageRCSettings["json"]["output"])
 else:
   print(f"File '{coverageRCFile}' not found.")
+  print(f"::error title=FileNotFoundError::File '{coverageRCFile}' not found.")
+  exit(1)
 # Write jobs to special file
 github_output = Path(getenv("GITHUB_OUTPUT"))
@@ -131,8 +135,10 @@ jobs:
 - name: Rename .coverage files and collect them all to coverage/
 run: |
+ls -lAh artifacts/
+ls -lAh artifacts/*/.coverage
 mkdir -p coverage
-find . -type f -path "*artifacts*SQLite*.coverage" -exec sh -c 'cp -v $0 "coverage/$(basename $0).$(basename $(dirname $0))"' {} ';'
+find artifacts/ -type f -path "*SQLite*.coverage" -exec sh -c 'cp -v $0 "coverage/$(basename $0).$(basename $(dirname $0))"' {} ';'
 tree -a coverage
 - name: Combine SQLite files (using Coverage.py)


@@ -48,7 +48,7 @@ jobs:
 PublishOnPyPI:
 name: 🚀 Publish to PyPI
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: 📥 Download artifacts '${{ inputs.artifact }}' from 'Package' job


@@ -30,11 +30,21 @@ on:
 required: false
 default: ''
 type: string
+additional_merge_args:
+description: 'Additional merging arguments.'
+required: false
+default: '"--pytest=rewrite-dunder-init;reduce-depth:pytest.tests.unit"'
+type: string
+report_title:
+description: 'Title of the summary report in the pipeline''s sidebar'
+required: false
+default: 'Unit Test Results'
+type: string
 jobs:
 PublishTestResults:
 name: 📊 Publish Test Results
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 if: always()
 steps:
@@ -46,37 +56,28 @@ jobs:
 with:
 path: artifacts
-- name: 🔧 Install junitparser
+- name: 🔧 Install pyEDAA.Reports (JUunit Parser and Merger)
 run: |
-python -m pip install --disable-pip-version-check -U junitparser
+python -m pip install --disable-pip-version-check --break-system-packages -U pyEDAA.Reports
 - name: Move JUnit files and collect them all to junit/
 run: |
 mkdir -p junit
-find . -type f -path "*artifacts*UnitTestReportSummary*.xml" -exec sh -c 'cp -v $0 "junit/$(basename $(dirname $0)).$(basename $0)"' {} ';'
+ls -lAh artifacts/*/*.xml
+find artifacts/ -type f -path "*TestReportSummary*.xml" -exec sh -c 'cp -v $0 "junit/$(basename $(dirname $0)).$(basename $0)"' {} ';'
 tree -a junit
 - name: 🔁 Merge JUnit Unit Test Summaries
-shell: python
 run: |
-from pathlib import Path
-from junitparser import JUnitXml
-junitDirectory = Path("junit")
-junitXml = None
-for file in junitDirectory.iterdir():
-  if junitXml is None:
-    junitXml = JUnitXml.fromfile(file)
-  else:
-    junitXml += JUnitXml.fromfile(file)
-junitXml.write(junitDirectory / "merged.xml")
+pyedaa-reports -v unittest "--merge=pytest-junit:junit/*.xml" ${{ inputs.additional_merge_args }} "--output=ant-junit:Unittesting.xml"
+echo "cat Unittesting.xml"
+cat Unittesting.xml
 - name: 📊 Publish Unit Test Results
 uses: dorny/test-reporter@v1
 with:
-name: Unit Test Results
-path: junit/merged.xml
+name: ${{ inputs.report_title }}
+path: Unittesting.xml
 reporter: java-junit
 - name: 📤 Upload merged 'JUnit Test Summary' artifact
@@ -84,6 +85,6 @@ jobs:
 uses: actions/upload-artifact@v4
 with:
 name: ${{ inputs.merged_junit_artifact }}
-path: junit/merged.xml
+path: Unittesting.xml
 if-no-files-found: error
 retention-days: 1


@@ -44,7 +44,7 @@ jobs:
 PublishToGitHubPages:
 name: 📚 Publish to GH-Pages
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: ⏬ Checkout repository


@@ -29,7 +29,7 @@ jobs:
 Release:
 name: 📝 Create 'Release Page' on GitHub
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: 🔁 Extract Git tag from GITHUB_REF
@@ -55,12 +55,34 @@ jobs:
 **Automated Release created on: ${{ steps.getVariables.outputs.datetime }}**
 # New Features
+* tbd
 * tbd
 # Changes
+* tbd
 * tbd
 # Bug Fixes
 * tbd
+* tbd
+# Documentation
+* tbd
+* tbd
+# Unit Tests
+* tbd
+* tbd
+----------
+# Related Issues and Pull-Requests
+* tbd
+* tbd
-draft: false
+draft: true
 prerelease: false


@@ -73,7 +73,7 @@ on:
 jobs:
 Sphinx:
 name: 📓 Documentation generation using Sphinx and Python ${{ inputs.python_version }}
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: ⏬ Checkout repository
@@ -121,7 +121,9 @@ jobs:
   xmlFile = Path(pyProjectSettings["tool"]["coverage"]["xml"]["output"])
   jsonFile = Path(pyProjectSettings["tool"]["coverage"]["json"]["output"])
 else:
-  print(f"File '{pyProjectFile}' not found and no '.coveragerc' file specified.")
+  print(f"File '{pyProjectFile}' not found.")
+  print(f"::error title=FileNotFoundError::File '{pyProjectFile}' not found.")
+  exit(1)
 # Read output paths from '.coveragerc' file
 elif len(coverageRC) > 0:
@@ -135,6 +137,8 @@ jobs:
   jsonFile = Path(coverageRCSettings["json"]["output"])
 else:
   print(f"File '{coverageRCFile}' not found.")
+  print(f"::error title=FileNotFoundError::File '{coverageRCFile}' not found.")
+  exit(1)
 # Write jobs to special file
 github_output = Path(getenv("GITHUB_OUTPUT"))


@@ -63,7 +63,7 @@ jobs:
 StaticTypeCheck:
 name: 👀 Check Static Typing using Python ${{ inputs.python_version }}
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: ⏬ Checkout repository


@@ -41,7 +41,7 @@ jobs:
 Image:
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 env:
 DOCKER_BUILDKIT: 1
 steps:
@@ -60,7 +60,7 @@ jobs:
 Composite:
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - uses: actions/checkout@v4
@@ -120,7 +120,7 @@ jobs:
 needs:
 - Image
 - Composite
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - uses: actions/checkout@v4


@@ -29,21 +29,56 @@ on:
 description: 'JSON list with environment fields, telling the system and Python versions to run tests with.'
 required: true
 type: string
+apt:
+description: 'Ubuntu dependencies to be installed through apt.'
+required: false
+default: ''
+type: string
+brew:
+description: 'macOS dependencies to be installed through brew.'
+required: false
+default: ''
+type: string
+pacboy:
+description: 'MSYS2 dependencies to be installed through pacboy (pacman).'
+required: false
+default: ''
+type: string
 requirements:
 description: 'Python dependencies to be installed through pip.'
 required: false
 default: '-r tests/requirements.txt'
 type: string
-pacboy:
-description: 'MSYS2 dependencies to be installed through pacboy (pacman).'
-required: false
-default: ""
-type: string
 mingw_requirements:
 description: 'Override Python dependencies to be installed through pip on MSYS2 (MINGW64) only.'
 required: false
 default: ''
 type: string
+macos_before_script:
+description: 'Scripts to execute before pytest on macOS (Intel).'
+required: false
+default: ''
+type: string
+macos_arm_before_script:
+description: 'Scripts to execute before pytest on macOS (ARM).'
+required: false
+default: ''
+type: string
+ubuntu_before_script:
+description: 'Scripts to execute before pytest on Ubuntu.'
+required: false
+default: ''
+type: string
+mingw64_before_script:
+description: 'Scripts to execute before pytest on Windows within MSYS2 MinGW64.'
+required: false
+default: ''
+type: string
+ucrt64_before_script:
+description: 'Scripts to execute before pytest on Windows within MSYS2 UCRT64.'
+required: false
+default: ''
+type: string
 root_directory:
 description: 'Working directory for running tests.'
 required: false
@@ -113,6 +148,19 @@ jobs:
 - name: ⏬ Checkout repository
 uses: actions/checkout@v4
+# Package Manager steps
+- name: 🔧 Install homebrew dependencies on macOS
+if: ( matrix.system == 'macos' || matrix.system == 'macos-arm' ) && inputs.brew != ''
+run: brew install ${{ inputs.brew }}
+- name: 🔧 Install apt dependencies on Ubuntu
+if: matrix.system == 'ubuntu' && inputs.apt != ''
+run: |
+sudo apt-get update
+sudo apt-get install -y --no-install-recommends ${{ inputs.apt }}
+# Compute Dependencies for MSYS2 steps
 - name: 🔧 Install dependencies (system Python for Python shell)
 if: matrix.system == 'msys2'
 shell: pwsh
@@ -149,28 +197,34 @@ jobs:
requirements = "${{ inputs.requirements }}" requirements = "${{ inputs.requirements }}"
if requirements.startswith("-r"): if requirements.startswith("-r"):
requirementsFile = Path(requirements[2:].lstrip()) requirementsFile = Path(requirements[2:].lstrip())
dependencies = loadRequirementsFile(requirementsFile) try:
dependencies = loadRequirementsFile(requirementsFile)
except FileNotFoundError as ex:
print(f"::error title=FileNotFoundError::{ex}")
exit(1)
else: else:
dependencies = [req.strip() for req in requirements.split(" ")] dependencies = [req.strip() for req in requirements.split(" ")]
packages = { packages = {
"coverage": "python-coverage:p", "coverage": "python-coverage:p",
"igraph": "igraph:p", "docstr_coverage": "python-pyyaml:p",
"jinja2": "python-markupsafe:p", "igraph": "igraph:p",
"lxml": "python-lxml:p", "jinja2": "python-markupsafe:p",
"numpy": "python-numpy:p", "lxml": "python-lxml:p",
"markupsafe": "python-markupsafe:p", "numpy": "python-numpy:p",
"pip": "python-pip:p", "markupsafe": "python-markupsafe:p",
"ruamel.yaml": "python-ruamel-yaml:p python-ruamel.yaml.clib:p", "pip": "python-pip:p",
"sphinx": "python-markupsafe:p", "pyyaml": "python-pyyaml:p",
"tomli": "python-tomli:p", "ruamel.yaml": "python-ruamel-yaml:p python-ruamel.yaml.clib:p",
"wheel": "python-wheel:p", "sphinx": "python-markupsafe:p",
"tomli": "python-tomli:p",
"wheel": "python-wheel:p",
"pyEDAA.ProjectModel": "python-ruamel-yaml:p python-ruamel.yaml.clib:p python-lxml:p", "pyEDAA.ProjectModel": "python-ruamel-yaml:p python-ruamel.yaml.clib:p python-lxml:p",
"pyEDAA.Reports": "python-ruamel-yaml:p python-ruamel.yaml.clib:p python-lxml:p", "pyEDAA.Reports": "python-ruamel-yaml:p python-ruamel.yaml.clib:p python-lxml:p",
} }
subPackages = { subPackages = {
"pytooling": { "pytooling": {
"yaml": "python-ruamel-yaml:p python-ruamel.yaml.clib:p", "yaml": "python-ruamel-yaml:p python-ruamel.yaml.clib:p",
}, },
} }
@@ -206,6 +260,8 @@ jobs:
 with github_output.open("a+") as f:
   f.write(f"pacboy_packages={' '.join(pacboyPackages)}\n")
+# Python setup
 - name: '🟦 Setup MSYS2 for ${{ matrix.runtime }}'
 if: matrix.system == 'msys2'
 uses: msys2/setup-msys2@v2
@@ -222,6 +278,8 @@ jobs:
 with:
 python-version: ${{ matrix.python }}
+# Python Dependency steps
 - name: 🔧 Install wheel,tomli and pip dependencies (native)
 if: matrix.system != 'msys2'
 run: |
@@ -237,6 +295,32 @@ jobs:
 python -m pip install --disable-pip-version-check ${{ inputs.requirements }}
 fi
+# Before scripts
+- name: 🍎 macOS (Intel) before scripts
+if: matrix.system == 'macos' && inputs.macos_before_script != ''
+run: ${{ inputs.macos_before_script }}
+- name: 🍏 macOS (ARM) before scripts
+if: matrix.system == 'macos-arm' && inputs.macos_arm_before_script != ''
+run: ${{ inputs.macos_arm_before_script }}
+- name: 🐧 Ubuntu before scripts
+if: matrix.system == 'ubuntu' && inputs.ubuntu_before_script != ''
+run: ${{ inputs.ubuntu_before_script }}
+# Windows before script
+- name: 🪟🟦 MinGW64 before scripts
+if: matrix.system == 'msys2' && matrix.runtime == 'MINGW64' && inputs.mingw64_before_script != ''
+run: ${{ inputs.mingw64_before_script }}
+- name: 🪟🟨 UCRT64 before scripts
+if: matrix.system == 'msys2' && matrix.runtime == 'UCRT64' && inputs.ucrt64_before_script != ''
+run: ${{ inputs.ucrt64_before_script }}
+# Read pyproject.toml
 - name: 🔁 Extract configurations from pyproject.toml
 id: getVariables
 shell: python
@@ -266,7 +350,9 @@ jobs:
   xmlFile = Path(pyProjectSettings["tool"]["coverage"]["xml"]["output"])
   jsonFile = Path(pyProjectSettings["tool"]["coverage"]["json"]["output"])
 else:
-  print(f"File '{pyProjectFile}' not found and no '.coveragerc' file specified.")
+  print(f"File '{pyProjectFile}' not found.")
+  print(f"::error title=FileNotFoundError::File '{pyProjectFile}' not found.")
+  exit(1)
 # Read output paths from '.coveragerc' file
 elif len(coverageRC) > 0:
@@ -280,6 +366,8 @@ jobs:
   jsonFile = Path(coverageRCSettings["json"]["output"])
 else:
   print(f"File '{coverageRCFile}' not found.")
+  print(f"::error title=FileNotFoundError::File '{coverageRCFile}' not found.")
+  exit(1)
 # Write jobs to special file
 github_output = Path(getenv("GITHUB_OUTPUT"))
@@ -294,7 +382,9 @@ jobs:
print(f"DEBUG:\n html={htmlDirectory}\n xml={xmlFile}\n json={jsonFile}") print(f"DEBUG:\n html={htmlDirectory}\n xml={xmlFile}\n json={jsonFile}")
- name: ☑ Run unit tests (Ubuntu/macOS) # Run pytests
- name: ✅ Run unit tests (Ubuntu/macOS)
if: matrix.system != 'windows' if: matrix.system != 'windows'
run: | run: |
export ENVIRONMENT_NAME="${{ matrix.envname }}" export ENVIRONMENT_NAME="${{ matrix.envname }}"
@@ -310,7 +400,7 @@ jobs:
 python -m pytest -raP $PYTEST_ARGS --color=yes ${{ inputs.tests_directory || '.' }}/${{ inputs.unittest_directory }}
 fi
 - name: Run unit tests (Windows)
 if: matrix.system == 'windows'
 run: |
 $env:ENVIRONMENT_NAME = "${{ matrix.envname }}"
@@ -328,20 +418,26 @@ jobs:
 - name: Convert coverage to XML format (Cobertura)
 if: inputs.coverage_xml_artifact != ''
+continue-on-error: true
 run: coverage xml --data-file=.coverage
 - name: Convert coverage to JSON format
 if: inputs.coverage_json_artifact != ''
+continue-on-error: true
 run: coverage json --data-file=.coverage
 - name: Convert coverage to HTML format
 if: inputs.coverage_html_artifact != ''
+continue-on-error: true
 run: |
 coverage html --data-file=.coverage -d ${{ steps.getVariables.outputs.coverage_report_html_directory }}
 rm ${{ steps.getVariables.outputs.coverage_report_html_directory }}/.gitignore
+# Upload artifacts
 - name: 📤 Upload 'TestReportSummary.xml' artifact
 if: inputs.unittest_xml_artifact != ''
+continue-on-error: true
 uses: actions/upload-artifact@v4
 with:
 name: ${{ inputs.unittest_xml_artifact }}-${{ matrix.system }}-${{ matrix.runtime }}-${{ matrix.python }}
@@ -366,6 +462,7 @@ jobs:
 with:
 name: ${{ inputs.coverage_sqlite_artifact }}-${{ matrix.system }}-${{ matrix.runtime }}-${{ matrix.python }}
 path: .coverage
+include-hidden-files: true
 if-no-files-found: error
 retention-days: 1


@@ -35,7 +35,7 @@ jobs:
 VerifyDocs:
 name: 👍 Verify example snippets using Python ${{ inputs.python_version }}
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: ⏬ Checkout repository
@@ -72,7 +72,7 @@ jobs:
 - name: Print example.py
 run: cat tests/docs/example.py
 - name: Run example snippet
 working-directory: tests/docs
 run: |
 python3 example.py


@@ -36,7 +36,7 @@ jobs:
 name: Package generation
 needs:
 - Params
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: Package creation
 run: echo "Package" >> package.txt


@@ -64,14 +64,14 @@ jobs:
 - Params_Exclude
 - Params_Disable
 - Params_All
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 defaults:
 run:
 shell: python
 steps:
 - name: Install dependencies
 shell: bash
-run: pip install pyTooling
+run: pip install --disable-pip-version-check --break-system-packages pyTooling
 # Params_Default
 - name: Checking results from 'Params_Default'
 run: |
@@ -84,6 +84,8 @@ jobs:
 expectedPythons = ["3.8", "3.9", "3.10", "3.11", "3.12"]
 expectedSystems = ["ubuntu", "windows", "macos"]
 expectedJobs = [f"{system}:{python}" for system in expectedSystems for python in expectedPythons] + ["mingw64:3.11", "ucrt64:3.11"]
+expectedJobs.remove("macos:3.8")
+expectedJobs.remove("macos:3.9")
 expectedName = "Example"
 expectedArtifacts = {
   "unittesting_xml": f"{expectedName}-UnitTestReportSummary-XML",
@@ -112,8 +114,12 @@ jobs:
   errors += 1
 if len(actualPythonJobs) != len(expectedJobs):
   print(f"Number of 'python_jobs' does not match: {len(actualPythonJobs)} != {len(expectedJobs)}.")
+  print("Actual jobs:")
   for job in actualPythonJobs:
     print(f" {job['system']}:{job['python']}")
+  print("Expected jobs:")
+  for job in expectedJobs:
+    print(f" {job}")
   errors += 1
 if len(actualArtifactNames) != len(expectedArtifacts):
   print(f"Number of 'artifact_names' does not match: {len(actualArtifactNames)} != {len(expectedArtifacts)}.")
@@ -140,6 +146,9 @@ jobs:
 expectedPythons = ["3.9", "3.10", "pypy-3.8", "pypy-3.9"]
 expectedSystems = ["ubuntu", "windows", "macos"]
 expectedJobs = [f"{system}:{python}" for system in expectedSystems for python in expectedPythons] + ["mingw64:3.11", "ucrt64:3.11"]
+expectedJobs.remove("macos:3.9")
+expectedJobs.remove("macos:pypy-3.8")
+expectedJobs.remove("macos:pypy-3.9")
 expectedName = "Example"
 expectedArtifacts = {
   "unittesting_xml": f"{expectedName}-UnitTestReportSummary-XML",
@@ -168,8 +177,12 @@ jobs:
   errors += 1
 if len(actualPythonJobs) != len(expectedJobs):
   print(f"Number of 'python_jobs' does not match: {len(actualPythonJobs)} != {len(expectedJobs)}.")
+  print("Actual jobs:")
   for job in actualPythonJobs:
     print(f" {job['system']}:{job['python']}")
+  print("Expected jobs:")
+  for job in expectedJobs:
+    print(f" {job}")
   errors += 1
 if len(actualArtifactNames) != len(expectedArtifacts):
   print(f"Number of 'artifact_names' does not match: {len(actualArtifactNames)} != {len(expectedArtifacts)}.")
@@ -224,8 +237,12 @@ jobs:
   errors += 1
 if len(actualPythonJobs) != len(expectedJobs):
   print(f"Number of 'python_jobs' does not match: {len(actualPythonJobs)} != {len(expectedJobs)}.")
+  print("Actual jobs:")
   for job in actualPythonJobs:
     print(f" {job['system']}:{job['python']}")
+  print("Expected jobs:")
+  for job in expectedJobs:
+    print(f" {job}")
   errors += 1
 if len(actualArtifactNames) != len(expectedArtifacts):
   print(f"Number of 'artifact_names' does not match: {len(actualArtifactNames)} != {len(expectedArtifacts)}.")
@@ -280,8 +297,12 @@ jobs:
   errors += 1
 if len(actualPythonJobs) != len(expectedJobs):
   print(f"Number of 'python_jobs' does not match: {len(actualPythonJobs)} != {len(expectedJobs)}.")
+  print("Actual jobs:")
   for job in actualPythonJobs:
     print(f" {job['system']}:{job['python']}")
+  print("Expected jobs:")
+  for job in expectedJobs:
+    print(f" {job}")
   errors += 1
 if len(actualArtifactNames) != len(expectedArtifacts):
   print(f"Number of 'artifact_names' does not match: {len(actualArtifactNames)} != {len(expectedArtifacts)}.")
@@ -336,8 +357,12 @@ jobs:
   errors += 1
 if len(actualPythonJobs) != len(expectedJobs):
   print(f"Number of 'python_jobs' does not match: {len(actualPythonJobs)} != {len(expectedJobs)}.")
+  print("Actual jobs:")
   for job in actualPythonJobs:
     print(f" {job['system']}:{job['python']}")
+  print("Expected jobs:")
+  for job in expectedJobs:
+    print(f" {job}")
   errors += 1
 if len(actualArtifactNames) != len(expectedArtifacts):
   print(f"Number of 'artifact_names' does not match: {len(actualArtifactNames)} != {len(expectedArtifacts)}.")
@@ -392,8 +417,12 @@ jobs:
   errors += 1
 if len(actualPythonJobs) != len(expectedJobs):
   print(f"Number of 'python_jobs' does not match: {len(actualPythonJobs)} != {len(expectedJobs)}.")
+  print("Actual jobs:")
   for job in actualPythonJobs:
     print(f" {job['system']}:{job['python']}")
+  print("Expected jobs:")
+  for job in expectedJobs:
+    print(f" {job}")
   errors += 1
 if len(actualArtifactNames) != len(expectedArtifacts):
   print(f"Number of 'artifact_names' does not match: {len(actualArtifactNames)} != {len(expectedArtifacts)}.")
@@ -448,8 +477,12 @@ jobs:
   errors += 1
 if len(actualPythonJobs) != len(expectedJobs):
   print(f"Number of 'python_jobs' does not match: {len(actualPythonJobs)} != {len(expectedJobs)}.")
+  print("Actual jobs:")
   for job in actualPythonJobs:
     print(f" {job['system']}:{job['python']}")
+  print("Expected jobs:")
+  for job in expectedJobs:
+    print(f" {job}")
   errors += 1
 if len(actualArtifactNames) != len(expectedArtifacts):
   print(f"Number of 'artifact_names' does not match: {len(actualArtifactNames)} != {len(expectedArtifacts)}.")


@@ -84,10 +84,12 @@ jobs:
 codacy_token: ${{ secrets.CODACY_PROJECT_TOKEN }}
 PublishTestResults:
-uses: pyTooling/Actions/.github/workflows/PublishTestResults.yml@r1
+uses: pyTooling/Actions/.github/workflows/PublishTestResults.yml@dev
 needs:
 - UnitTesting
 - PlatformTesting
+with:
+additional_merge_args: '-d "--pytest=rewrite-dunder-init;reduce-depth:pytest.tests.unit;reduce-depth:pytest.tests.platform"'
 Package:
 uses: pyTooling/Actions/.github/workflows/Package.yml@r1

.gitignore (vendored)

@@ -13,6 +13,7 @@ coverage.xml
 # pytest
 /report/unit
+/tests/*.github
 # setuptools
 /build/**/*.*

dist/requirements.txt (vendored, new file)

@@ -0,0 +1,2 @@
wheel ~= 0.44
twine ~= 5.1


@@ -81,7 +81,7 @@ The following block shows a minimal YAML workflow file:
 jobs:
 mwe:
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 # Clone repository
@@ -171,7 +171,7 @@ For prototyping purposes, the following job might be useful:
 Release:
 name: '📦 Release'
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 needs:
 - ...
 if: github.event_name != 'pull_request' && (github.ref == 'refs/heads/master' || contains(github.ref, 'refs/tags/'))


@@ -76,7 +76,7 @@ Documentation Only (Sphinx)
 needs:
 - BuildTheDocs
 - PublishToGitHubPages
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 - name: 🗑️ Delete artifacts


@@ -12,13 +12,36 @@ This job creates a Release Page on GitHub.
 **Automated Release created on: ${{ steps.getVariables.outputs.datetime }}**
 # New Features
+* tbd
 * tbd
 # Changes
+* tbd
 * tbd
 # Bug Fixes
 * tbd
+* tbd
+# Documentation
+* tbd
+* tbd
+# Unit Tests
+* tbd
+* tbd
+----------
+# Related Issues and Pull-Requests
+* tbd
+* tbd
 **Behavior:**


@@ -60,10 +60,10 @@ pygments_style = "manni"
 # ==============================================================================
 # Restructured Text settings
 # ==============================================================================
-prologPath = "prolog.inc"
+prologPath = Path("prolog.inc")
 try:
-  with open(prologPath, "r") as prologFile:
-    rst_prolog = prologFile.read()
+  with prologPath.open("r", encoding="utf-8") as fileHandle:
+    rst_prolog = fileHandle.read()
 except Exception as ex:
   print(f"[ERROR:] While reading '{prologPath}'.")
   print(ex)


@@ -100,6 +100,9 @@ References
 - `hdl/containers#48 <https://github.com/hdl/containers/issues/48>`__
+.. _CONTRIBUTORS:
 Contributors
 ************
@@ -108,6 +111,8 @@ Contributors
 * `and more... <https://GitHub.com/pyTooling/Actions/graphs/contributors>`__
+.. _LICENSE:
 License
 *******


@@ -1,10 +1,10 @@
 -r ../requirements.txt
-pyTooling ~= 6.1
+pyTooling ~= 6.6
 # Enforce latest version on ReadTheDocs
-sphinx ~= 7.2
-docutils ~= 0.18.0
+sphinx ~= 7.4
+docutils ~= 0.20
 # Sphinx Extenstions
 #sphinx.ext.coverage
@@ -16,5 +16,5 @@ sphinxcontrib-mermaid>=0.9.2
 autoapi >= 2.0.1
 sphinx_fontawesome >= 0.0.6
 sphinx-inline-tabs >= 2023.4.21
-sphinx_autodoc_typehints >= 1.24.0
+sphinx_autodoc_typehints ~= 2.3
 # changelog>=0.3.5


@@ -1,8 +1,8 @@
 [build-system]
 requires = [
-  "setuptools >= 69.0.0",
-  "wheel >= 0.40.0",
-  "pyTooling ~= 6.1"
+  "setuptools ~= 75.1",
+  "wheel ~= 0.44",
+  "pyTooling ~= 6.6"
 ]
 build-backend = "setuptools.build_meta"


@@ -75,7 +75,7 @@ on:
 jobs:
 mwe:
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 steps:
 # Clone repository
@@ -156,7 +156,7 @@ For prototyping purposes, the following job might be useful:
 ```yml
 Release:
 name: '📦 Release'
-runs-on: ubuntu-latest
+runs-on: ubuntu-24.04
 needs:
 - ...
 if: github.event_name != 'pull_request' && (github.ref == 'refs/heads/master' || contains(github.ref, 'refs/tags/'))


@@ -45,7 +45,9 @@ runs:
 steps:
   - shell: bash
-    run: pip install --disable-pip-version-check PyGithub --progress-bar off
+    run: |
+      [ "$(source /etc/os-release && echo $VERSION_ID)" == "24.04" ] && UBUNTU_2404_ARGS='--break-system-packages' || unset UBUNTU_2404_ARGS
+      pip install --disable-pip-version-check --progress-bar off $UBUNTU_2404_ARGS PyGithub
   - shell: bash
     run: '''${{ github.action_path }}/../releaser.py'''


@@ -1 +1 @@
-pyTooling ~= 6.1
+pyTooling ~= 6.6

tests/pacman_packages.py (new file, 91 lines)

@@ -0,0 +1,91 @@
from os import getenv
from pathlib import Path
from re import compile
from sys import version

print(f"Python: {version}")


def loadRequirementsFile(requirementsFile: Path):
    requirements = []
    with requirementsFile.open("r") as file:
        for line in file.readlines():
            line = line.strip()
            if line.startswith("#") or line.startswith("https") or line == "":
                continue
            elif line.startswith("-r"):
                # Remove the first word/argument (-r)
                requirements += loadRequirementsFile(requirementsFile.parent / line[2:].lstrip())
            else:
                requirements.append(line)

    return requirements


requirements = "-r ../tests/requirements.txt"
if requirements.startswith("-r"):
    requirementsFile = Path(requirements[2:].lstrip())
    try:
        dependencies = loadRequirementsFile(requirementsFile)
    except FileNotFoundError as ex:
        print(f"::error title=FileNotFound::{ex}")
        exit(1)
else:
    dependencies = [req.strip() for req in requirements.split(" ")]

packages = {
    "coverage": "python-coverage:p",
    "igraph": "igraph:p",
    "jinja2": "python-markupsafe:p",
    "lxml": "python-lxml:p",
    "numpy": "python-numpy:p",
    "markupsafe": "python-markupsafe:p",
    "pip": "python-pip:p",
    "ruamel.yaml": "python-ruamel-yaml:p python-ruamel.yaml.clib:p",
    "sphinx": "python-markupsafe:p",
    "tomli": "python-tomli:p",
    "wheel": "python-wheel:p",
    "pyEDAA.ProjectModel": "python-ruamel-yaml:p python-ruamel.yaml.clib:p python-lxml:p",
    "pyEDAA.Reports": "python-ruamel-yaml:p python-ruamel.yaml.clib:p python-lxml:p",
}
subPackages = {
    "pytooling": {
        "yaml": "python-ruamel-yaml:p python-ruamel.yaml.clib:p",
    },
}

regExp = compile(
    r"(?P<PackageName>[\w_\-\.]+)(?:\[(?P<SubPackages>(?:\w+)(?:\s*,\s*\w+)*)\])?(?:\s*(?P<Comperator>[<>~=]+)\s*)(?P<Version>\d+(?:\.\d+)*)(?:-(?P<VersionExtension>\w+))?")

pacboyPackages = set(("python-pip:p", "python-wheel:p", "python-tomli:p"))
print(f"Processing dependencies ({len(dependencies)}):")
for dependency in dependencies:
    print(f" {dependency}")

    match = regExp.match(dependency.lower())
    if not match:
        print(f" Wrong format: {dependency}")
        print(f"::error title=Identifying Pacboy Packages::Unrecognized dependency format '{dependency}'")
        continue

    package = match["PackageName"]
    if package in packages:
        rewrite = packages[package]
        print(f" Found rewrite rule for '{package}': {rewrite}")
        pacboyPackages.add(rewrite)

    if match["SubPackages"] and package in subPackages:
        for subPackage in match["SubPackages"].split(","):
            if subPackage in subPackages[package]:
                rewrite = subPackages[package][subPackage]
                print(f" Found rewrite rule for '{package}[..., {subPackage}, ...]': {rewrite}")
                pacboyPackages.add(rewrite)

# Write jobs to special file
github_output = Path(getenv("GITHUB_OUTPUT"))
print(f"GITHUB_OUTPUT: {github_output}")
with github_output.open("a+") as f:
    f.write(f"pacboy_packages={' '.join(pacboyPackages)}\n")

print(f"GITHUB_OUTPUT:")
print(f"pacboy_packages={' '.join(pacboyPackages)}\n")


@@ -28,12 +28,12 @@
 # SPDX-License-Identifier: Apache-2.0 #
 # ==================================================================================================================== #
 #
 from unittest import TestCase
 from pytest import mark
-from pyTooling.Common import CurrentPlatform
+from pyTooling.Platform import CurrentPlatform
 from pyDummy import Application
 if __name__ == "__main__":  # pragma: no cover

tests/python_jobs.py (new file, 216 lines)

@@ -0,0 +1,216 @@
from json import dumps as json_dumps
from os import getenv
from pathlib import Path
from textwrap import dedent
from typing import Iterable

name = "example".strip()
python_version = "3.12".strip()
systems = "ubuntu windows macos-arm mingw64 ucrt64".strip()
versions = "3.8 3.9 3.10 3.11 3.12".strip()
include_list = "".strip()
exclude_list = "".strip()
disable_list = "".strip()

currentMSYS2Version = "3.11"
currentAlphaVersion = "3.13"
currentAlphaRelease = "3.13.0-alpha.1"

if systems == "":
    print("::error title=Parameter::system_list is empty.")
else:
    systems = [sys.strip() for sys in systems.split(" ")]

if versions == "":
    versions = [python_version]
else:
    versions = [ver.strip() for ver in versions.split(" ")]

if include_list == "":
    includes = []
else:
    includes = [tuple(include.strip().split(":")) for include in include_list.split(" ")]

if exclude_list == "":
    excludes = []
else:
    excludes = [exclude.strip() for exclude in exclude_list.split(" ")]

if disable_list == "":
    disabled = []
else:
    disabled = [disable.strip() for disable in disable_list.split(" ")]

if "3.7" in versions:
    print("::warning title=Deprecated::Support for Python 3.7 ended in 2023.06.27.")
if "msys2" in systems:
    print("::warning title=Deprecated::System 'msys2' will be replaced by 'mingw64'.")
if currentAlphaVersion in versions:
    print(f"::notice title=Experimental::Python {currentAlphaVersion} ({currentAlphaRelease}) is a pre-release.")
for disable in disabled:
    print(f"::warning title=Disabled Python Job::System '{disable}' temporarily disabled.")

# see https://raw.githubusercontent.com/actions/python-versions/main/versions-manifest.json
data = {
    # Python and PyPy versions supported by "setup-python" action
    "python": {
        "3.7": {"icon": "", "until": "2023.06.27"},
        "3.8": {"icon": "🔴", "until": "2024.10"},
        "3.9": {"icon": "🟠", "until": "2025.10"},
        "3.10": {"icon": "🟡", "until": "2026.10"},
        "3.11": {"icon": "🟢", "until": "2027.10"},
        "3.12": {"icon": "🟢", "until": "2028.10"},
        # "3.13": { "icon": "🟣", "until": "2028.10" },
        "pypy-3.7": {"icon": "⟲⚫", "until": "????.??"},
        "pypy-3.8": {"icon": "⟲🔴", "until": "????.??"},
        "pypy-3.9": {"icon": "⟲🟠", "until": "????.??"},
        "pypy-3.10": {"icon": "⟲🟡", "until": "????.??"},
    },
    # Runner systems (runner images) supported by GitHub Actions
    "sys": {
        "ubuntu": {"icon": "🐧", "runs-on": "ubuntu-24.04", "shell": "bash", "name": "Linux (x86-64)"},
        "windows": {"icon": "🪟", "runs-on": "windows-latest", "shell": "pwsh", "name": "Windows (x86-64)"},
        "macos": {"icon": "🍎", "runs-on": "macos-latest-large", "shell": "bash", "name": "macOS (x86-64)"},
        "macos-arm": {"icon": "🍏", "runs-on": "macos-latest", "shell": "bash", "name": "macOS (arm64)"},
    },
    # Runtimes provided by MSYS2
    "runtime": {
        "msys": {"icon": "🪟🟪", "name": "Windows+MSYS2 (x86-64) - MSYS"},
        "mingw32": {"icon": "🪟⬛", "name": "Windows+MSYS2 (x86-64) - MinGW32"},
        "mingw64": {"icon": "🪟🟦", "name": "Windows+MSYS2 (x86-64) - MinGW64"},
        "clang32": {"icon": "🪟🟫", "name": "Windows+MSYS2 (x86-64) - Clang32"},
        "clang64": {"icon": "🪟🟧", "name": "Windows+MSYS2 (x86-64) - Clang64"},
        "ucrt64": {"icon": "🪟🟨", "name": "Windows+MSYS2 (x86-64) - UCRT64"},
    }
}

print(f"includes ({len(includes)}):")
for system, version in includes:
    print(f"- {system}:{version}")
print(f"excludes ({len(excludes)}):")
for exclude in excludes:
    print(f"- {exclude}")
print(f"disabled ({len(disabled)}):")
for disable in disabled:
    print(f"- {disable}")


def match(combination: str, pattern: str) -> bool:
    system, version = combination.split(":")
    sys, ver = pattern.split(":")

    if sys == "*":
        return (ver == "*") or (version == ver)
    elif system == sys:
        return (ver == "*") or (version == ver)
    else:
        return False


def notIn(combination: str, patterns: Iterable[str]) -> bool:
    for pattern in patterns:
        if match(combination, pattern):
            return False

    return True


combinations = [
    (system, version)
    for system in systems
    if system in data["sys"]
    for version in versions
    if version in data["python"]
    and notIn(f"{system}:{version}", excludes)
    and notIn(f"{system}:{version}", disabled)
] + [
    (system, currentMSYS2Version)
    for system in systems
    if system in data["runtime"]
    and notIn(f"{system}:{currentMSYS2Version}", excludes)
    and notIn(f"{system}:{currentMSYS2Version}", disabled)
] + [
    (system, version)
    for system, version in includes
    if system in data["sys"]
    and version in data["python"]
    and notIn(f"{system}:{version}", disabled)
]
print(f"Combinations ({len(combinations)}):")
for system, version in combinations:
    print(f"- {system}:{version}")

jobs = [
    {
        "sysicon": data["sys"][system]["icon"],
        "system": system,
        "runs-on": data["sys"][system]["runs-on"],
        "runtime": "native",
        "shell": data["sys"][system]["shell"],
        "pyicon": data["python"][version]["icon"],
        "python": currentAlphaRelease if version == currentAlphaVersion else version,
        "envname": data["sys"][system]["name"],
    }
    for system, version in combinations if system in data["sys"]
] + [
    {
        "sysicon": data["runtime"][runtime]["icon"],
        "system": "msys2",
        "runs-on": "windows-latest",
        "runtime": runtime.upper(),
        "shell": "msys2 {0}",
        "pyicon": data["python"][currentMSYS2Version]["icon"],
        "python": version,
        "envname": data["runtime"][runtime]["name"],
    }
    for runtime, version in combinations if runtime not in data["sys"]
]

artifact_names = {
    "unittesting_xml": f"{name}-UnitTestReportSummary-XML",
    "unittesting_html": f"{name}-UnitTestReportSummary-HTML",
    "perftesting_xml": f"{name}-PerformanceTestReportSummary-XML",
    "benchtesting_xml": f"{name}-BenchmarkTestReportSummary-XML",
    "apptesting_xml": f"{name}-ApplicationTestReportSummary-XML",
    "codecoverage_sqlite": f"{name}-CodeCoverage-SQLite",
    "codecoverage_xml": f"{name}-CodeCoverage-XML",
    "codecoverage_json": f"{name}-CodeCoverage-JSON",
    "codecoverage_html": f"{name}-CodeCoverage-HTML",
    "statictyping_html": f"{name}-StaticTyping-HTML",
    "package_all": f"{name}-Packages",
    "documentation_html": f"{name}-Documentation-HTML",
    "documentation_latex": f"{name}-Documentation-LaTeX",
    "documentation_pdf": f"{name}-Documentation-PDF",
}

# Deprecated structure
params = {
    "python_version": python_version,
    "artifacts": {
        "unittesting": f"{artifact_names['unittesting_xml']}",
        "coverage": f"{artifact_names['codecoverage_html']}",
        "typing": f"{artifact_names['statictyping_html']}",
        "package": f"{artifact_names['package_all']}",
        "doc": f"{artifact_names['documentation_html']}",
    }
}

print("Parameters:")
print(f" python_version: {python_version}")
print(f" python_jobs ({len(jobs)}):\n" +
      "".join(
          [f" {{ " + ", ".join([f"\"{key}\": \"{value}\"" for key, value in job.items()]) + f" }},\n" for job in jobs])
      )
print(f" artifact_names ({len(artifact_names)}):")
for id, name in artifact_names.items():
    print(f" {id:>20}: {name}")

# Write jobs to special file
github_output = Path(getenv("GITHUB_OUTPUT"))
print(f"GITHUB_OUTPUT: {github_output}")
with github_output.open("a+", encoding="utf-8") as f:
    f.write(dedent(f"""\
python_version={python_version}
python_jobs={json_dumps(jobs)}
artifact_names={json_dumps(artifact_names)}
params={json_dumps(params)}
"""))


@@ -1,13 +1,13 @@
 -r ../requirements.txt
 # Coverage collection
-Coverage ~= 7.5
+Coverage ~= 7.6
 # Test Runner
-pytest ~= 8.1
+pytest ~= 8.3
 pytest-cov ~= 5.0
 # Static Type Checking
-mypy ~= 1.9
+mypy ~= 1.11
-typing_extensions ~= 4.11
+typing_extensions ~= 4.12
-lxml ~= 5.1
+lxml ~= 5.3