JavaScript GitHub Action that surfaces pytest JUnit XML results in the workflow summary. Built for multi-suite pipelines (e.g., chroot, QEMU, cloud), with configurable columns and focused details.
Fork of the excellent pytest-results-action by @pmeier, extended for multi-suite aggregation and metadata.
- Multi-suite aggregation across multiple XML files or patterns
- Compact summary and per-suite results table with configurable result columns
- Optional details section focused on selected result types (e.g., failed, error)
- Metadata support via pytest-metadata and filename fallback
Ensure pytest emits JUnit XML, then point the action at the files:
```yaml
- name: Download test artifacts
  uses: actions/download-artifact@v4
  with:
    pattern: "*test*"
    path: test-artifacts
    merge-multiple: true

- name: Test Report
  if: always()
  uses: gardenlinux/pytest-multi-results-action@main
  with:
    files: |
      test-artifacts/**/results.xml
      test-artifacts/**/*.test.xml
    title: "Test Results"
    summary: true
    details: true
    result-types: "passed,skipped,failed,error"
    details-result-types: "failed,error"
    fail-on-empty: false
```
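For completeness, the steps that produce and upload these XML files earlier in the pipeline might look like the sketch below (the step and artifact names are assumptions, not prescribed by this action):

```yaml
- name: Run tests
  run: pytest --junit-xml=results.xml

- name: Upload test results
  if: always()
  uses: actions/upload-artifact@v4
  with:
    name: my-suite-test-results
    path: results.xml
```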
| Name | Description | Required | Default |
|---|---|---|---|
| `files` | JUnit XML file patterns (file, directory, or glob); multiple files or patterns enable multi-suite aggregation | Yes | - |
| `title` | Section title | No | `Test results` |
| `summary` | Include the top-level test summary | No | `true` |
| `result-types` | CSV of result categories shown in the summary and table | No | `passed,skipped,xfailed,failed,xpassed,error` |
| `details` | Show the details section | No | `true` |
| `details-result-types` | CSV of result categories shown in the details | No | `failed,error` |
| `metadata-fields` | CSV of metadata fields to show as columns (order preserved); when empty, a Suite column is shown | No | `""` |
| `metadata-field-mapping` | JSON map of metadata field name to display name | No | `{}` |
| `fail-on-empty` | Fail the step if no XML files are found | No | `true` |
Allowed result categories: `passed`, `skipped`, `xfailed`, `failed`, `xpassed`, `error`.
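The `xfailed` and `xpassed` categories come from pytest's expected-failure marker; a minimal illustration (the test itself is made up):

```python
import pytest

@pytest.mark.xfail(reason="known bug, tracked upstream")
def test_known_bug():
    # Counted as "xfailed" when the assertion fails,
    # and as "xpassed" if it unexpectedly passes.
    assert 1 + 1 == 3
```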
This action can display extra columns (e.g., Artifact, Type, Namespace) by reading metadata embedded in the JUnit XML. The simplest way to inject these is with the pytest-metadata plugin.
Install and emit metadata:
```sh
pip install pytest-metadata

pytest \
  --metadata Artifact Artifact1 \
  --metadata Type Type1 \
  --metadata Namespace Namespace1 \
  --junit-xml=results.xml
```

Then configure which fields to show in the table and optionally map display names:
```yaml
with:
  files: test-artifacts/**/*.xml
  metadata-fields: "Namespace,Type,Artifact"
  metadata-field-mapping: '{"Artifact": "Test Artifact", "Type": "Test Type", "Namespace": "Namespace"}'
```

Notes:
- The action reads `pytest-metadata` properties from the XML and makes them available as columns.
- If some fields are missing, it falls back to deriving basic info from filenames when possible.
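Under the hood, these values travel as `<property>` elements in the JUnit XML. The invocation shown above would typically produce a suite-level fragment along these lines (illustrative; exact attributes depend on your pytest and pytest-metadata versions):

```xml
<testsuite name="pytest" tests="3" failures="1" errors="0">
  <properties>
    <property name="Artifact" value="Artifact1"/>
    <property name="Type" value="Type1"/>
    <property name="Namespace" value="Namespace1"/>
  </properties>
  <!-- <testcase> elements follow -->
</testsuite>
```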
Example configurations:

```yaml
- name: Test Report
  if: always()
  uses: gardenlinux/pytest-multi-results-action@main
  with:
    files: test-artifacts/**/*.xml
    summary: true
    details: true
    result-types: "failed,error"
    details-result-types: "failed,error"
```

```yaml
- name: Test Report (with metadata)
  if: always()
  uses: gardenlinux/pytest-multi-results-action@main
  with:
    files: test-artifacts/**/*.xml
    title: "Test Results"
    summary: true
    details: true
    result-types: "passed,skipped,failed,error"
    details-result-types: "failed,error"
    metadata-fields: "Namespace,Type,Artifact"
    metadata-field-mapping: '{"Artifact": "Test Artifact", "Type": "Test Type", "Namespace": "Namespace"}'
    fail-on-empty: false
```

Posted to the workflow summary:
- Test Summary: aggregated counts across suites for the selected `result-types`.
- Test Results: per-suite table (metadata columns + Total + selected `result-types` + Duration); failed/error counts link to the details.
- Test Details: per-suite sections for the selected `details-result-types` that have items; each type is capped to keep the report concise.
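With the metadata example above, the per-suite table would render roughly like this (hypothetical suite names and counts, shown for illustration only):

```text
| Namespace | Test Type | Test Artifact | Total | Passed | Failed | Error | Duration |
|-----------|-----------|---------------|-------|--------|--------|-------|----------|
| ns-a      | platform  | image-amd64   |    42 |     40 |      2 |     0 |    12.3s |
| ns-b      | feature   | image-arm64   |    17 |     17 |      0 |     0 |     4.1s |
```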
- Keep summaries under GitHub's step-summary size limit (1 MiB): prefer showing only failed/error in the details, and upload full logs as artifacts when needed.
- If metadata is absent, the action falls back to extracting info from filenames.
For local development:

```sh
npm run format
npm run test
npm run test-metadata
npm run build
```

Forked from pytest-results-action by @pmeier.