From d1258b3135ed9702ce7ad0bc9c5155db664e28d2 Mon Sep 17 00:00:00 2001
From: TimHuynh
Date: Tue, 5 Nov 2024 14:25:22 -0600
Subject: [PATCH 01/10] initial Projektor proposal

---
 .gitignore                                    |   3 +
 proposals/2024/10-01_projektor_integration.md | 232 ++++++++++++++++++
 2 files changed, 235 insertions(+)
 create mode 100644 proposals/2024/10-01_projektor_integration.md

diff --git a/.gitignore b/.gitignore
index b11dd723..506c637c 100644
--- a/.gitignore
+++ b/.gitignore
@@ -37,3 +37,6 @@ release-*.md

 # Affinity lock files
 *~lock~
+
+# Ignore IDE files
+.idea/
\ No newline at end of file
diff --git a/proposals/2024/10-01_projektor_integration.md b/proposals/2024/10-01_projektor_integration.md
new file mode 100644
index 00000000..d67397d4
--- /dev/null
+++ b/proposals/2024/10-01_projektor_integration.md
@@ -0,0 +1,232 @@
+# Projektor Native Integration

| Key | Value |
| :-----------: |:------------------:|
| **Author(s)** | Tim.Huynh |
| **Reviewers** | |
| **Date** | October 21st, 2024 |
| **Status** | In Progress |

## Background

**Please provide a summary of the new feature, redesign or refactor:**

* Currently, there is a lack of native support for handling, visualizing, and tracking test results across builds. Inspired by the feature set in Projektor.dev, this proposal aims to add a dedicated, natively supported `test-report` feature to Vela. This feature will allow users to parse, store, and visualize test results in a more user-friendly manner.

**Please briefly answer the following questions:**

1. Why is this required?

* Projektor is an open-source project that is not actively maintained.
* This allows users to access test reports without needing to use the vela-projektor plugin.
* It may set the groundwork for feature flags, meaning faster feature rollout and easier feature rollback.

2. If this is a redesign or refactor, what issues exist in the current implementation?

* Currently, there is no native support for handling, visualizing, and tracking test results across builds.

3. Are there any other workarounds, and if so, what are the drawbacks?

* Yes. Users can use the projektor-vela plugin to access test reports. However, this is not a native solution and requires additional setup.

4. Are there any related issues? Please provide them below if any exist.

* This proposal will replace the current projektor-vela plugin, but it will not replace the projektor-tap plugin.

## Design

**Please describe your solution to the proposal. This includes, but is not limited to:**

* Dedicated `test-report` step.
* Backend and UI enhancements to visualize test results.
* Object Storage integration for storing test results.
* Slack integration for real-time notifications of test results.


### 1. Test Report Step Configuration
A dedicated `test-report` step will be added at the end of the Vela pipeline. This step uses a specialized Docker image (`vela/test-report-agent:latest`) to handle parsing and reporting tasks. Users can define the format, file path, and retention period for test data within this step, ensuring flexibility for different testing frameworks and workflows.
#### Example Configuration in `.vela.yml`
```yaml
steps:
  - name: test
    image: golang:latest
    commands:
      - go test ./... -json > test-results.json
  - name: test-report
    image: vela/test-report-agent:latest
    parameters:
      report_format: json
      report_path: "./test-results.json"
      retention_days: 30
      notify_slack: true
```
In this example:
- **`report_format`** specifies the format of the test results (e.g., JSON, JUnit XML).
- **`report_path`** defines the path to the test results file generated in the previous steps.
- **`retention_days`** allows users to set a retention policy for test data.
- **`notify_slack`** indicates whether to send notifications to Slack regarding the test results.
### 2. Test Report Step Workflow
- **Execution**:
  - The pipeline’s test steps execute as usual, generating a test results file (e.g., `test-results.json`).
  - The `test-report` step runs afterward, using the `vela/test-report-agent` image to parse and submit test results to the Vela backend.
### 3. Backend Enhancements
To support this new feature, Vela’s backend will require additional API endpoints and an expanded database schema.
#### Proposed Database Tables for Vela's Test Reporting
Based on the Projektor.dev architecture, here is a comprehensive list of tables required to store and manage test results, code quality, and related metrics in Vela's backend:
1. **`code_coverage_file`**: Stores file-level details for code coverage.
2. **`code_coverage_group`**: Represents groups or categories of coverage data.
3. **`code_coverage_stats`**: Holds detailed coverage statistics like statements covered, lines missed, and branches missed.
4. **`code_coverage_run`**: Aggregates coverage data for a specific run.
5. **`performance_results`**: Contains performance metrics, such as request count, average time, and maximum response time.
6. **`code_quality_report`**: Stores code quality report data, including file and group names.
7. **`test_run`**: Represents a single test run, storing counts of passed, failed, and skipped tests along with timing details.
8. **`results_metadata`**: Contains metadata related to each test run, such as CI information and group labels.
9. **`test_run_system_attributes`**: Stores system attributes for test runs, such as pinned status.
10. **`test_suite`**: Represents high-level test groupings, like test suites, with success and failure counts.
11. **`test_suite_group`**: Groups related test suites together for organized reporting.
12. **`test_case`**: Stores individual test cases, including their results, duration, and logs.
13. **`test_failure`**: Captures detailed information on test failures, including failure messages and stack traces.
14. **`test_run_attachment`**: Manages attachments for a test run, like log files and screenshots.
15. **`results_processing`**: Logs the status and errors (if any) from test results processing.
16. **`results_processing_failure`**: Tracks specific failure cases encountered during test processing.
17. **`shedlock`**: Used for distributed locking to prevent concurrent processing issues.

Tables related to git metadata are omitted as Vela already has a robust git integration system.
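To make the proposed schema more concrete, here is a rough sketch of how two of these tables might map to Go types on the backend, using GORM-style tags as one possibility; all field names and types are illustrative assumptions, not a final design:
```go
package types

// TestRun is a sketch of how the proposed test_run table could map to a
// Go type; the fields mirror the pass/fail counts and timing details
// described above.
type TestRun struct {
	ID           int64 `gorm:"primaryKey;autoIncrement"`
	BuildID      int64 `gorm:"index"`
	TotalPassed  int
	TotalFailed  int
	TotalSkipped int
	StartedAt    int64 // unix timestamp
	FinishedAt   int64 // unix timestamp
}

// TestCase is a matching sketch for the proposed test_case table,
// holding per-test results, duration, and logs.
type TestCase struct {
	ID         int64 `gorm:"primaryKey;autoIncrement"`
	TestRunID  int64 `gorm:"index"`
	Name       string
	Passed     bool
	DurationMS int64
	Logs       string
}
```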
### 4. Object Store Integration
To effectively manage test artifacts and large volumes of test result data, Go-Vela will incorporate an object store as part of the test reporting solution.
#### Key Features of the Object Store
- **Artifact Storage**: Store test artifacts such as logs, screenshots, and detailed reports generated during test runs. 
+- **Access and Retrieval**: Provide a straightforward API for storing and retrieving artifacts, ensuring easy access from Vela’s user interface and other tools. +- **Scalability**: Enable seamless scalability to accommodate growing amounts of test data over time. +- **Data Retention Policies**: Implement retention policies to manage the lifecycle of stored artifacts, ensuring that outdated data is archived or deleted as necessary. +### 5. Slack Integration +To keep teams informed about test results in real-time, Go-Vela will include integration with Slack. This feature will notify designated channels or users about the outcomes of test runs and any significant changes in test performance. +#### Notification Features +- **Test Result Notifications**: Send messages to a specified Slack channel with a summary of test results, including the number of tests run, passed, failed, and any relevant error messages. +- **Flaky Test Alerts**: Notify teams when flaky tests are detected, prompting investigation and resolution. +#### Example Configuration for Slack +Users can specify Slack settings in their `.vela.yml` configuration: +```yaml +slack: + - name: test-results + image: vela/test-report-agent:latest + ruleset: + status: [ failure ] + secrets: [ slack_webhook ] + parameters: + results: test-results/*.xml + project: hey-vela # Name of your project to include in the Slack message + filepath: heyvela_failure_message.json +``` +### 6. User Interface Enhancements +Vela’s UI will be enhanced to display test results in an intuitive and user-friendly manner. +- **Build Summary**: Include a **Test Results** section in the build summary view, displaying metrics such as total tests, pass/fail rates, and error messages. +- **Historical Data**: Provide a dashboard view showing trends in test pass/fail rates and test duration over time, allowing users to monitor stability and identify patterns. + +### 7. Key Features to Implement +- **Test Reports and Analytics**: Generate and display test results, with pass/fail rates, historical trends, and detailed test case information. +- **Code Coverage Metrics**: Calculate and visualize code coverage data, including line and branch coverage percentages. +- **Performance Metrics**: Track performance metrics like response times, request counts, and error rates. +- **Code Quality Reports**: Display code quality metrics, such as common patterns and potential bugs. +- **Flaky Test Detection**: Identify and flag flaky tests for further investigation. +- **Real-time Notifications**: Send Slack notifications for test results, flaky tests, and other critical events. + +### 8. API Endpoints +Vela’s backend will expose new API endpoints to support test reporting and analytics features. +#### Proposed API Endpoints +1. **`/api/v1/test-report`**: POST endpoint to submit test results for processing and storage. +2. **`/api/v1/test-report/{run_id}`**: GET endpoint to retrieve test results for a specific test run. +3. **`/api/v1/code-coverage`**: POST endpoint to submit code coverage data for processing and storage. +4. **`/api/v1/code-coverage/{run_id}`**: GET endpoint to fetch code coverage data for a specific run. +5. **`/api/v1/performance-metrics`**: POST endpoint to submit performance metrics for processing and storage. +6. **`/api/v1/performance-metrics/{run_id}`**: GET endpoint to retrieve performance metrics for a specific run. +7. **`/api/v1/code-quality`**: POST endpoint to submit code quality reports for processing and storage. +8. 
**`/api/v1/code-quality/{run_id}`**: GET endpoint to fetch code quality data for a specific run. +9. **`/api/v1/flaky-tests`**: GET endpoint to list flaky tests detected in the system. +10. **`/api/v1/test-notifications`**: POST endpoint to send test result notifications to Slack channels. + +**NOTES**: The list above is not a complete list of API endpoints but provides a starting point for implementing test reporting features in Vela. The API endpoints listed above are subject to change based on the final implementation details. +## Implementation +### Phases +1. **Phase 1: Basic Test Reporting** + - Implement the `test-report` step and backend support for storing test results. + - Simple UI to display test results. +- **Phase 2: Advanced Features** + - Add code coverage, performance metrics, and code quality reporting. + - Integrate object storage for test artifacts. + - Enhance the UI to visualize test data and metrics. +- **Phase 3: Slack Integration** + - Implement Slack notifications for test results and flaky tests. + - Allow users to configure Slack settings in `.vela.yml`. +- **Phase 4: Historical Data and Analytics** + - Develop a dashboard to show historical test data and trends. + - Add analytics features to track test performance over time. + + + +**Please briefly answer the following questions:** + +1. Is this something you plan to implement yourself? + + +* Yes + +2. What's the estimated time to completion? + + +* Multi-release + +**Please provide all tasks (gists, issues, pull requests, etc.) completed to implement the design:** + + + From d8becbb3a9759aefd8ab100da2129cc6c030f62b Mon Sep 17 00:00:00 2001 From: TimHuynh Date: Tue, 5 Nov 2024 14:28:10 -0600 Subject: [PATCH 02/10] edit api --- proposals/2024/10-01_projektor_integration.md | 30 ++++++++++--------- 1 file changed, 16 insertions(+), 14 deletions(-) diff --git a/proposals/2024/10-01_projektor_integration.md b/proposals/2024/10-01_projektor_integration.md index d67397d4..b0fff3fc 100644 --- a/proposals/2024/10-01_projektor_integration.md +++ b/proposals/2024/10-01_projektor_integration.md @@ -138,6 +138,22 @@ Based on the Projektor.dev architecture, here is a comprehensive list of tables 17. **`shedlock`**: Used for distributed locking to prevent concurrent processing issues. Tables related to git metadata are omitted as Vela already has a robust git integration system. + + +#### Proposed API Endpoints +Vela’s backend will expose new API endpoints to support test reporting and analytics features. +1. **`/api/v1/test-report`**: POST endpoint to submit test results for processing and storage. +2. **`/api/v1/test-report/{run_id}`**: GET endpoint to retrieve test results for a specific test run. +3. **`/api/v1/code-coverage`**: POST endpoint to submit code coverage data for processing and storage. +4. **`/api/v1/code-coverage/{run_id}`**: GET endpoint to fetch code coverage data for a specific run. +5. **`/api/v1/performance-metrics`**: POST endpoint to submit performance metrics for processing and storage. +6. **`/api/v1/performance-metrics/{run_id}`**: GET endpoint to retrieve performance metrics for a specific run. +7. **`/api/v1/code-quality`**: POST endpoint to submit code quality reports for processing and storage. +8. **`/api/v1/code-quality/{run_id}`**: GET endpoint to fetch code quality data for a specific run. +9. **`/api/v1/flaky-tests`**: GET endpoint to list flaky tests detected in the system. +10. **`/api/v1/test-notifications`**: POST endpoint to send test result notifications to Slack channels. 
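To illustrate the intended interaction with these endpoints, below is a minimal client-side sketch. The payload shape, expected response code, and authentication scheme are illustrative assumptions and not part of a final API contract:
```go
package report

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// TestReport is a hypothetical payload for the proposed
// POST /api/v1/test-report endpoint.
type TestReport struct {
	RunID   int64  `json:"run_id"`
	Format  string `json:"format"` // e.g. "json" or "junit-xml"
	Passed  int    `json:"passed"`
	Failed  int    `json:"failed"`
	Skipped int    `json:"skipped"`
	Raw     string `json:"raw"` // raw contents of the results file
}

// Submit posts a test report to the Vela server for processing.
func Submit(server, token string, r TestReport) error {
	body, err := json.Marshal(r)
	if err != nil {
		return err
	}

	req, err := http.NewRequest(http.MethodPost, server+"/api/v1/test-report", bytes.NewReader(body))
	if err != nil {
		return err
	}
	// assumes Vela's existing token-based auth carries over to these routes
	req.Header.Set("Authorization", "Bearer "+token)
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	if resp.StatusCode != http.StatusCreated {
		return fmt.Errorf("submitting test report: unexpected status %s", resp.Status)
	}
	return nil
}
```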
+ + ### 4. Object Store Integration To effectively manage test artifacts and large volumes of test result data, Go-Vela will incorporate an object store as part of the test reporting solution. #### Key Features of the Object Store @@ -177,20 +193,6 @@ Vela’s UI will be enhanced to display test results in an intuitive and user-fr - **Flaky Test Detection**: Identify and flag flaky tests for further investigation. - **Real-time Notifications**: Send Slack notifications for test results, flaky tests, and other critical events. -### 8. API Endpoints -Vela’s backend will expose new API endpoints to support test reporting and analytics features. -#### Proposed API Endpoints -1. **`/api/v1/test-report`**: POST endpoint to submit test results for processing and storage. -2. **`/api/v1/test-report/{run_id}`**: GET endpoint to retrieve test results for a specific test run. -3. **`/api/v1/code-coverage`**: POST endpoint to submit code coverage data for processing and storage. -4. **`/api/v1/code-coverage/{run_id}`**: GET endpoint to fetch code coverage data for a specific run. -5. **`/api/v1/performance-metrics`**: POST endpoint to submit performance metrics for processing and storage. -6. **`/api/v1/performance-metrics/{run_id}`**: GET endpoint to retrieve performance metrics for a specific run. -7. **`/api/v1/code-quality`**: POST endpoint to submit code quality reports for processing and storage. -8. **`/api/v1/code-quality/{run_id}`**: GET endpoint to fetch code quality data for a specific run. -9. **`/api/v1/flaky-tests`**: GET endpoint to list flaky tests detected in the system. -10. **`/api/v1/test-notifications`**: POST endpoint to send test result notifications to Slack channels. - **NOTES**: The list above is not a complete list of API endpoints but provides a starting point for implementing test reporting features in Vela. The API endpoints listed above are subject to change based on the final implementation details. ## Implementation ### Phases From 1be26f1c5274f3f6e57944c7d6f1dc4b1d4edde6 Mon Sep 17 00:00:00 2001 From: TimHuynh Date: Tue, 5 Nov 2024 15:05:50 -0600 Subject: [PATCH 03/10] feature flag for slack --- proposals/2024/10-01_projektor_integration.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/proposals/2024/10-01_projektor_integration.md b/proposals/2024/10-01_projektor_integration.md index b0fff3fc..d2e1bd14 100644 --- a/proposals/2024/10-01_projektor_integration.md +++ b/proposals/2024/10-01_projektor_integration.md @@ -161,8 +161,8 @@ To effectively manage test artifacts and large volumes of test result data, Go-V - **Access and Retrieval**: Provide a straightforward API for storing and retrieving artifacts, ensuring easy access from Vela’s user interface and other tools. - **Scalability**: Enable seamless scalability to accommodate growing amounts of test data over time. - **Data Retention Policies**: Implement retention policies to manage the lifecycle of stored artifacts, ensuring that outdated data is archived or deleted as necessary. -### 5. Slack Integration -To keep teams informed about test results in real-time, Go-Vela will include integration with Slack. This feature will notify designated channels or users about the outcomes of test runs and any significant changes in test performance. +### 5. Slack Integration (Feature Flag) +To keep teams informed about test results in real-time, Go-Vela will include integration with Slack as a feature flag. 
This feature will notify designated channels or users about the outcomes of test runs and any significant changes in test performance.
#### Notification Features
- **Test Result Notifications**: Send messages to a specified Slack channel with a summary of test results, including the number of tests run, passed, failed, and any relevant error messages.
- **Flaky Test Alerts**: Notify teams when flaky tests are detected, prompting investigation and resolution.
#### Example Configuration for Slack
@@ -196,7 +196,7 @@ Vela’s UI will be enhanced to display test results in an intuitive and user-fr
 **NOTES**: The list above is not a complete list of API endpoints but provides a starting point for implementing test reporting features in Vela. The API endpoints listed above are subject to change based on the final implementation details.
 ## Implementation
 ### Phases
-1. **Phase 1: Basic Test Reporting**
+- **Phase 1: Basic Test Reporting**
   - Implement the `test-report` step and backend support for storing test results.
   - Simple UI to display test results.
 - **Phase 2: Advanced Features**

From 9346b1035a42cb6db1e39215494fe42ce8462ae5 Mon Sep 17 00:00:00 2001
From: Tim Huynh
Date: Wed, 6 Nov 2024 14:09:48 -0600
Subject: [PATCH 04/10] Update proposals/2024/10-01_projektor_integration.md

Co-authored-by: David May <49894298+wass3rw3rk@users.noreply.github.com>
---
 proposals/2024/10-01_projektor_integration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/proposals/2024/10-01_projektor_integration.md b/proposals/2024/10-01_projektor_integration.md
index d2e1bd14..957bbaf9 100644
--- a/proposals/2024/10-01_projektor_integration.md
+++ b/proposals/2024/10-01_projektor_integration.md
@@ -58,7 +58,7 @@

 1. Why is this required?

 * Projektor is an open-source project that is not actively maintained.
-* This allows users to access test reports without needing to use the vela-projektor plugin.
+* This allows users to access test reports without needing to use special test plugins that interact with other infrastructure.
 * It may set the groundwork for feature flags, meaning faster feature rollout and easier feature rollback.

 2. If this is a redesign or refactor, what issues exist in the current implementation?

From ff1f1815106c4051b193ca6f06ec283d2d081207 Mon Sep 17 00:00:00 2001
From: Tim Huynh
Date: Wed, 6 Nov 2024 14:10:07 -0600
Subject: [PATCH 05/10] Update proposals/2024/10-01_projektor_integration.md

Co-authored-by: David May <49894298+wass3rw3rk@users.noreply.github.com>
---
 proposals/2024/10-01_projektor_integration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/proposals/2024/10-01_projektor_integration.md b/proposals/2024/10-01_projektor_integration.md
index 957bbaf9..fbe4aa24 100644
--- a/proposals/2024/10-01_projektor_integration.md
+++ b/proposals/2024/10-01_projektor_integration.md
@@ -67,7 +67,7 @@

 3. Are there any other workarounds, and if so, what are the drawbacks?

-* Yes. Users can use the projektor-vela plugin to access test reports. However, this is not a native solution and requires additional setup.
+* Yes. Users can set up separate infrastructure and interact with it using a plugin to upload and access test reports. However, this is not a native solution and requires additional setup.

 4. Are there any related issues? Please provide them below if any exist.
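To give a rough sense of what this integration could look like, here is a minimal sketch using the MinIO Go SDK (`github.com/minio/minio-go/v7`) as one example of an S3-compatible client, matching the kind of MinIO setup shown in the `docker-compose.yml` example later in this section. The bucket name, object key layout, and retention rule are illustrative assumptions, not settled design:
```go
package storage

import (
	"context"
	"fmt"

	"github.com/minio/minio-go/v7"
	"github.com/minio/minio-go/v7/pkg/credentials"
	"github.com/minio/minio-go/v7/pkg/lifecycle"
)

// NewClient connects to an S3-compatible object store.
func NewClient(endpoint, accessKey, secretKey string) (*minio.Client, error) {
	return minio.New(endpoint, &minio.Options{
		Creds: credentials.NewStaticV4(accessKey, secretKey, ""),
	})
}

// UploadTestResults stores a test results file under a per-build object key.
func UploadTestResults(ctx context.Context, c *minio.Client, org, repo string, build int, path string) error {
	// hypothetical key layout: org/repo/build-number/test-results.json
	key := fmt.Sprintf("%s/%s/%d/test-results.json", org, repo, build)

	_, err := c.FPutObject(ctx, "test-reports", key, path, minio.PutObjectOptions{
		ContentType: "application/json",
	})
	return err
}

// ApplyRetention sets a bucket lifecycle rule that expires stored
// test reports after the given number of days.
func ApplyRetention(ctx context.Context, c *minio.Client, days int) error {
	cfg := lifecycle.NewConfiguration()
	cfg.Rules = []lifecycle.Rule{{
		ID:         "expire-test-reports",
		Status:     "Enabled",
		Expiration: lifecycle.Expiration{Days: lifecycle.ExpirationDays(days)},
	}}
	return c.SetBucketLifecycle(ctx, "test-reports", cfg)
}
```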
From 35660fd2f2bcd09252fa201f5b42333361a85920 Mon Sep 17 00:00:00 2001
From: TimHuynh
Date: Wed, 6 Nov 2024 15:09:44 -0600
Subject: [PATCH 06/10] update

---
 proposals/2024/10-01_projektor_integration.md | 6 ++----
 1 file changed, 2 insertions(+), 4 deletions(-)

diff --git a/proposals/2024/10-01_projektor_integration.md b/proposals/2024/10-01_projektor_integration.md
index fbe4aa24..91caac4c 100644
--- a/proposals/2024/10-01_projektor_integration.md
+++ b/proposals/2024/10-01_projektor_integration.md
@@ -57,9 +57,7 @@ Provide your description here.

 1. Why is this required?

-* Projektor is an open-source project that is not actively maintained.
-* This allows users to access test reports without needing to use special test plugins that interact with other infrastructure.
-* It may set the groundwork for feature flags, meaning faster feature rollout and easier feature rollback.
+* Test reporting is a widely used feature in CI/CD pipelines, providing insights into test results, code quality, and performance metrics. By integrating Projektor's test reporting features natively into Vela, users can easily track and analyze test data across builds.

 2. If this is a redesign or refactor, what issues exist in the current implementation?

 4. Are there any related issues? Please provide them below if any exist.

-* This proposal will replace the current projektor-vela plugin, but it will not replace the projektor-tap plugin.
+* https://github.com/go-vela/community/issues/528

 ## Design

From 80050eec740fc99f0ec1061ff4b436358382b511 Mon Sep 17 00:00:00 2001
From: TimHuynh
Date: Fri, 15 Nov 2024 12:20:18 -0600
Subject: [PATCH 07/10] update proposal with object store details

---
 proposals/2024/10-01_projektor_integration.md | 209 +++++++++++-------
 1 file changed, 126 insertions(+), 83 deletions(-)

diff --git a/proposals/2024/10-01_projektor_integration.md b/proposals/2024/10-01_projektor_integration.md
index 91caac4c..00c8ce0c 100644
--- a/proposals/2024/10-01_projektor_integration.md
+++ b/proposals/2024/10-01_projektor_integration.md
@@ -81,59 +81,132 @@ NOTE: If there are no current plans for a solution, please leave this section bl

 **Please describe your solution to the proposal. This includes, but is not limited to:**

+* Object Storage integration for storing test results.
* Dedicated `test-report` step.
* Backend and UI enhancements to visualize test results.

### 1. Object Store Integration
Vela will integrate with an S3-compatible object storage system to store test results, code coverage data, and other artifacts. This integration will allow users to store large volumes of test data securely and efficiently.
#### Challenges
- **Scalability**:
  - Support large volumes of test data and artifacts across multiple builds.
  - Bucket lifecycle management to manage data retention and cleanup.
- **Security and Access Control**:
  - Access control mechanisms to restrict access to test data.
  - Data encryption at rest and in transit.
- **Performance**: Ensure fast and reliable storage and retrieval of test results and artifacts.
- **Cross-Platform Compatibility**: While many object storage systems are S3-compatible, ensuring compatibility with various systems is essential.

#### Implementation Details
- **Backend Configuration**: Configure the storage backend (endpoint, bucket, and credentials) on the Vela server and worker, as shown in the `docker-compose.yml` example below.
- **Typical API Endpoints**:
  - **List Buckets**: GET /
  - **Create Bucket**: PUT /{bucket}
  - **List Objects**: GET /{bucket}
  - **Post Object**: POST /{bucket}/{object}
  - **Get Object**: GET /{bucket}/{object}
  - **Bucket Configuration**: PUT /{bucket}/config

##### Example Configuration in `docker-compose.yml`
```yaml
services:
  vela-worker:
    image: vela-worker:latest
    environment:
      - VELA_STORAGE_TYPE=s3
      - VELA_STORAGE_ENDPOINT=http://minio:9000
      - VELA_STORAGE_BUCKET=test-reports
      - VELA_STORAGE_ACCESS_KEY=minioadmin
      - VELA_STORAGE_SECRET_KEY=minioadmin
  minio:
    image: minio/minio
    command: server /data
    ports:
      - "9000:9000"
    environment:
      MINIO_ROOT_USER: minioadmin
      MINIO_ROOT_PASSWORD: minioadmin
    volumes:
      - minio-data:/data
volumes:
  minio-data:
```

### 2. Test Report Step Configuration
#### Solution 1: Dedicated Docker Image for Test Reporting
A dedicated `test-report` step will be added at the end of the Vela pipeline. This step uses a specialized Docker image (`vela/test-report-agent:latest`) to handle parsing and reporting tasks. Users can define the format, file path, and retention period for test data within this step, ensuring flexibility for different testing frameworks and workflows.
##### Example Configuration in `.vela.yml`
```yaml
steps:
  - name: test
    image: golang:latest
    commands:
      - go test ./... -json > test-results.json
  - name: test-report
    image: vela/test-report-agent:latest
    parameters:
      report_format: json
      report_path: "./test-results.json"
```
In this example:
- **`report_format`** specifies the format of the test results (e.g., JSON, JUnit XML).
- **`report_path`** defines the path to the test results file generated in the previous steps.

##### Challenges
- Image Maintenance: Ensure the `test-report-agent` image is regularly updated and maintained, including dependencies and security patches.
- Latency and Performance: Pulling the image for every build may introduce latency, impacting build times.

#### Test Report Step Workflow
- **Execution**:
  - The pipeline’s test steps execute as usual, generating a test results file (e.g., `test-results.json`).
  - The `test-report` step runs afterward, using the `vela/test-report-agent` image to parse and submit test results to the Vela backend.

#### Solution 2: Leveraging container output
Alternatively, users can leverage the output of a container to pass test results to the `test-report` step. 
This approach allows users to generate test results within a container and pass them to the `test-report` step for processing. + ##### Example Configuration in `.vela.yml` + ``` yaml + steps: + - name: test + image: golang:latest + commands: + - go test ./... -json > vela/outputs/test-results.json + - name: read test report from outputs + image: ${IMAGE} # resolves to `ubuntu` + pull: on_start + parameters: + report_format: json + report_path: "./test-results.json" + ``` + In this example: + - **`report_format`** specifies the format of the test results (e.g., JSON, JUnit XML). + - **`report_path`** defines the path to the test results file generated in the previous steps. + + ##### Challenges + - Increased Complexity: Parsing logic must handle various formats and edge cases. + - Scalability: Ensure the system can handle large volumes of test data efficiently and parsing large files without performance degradation. + - Security: Implement secure parsing and storage mechanisms to protect sensitive test data. + +#### Test Report Step Workflow - **Execution**: - The pipeline’s test steps execute as usual, generating a test results file (e.g., `test-results.json`). - - The `test-report` step runs afterward, using the `vela/test-report-agent` image to parse and submit test results to the Vela backend. + - The `test-report` step runs afterward, worker will parse the test results file and submit the data to the Vela backend for storage and processing. + ### 3. Backend Enhancements To support this new feature, Vela’s backend will require additional API endpoints and an expanded database schema. #### Proposed Database Tables for Vela's Test Reporting -Based on the Projektor.dev architecture, here is a comprehensive list of tables required to store and manage test results, code quality, and related metrics in Vela's backend: +Here is a comprehensive list of tables required to store and manage test results, code quality, and related metrics in Vela's backend: 1. **`code_coverage_file`**: Stores file-level details for code coverage. -2. **`code_coverage_group`**: Represents groups or categories of coverage data. -3. **`code_coverage_stats`**: Holds detailed coverage statistics like statements covered, lines missed, and branches missed. -4. **`code_coverage_run`**: Aggregates coverage data for a specific run. -5. **`performance_results`**: Contains performance metrics, such as request count, average time, and maximum response time. -6. **`code_quality_report`**: Stores code quality report data, including file and group names. -7. **`test_run`**: Represents a single test run, storing counts of passed, failed, and skipped tests along with timing details. -8. **`results_metadata`**: Contains metadata related to each test run, such as CI information and group labels. -9. **`test_run_system_attributes`**: Stores system attributes for test runs, such as pinned status. -10. **`test_suite`**: Represents high-level test groupings, like test suites, with success and failure counts. -11. **`test_suite_group`**: Groups related test suites together for organized reporting. -12. **`test_case`**: Stores individual test cases, including their results, duration, and logs. -13. **`test_failure`**: Captures detailed information on test failures, including failure messages and stack traces. -14. **`test_run_attachment`**: Manages attachments for a test run, like log files and screenshots. -15. **`results_processing`**: Logs the status and errors (if any) from test results processing. -16. 
**`results_processing_failure`**: Tracks specific failure cases encountered during test processing. -17. **`shedlock`**: Used for distributed locking to prevent concurrent processing issues. +2. **`code_coverage_run`**: Aggregates coverage data for a specific run. +3. **`performance_results`**: Contains performance metrics, such as request count, average time, and maximum response time. +4. **`code_quality_report`**: Stores code quality report data, including file and group names. +5. **`test_run`**: Represents a single test run, storing counts of passed, failed, and skipped tests along with timing details. +6. **`test_suite`**: Represents high-level test groupings, like test suites, with success and failure counts. +7. **`test_case`**: Stores individual test cases, including their results, duration, and logs. +8. **`test_run_attachment`**: Manages attachments for a test run, like log files and screenshots. +9. **`results_processing`**: Logs the status and errors (if any) from test results processing. Tables related to git metadata are omitted as Vela already has a robust git integration system. @@ -151,62 +224,32 @@ Vela’s backend will expose new API endpoints to support test reporting and ana 9. **`/api/v1/flaky-tests`**: GET endpoint to list flaky tests detected in the system. 10. **`/api/v1/test-notifications`**: POST endpoint to send test result notifications to Slack channels. +**NOTES**: The list above is not a complete list of API endpoints but provides a starting point for implementing test reporting features in Vela. The API endpoints listed above are subject to change based on the final implementation details. -### 4. Object Store Integration -To effectively manage test artifacts and large volumes of test result data, Go-Vela will incorporate an object store as part of the test reporting solution. -#### Key Features of the Object Store -- **Artifact Storage**: Store test artifacts such as logs, screenshots, and detailed reports generated during test runs. -- **Access and Retrieval**: Provide a straightforward API for storing and retrieving artifacts, ensuring easy access from Vela’s user interface and other tools. -- **Scalability**: Enable seamless scalability to accommodate growing amounts of test data over time. -- **Data Retention Policies**: Implement retention policies to manage the lifecycle of stored artifacts, ensuring that outdated data is archived or deleted as necessary. -### 5. Slack Integration (Feature Flag) -To keep teams informed about test results in real-time, Go-Vela will include integration with Slack as a feature flag. This feature will notify designated channels or users about the outcomes of test runs and any significant changes in test performance. -#### Notification Features -- **Test Result Notifications**: Send messages to a specified Slack channel with a summary of test results, including the number of tests run, passed, failed, and any relevant error messages. -- **Flaky Test Alerts**: Notify teams when flaky tests are detected, prompting investigation and resolution. -#### Example Configuration for Slack -Users can specify Slack settings in their `.vela.yml` configuration: -```yaml -slack: - - name: test-results - image: vela/test-report-agent:latest - ruleset: - status: [ failure ] - secrets: [ slack_webhook ] - parameters: - results: test-results/*.xml - project: hey-vela # Name of your project to include in the Slack message - filepath: heyvela_failure_message.json -``` -### 6. User Interface Enhancements +### 4. 
User Interface Enhancements
Vela’s UI will be enhanced to display test results in an intuitive and user-friendly manner.
- **Build Summary**: Include a **Test Results** section in the build summary view, displaying metrics such as total tests, pass/fail rates, and error messages.
- **Historical Data**: Provide a dashboard view showing trends in test pass/fail rates and test duration over time, allowing users to monitor stability and identify patterns.

### 5. Key Features to Implement
- **Test Reports and Analytics**: Generate and display test results, with pass/fail rates, historical trends, and detailed test case information.
- **Code Coverage Metrics**: Calculate and visualize code coverage data, including line and branch coverage percentages.
- **Performance Metrics**: Track performance metrics like response times, request counts, and error rates.
- **Code Quality Reports**: Display code quality metrics, such as common patterns and potential bugs.
- **Flaky Test Detection**: Identify and flag flaky tests for further investigation (a minimal detection sketch follows this list).
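As a sketch of how flaky test detection could work, the heuristic below flags any test that both passed and failed within a window of recent runs. The data shape and threshold-free rule are illustrative assumptions; a real implementation would likely query the `test_case` records and add per-branch grouping:
```go
package analytics

// Result is a simplified view of one test case outcome in one run.
type Result struct {
	TestName string
	Passed   bool
}

// FlakyTests flags tests that both passed and failed within the same
// window of recent runs; this is a simple flakiness heuristic.
func FlakyTests(recentRuns [][]Result) []string {
	passed := map[string]bool{}
	failed := map[string]bool{}

	for _, run := range recentRuns {
		for _, r := range run {
			if r.Passed {
				passed[r.TestName] = true
			} else {
				failed[r.TestName] = true
			}
		}
	}

	var flaky []string
	for name := range passed {
		if failed[name] {
			flaky = append(flaky, name)
		}
	}
	return flaky
}
```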
## Implementation
### Phases
- **Phase 1: Basic Test Reporting**
  - Integrate object storage (ability to hook up Vela to a storage system).
  - Includes backend, API, and database changes.
  - Implement the `test-report` step and backend support for storing test results to a storage system.
  - UI/UX research.
- **Phase 2: UI**
  - Add code coverage, performance metrics, and code quality reporting.
  - Enhance the UI to visualize test results and related data.
- **Phase 3: Enhanced UI**
  - Enhance the UI with visualizations and dashboards/historical data (trends).

-* Currently, there is a lack of native support for handling, visualizing, and tracking test results across builds. Inspired by the feature set in Projektor.dev, this proposal aims to add a dedicated, natively supported `test-report` feature to Vela. This feature will allow users to parse, store, and visualize test results in a more user-friendly manner.
+* Test reporting is a widely used capability across CI/CD platforms and is often implemented as a bolt-on component requiring separate management of plugins and infrastructure. This proposal seeks to streamline the experience by integrating test reporting into Vela itself.
* Many platforms like Semaphore, CircleCI, GitHub Actions, GitLab CI, Travis CI, Buildkite, and Codefresh already offer robust test-reporting mechanisms, such as:
  - Enabling workflows to store, view, and manage artifacts, including test logs, for visibility across builds.
  - Parsing JUnit XML, JSON, and other test result formats to display test results, pass/fail rates, and error messages.
  - Visualizing test executions and integrating with advanced analytics engines.
* With this integration, Vela can deliver similar capabilities tailored to its ecosystem while ensuring compatibility with modern development workflows.

**Please briefly answer the following questions:**

1. Is this something you plan to implement yourself?

* Yes

2. What's the estimated time to completion?

* Multi-release

**Please provide all tasks (gists, issues, pull requests, etc.) completed to implement the design:**

From 833929370474b13f37660daff92f594ec7f5be66 Mon Sep 17 00:00:00 2001
From: TimHuynh
Date: Fri, 15 Nov 2024 13:06:08 -0600
Subject: [PATCH 09/10] change file name

---
 ...tor_integration.md => 10-01_native_test_report_integration.md} | 0
 1 file changed, 0 insertions(+), 0 deletions(-)
 rename proposals/2024/{10-01_projektor_integration.md => 10-01_native_test_report_integration.md} (100%)

diff --git a/proposals/2024/10-01_projektor_integration.md b/proposals/2024/10-01_native_test_report_integration.md
similarity index 100%
rename from proposals/2024/10-01_projektor_integration.md
rename to proposals/2024/10-01_native_test_report_integration.md

From f65bac969f137607b18366028ce9b42d57ccea64 Mon Sep 17 00:00:00 2001
From: TimHuynh
Date: Tue, 19 Nov 2024 13:16:03 -0600
Subject: [PATCH 10/10] change name

---
 proposals/2024/10-01_native_test_report_integration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/proposals/2024/10-01_native_test_report_integration.md b/proposals/2024/10-01_native_test_report_integration.md
index 9ce82b31..433de776 100644
--- a/proposals/2024/10-01_native_test_report_integration.md
+++ b/proposals/2024/10-01_native_test_report_integration.md
@@ -1,4 +1,4 @@
-# Projektor Native Integration
+# Test Report Native Integration