[WIP][Native WebGPU] Add Conv, ConvTranspose and FusedConv #155

Workflow file for this run

# This workflow builds and tests ONNX Runtime for Linux in both Debug and Release configurations.
# It uses a Docker container to provide a consistent build environment.
#
# The workflow consists of two jobs:
# - linux-cpu-asan-debug: Builds and tests the Debug configuration.
#   - Uses the 'Debug' build configuration.
#   - Enables AddressSanitizer for memory error detection.
#   - Builds and runs tests.
# - linux-cpu-release: Builds and tests the Release configuration.
#   - Uses the 'Release' build configuration.
#   - Includes additional flags for release builds:
#     - --use_binskim_compliant_compile_flags
#     - --build_wheel
#     - --build_csharp
#     - --enable_transformers_tool_test
#     - --cmake_extra_defines onnxruntime_BUILD_BENCHMARKS=ON
#   - Builds and runs tests.
#
# The two jobs run in parallel to reduce the overall build time.
# Each job builds its own Docker image in a separate step (the two jobs use different images).
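#
# For reference only: a rough local approximation of the Debug build and test run below
# (outside Docker, assuming CMake, Ninja, and a recent compiler are installed) is:
#
#   python3 tools/ci_build/build.py --build_dir build/Debug --cmake_generator Ninja \
#     --config Debug --build_shared_lib --parallel --enable_onnx_tests \
#     --enable_address_sanitizer --update --build --test
#
# This is only a sketch: the CI-only vcpkg flags (--use_vcpkg and
# --use_vcpkg_ms_internal_asset_cache, which assumes Microsoft-internal infrastructure)
# and --skip_submodule_sync are omitted, and test-data paths such as /data/onnx may not
# exist on a local machine.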
name: Linux CI
on:
  push:
    branches: [ main, 'rel-*']
  pull_request:
    branches: [ main, 'rel-*']
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true
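# The concurrency settings above cancel any in-progress run for the same workflow and ref
# when a newer run is triggered.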
jobs:
  linux-cpu-asan-debug: # Job for building and testing the Debug configuration
    runs-on: ["self-hosted", "1ES.Pool=onnxruntime-github-Ubuntu2204-AMD-CPU"]
    permissions:
      actions: read
      contents: read
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Set up Python 3.x
        uses: actions/setup-python@v4
        with:
          python-version: '3.x'
      - name: Build Docker Image
        uses: microsoft/onnxruntime-github-actions/[email protected]
        with:
          Dockerfile: ${{ github.workspace }}/tools/ci_build/github/linux/docker/inference/x86_64/default/cpu/Dockerfile
          Repository: 'onnxruntimecpubuildcentos8x64'
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Needed for context
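      # The 'Repository' name above matches the image name passed to 'docker run' in the
      # build/test step below, so the container step runs the image built here.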
      - name: Create .onnx directory
        run: mkdir -p $HOME/.onnx
      # Build and Test ONNX Runtime in Docker (Debug)
      - name: Build and Test ONNX Runtime in Docker (Debug)
        env:
          ALLOW_RELEASED_ONNX_OPSET_ONLY: 0
          NIGHTLY_BUILD: 1 # Nightly build flag is enabled for both the Debug and Release jobs
        run: |
          docker run --rm \
            --volume /data/onnx:/data/onnx:ro \
            --volume /data/models:/data/models:ro \
            --volume ${{ github.workspace }}:/onnxruntime_src \
            --volume $HOME/.onnx:/home/onnxruntimedev/.onnx \
            -w /onnxruntime_src \
            -e ALLOW_RELEASED_ONNX_OPSET_ONLY \
            -e NIGHTLY_BUILD \
            onnxruntimecpubuildcentos8x64 \
            /bin/bash -c 'set -ex; \
              # Build with Debug configuration and AddressSanitizer enabled
              python3 tools/ci_build/build.py \
                --build_dir build/Debug --cmake_generator Ninja \
                --config Debug \
                --skip_submodule_sync \
                --build_shared_lib \
                --parallel \
                --use_vcpkg --use_vcpkg_ms_internal_asset_cache \
                --enable_onnx_tests \
                --enable_address_sanitizer \
                --update --build;
              # Run tests with Debug configuration
              python3 tools/ci_build/build.py \
                --build_dir build/Debug --cmake_generator Ninja \
                --config Debug \
                --skip_submodule_sync \
                --build_shared_lib \
                --parallel \
                --use_vcpkg --use_vcpkg_ms_internal_asset_cache \
                --enable_onnx_tests \
                --enable_address_sanitizer \
                --test;'
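  # The Debug job above runs its tests under AddressSanitizer, so it typically takes
  # noticeably longer than the Release job below.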
  linux-cpu-release: # Job for building and testing the Release configuration
    runs-on: ["self-hosted", "1ES.Pool=onnxruntime-github-Ubuntu2204-AMD-CPU"]
    permissions:
      actions: read
      contents: read
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Set up Python 3.x
        uses: actions/setup-python@v4
        with:
          python-version: '3.x'
      - name: Build Docker Image
        uses: microsoft/onnxruntime-github-actions/[email protected]
        with:
          Dockerfile: ${{ github.workspace }}/tools/ci_build/github/linux/docker/Dockerfile.manylinux2_28_cpu
          Repository: 'onnxruntimecpubuild'
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Needed for context
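      # The Release job builds inside the manylinux 2.28 based image ('onnxruntimecpubuild');
      # manylinux images are typically used so that the wheel produced by --build_wheel is
      # broadly compatible. The PATH prefix in the step below selects the CPython 3.10
      # interpreter that manylinux images ship under /opt/python/cp310-cp310.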
      - name: Create .onnx directory
        run: mkdir -p $HOME/.onnx
      # Build and Test ONNX Runtime in Docker (Release)
      - name: Build and Test ONNX Runtime in Docker (Release)
        env:
          ALLOW_RELEASED_ONNX_OPSET_ONLY: 0
          NIGHTLY_BUILD: 1
        run: |
          docker run --rm \
            --volume /data/onnx:/data/onnx:ro \
            --volume /data/models:/data/models:ro \
            --volume ${{ github.workspace }}:/onnxruntime_src \
            --volume $HOME/.onnx:/home/onnxruntimedev/.onnx \
            -w /onnxruntime_src \
            -e ALLOW_RELEASED_ONNX_OPSET_ONLY \
            -e NIGHTLY_BUILD \
            onnxruntimecpubuild \
            /bin/bash -c 'set -ex; \
              # Build with Release configuration and additional flags for release builds
              PATH=/opt/python/cp310-cp310/bin:$PATH python3 tools/ci_build/build.py \
                --build_dir build/Release --cmake_generator Ninja \
                --config Release \
                --skip_submodule_sync \
                --build_shared_lib \
                --parallel \
                --use_vcpkg --use_vcpkg_ms_internal_asset_cache \
                --enable_onnx_tests \
                --use_binskim_compliant_compile_flags --build_wheel --build_csharp --enable_transformers_tool_test --cmake_extra_defines onnxruntime_BUILD_BENCHMARKS=ON \
                --update --build;
              # Run tests with Release configuration
              PATH=/opt/python/cp310-cp310/bin:$PATH python3 tools/ci_build/build.py \
                --build_dir build/Release --cmake_generator Ninja \
                --config Release \
                --skip_submodule_sync \
                --build_shared_lib \
                --parallel \
                --use_vcpkg --use_vcpkg_ms_internal_asset_cache \
                --enable_onnx_tests \
                --use_binskim_compliant_compile_flags --build_wheel --build_csharp --enable_transformers_tool_test --cmake_extra_defines onnxruntime_BUILD_BENCHMARKS=ON \
                --test;'