GraphNet is a large-scale dataset of deep learning computation graphs, designed to serve as a standard benchmark and training corpus for AI-driven tensor compiler optimization. It contains diverse graphs extracted from state-of-the-art models, enabling effective evaluation of compiler pass optimizations across frameworks and hardware platforms.
With GraphNet, users can:
- Quickly benchmark the optimization performance of various compiler strategies.
- Easily conduct regression tests on existing compilers.
- Train AI‑for‑Systems models to automatically generate compiler optimization passes.
Vision: We aim to achieve cross-hardware portability of compiler optimizations by allowing models to learn and transfer optimization strategies. This will significantly reduce the manual effort required to develop efficient operator implementations.
Every extracted computation graph contributed to GraphNet must satisfy the following requirements (a minimal sketch follows the list):
- Dynamic graphs must execute correctly.
- Each computation graph should include a standardized method for measuring performance.
- Graphs and their corresponding Python code must support serialization and deserialization.
- The full graph must be decomposable into two disjoint subgraphs.
- Compiler passes or behaviors must be configurable.
- Operator names within each computation graph must be statically parseable.
- If custom operators are used, their implementation code must be fully accessible.
- Graph execution on different hardware backends must be configurable via a unified interface.
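To make these requirements concrete, here is a minimal sketch of what a compliant graph could look like in plain PyTorch: every operator is a standard, statically parseable call, and a small helper provides a standardized performance measurement. The class and function names are illustrative assumptions only, not GraphNet's actual on-disk format.

# Hypothetical sketch only; GraphNet's real format is defined in the Co-Creation Tutorial.
import time
import torch
import torch.nn as nn

class TinyGraph(nn.Module):
    # All operators below (Conv2d, relu, adaptive_avg_pool2d, Linear) are
    # standard calls that can be discovered by statically parsing this file.
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.fc = nn.Linear(8, 10)

    def forward(self, x):
        x = torch.relu(self.conv(x))
        x = torch.nn.functional.adaptive_avg_pool2d(x, 1).flatten(1)
        return self.fc(x)

def measure(model, example_input, iters=50):
    # Standardized measurement: average seconds per forward pass after a warm-up run.
    model.eval()
    with torch.no_grad():
        model(example_input)
        start = time.perf_counter()
        for _ in range(iters):
            model(example_input)
    return (time.perf_counter() - start) / iters

print(f"{measure(TinyGraph(), torch.randn(4, 3, 32, 32)) * 1e3:.3f} ms/iter")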
For full implementation details, please refer to the Co-Creation Tutorial.
graph_net.torch.test_compiler
python3 -m graph_net.torch.test_compiler \
--model-path $GRAPH_NET_EXTRACT_WORKSPACE/model_name/ \
--compiler /path/to/custom/compiler
# Note: if --compiler is omitted, PyTorch’s built-in compiler is used by default
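For context, custom compilers for PyTorch graphs are commonly exposed as torch.compile backends: a callable that receives the captured torch.fx.GraphModule and example inputs and returns a compiled callable. The snippet below illustrates that general mechanism only; the exact interface GraphNet expects for the file passed via --compiler is defined in the Co-Creation Tutorial, not here.

import torch

# Illustrative pass-through backend: prints the captured FX graph, then runs it unchanged.
def my_backend(gm: torch.fx.GraphModule, example_inputs):
    print(gm.graph)       # the operators a real optimization pass would rewrite
    return gm.forward     # a real compiler returns an optimized callable instead

model = torch.nn.Sequential(torch.nn.Linear(16, 16), torch.nn.ReLU())
compiled = torch.compile(model, backend=my_backend)
compiled(torch.randn(2, 16))   # the first call triggers graph capture and "compilation"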
Demo: Extract & Validate ResNet‑18
git clone https://github.com/PaddlePaddle/GraphNet.git
cd GraphNet
# Set your workspace directory
export GRAPH_NET_EXTRACT_WORKSPACE=/home/yourname/graphnet_workspace
# Extract the ResNet‑18 computation graph
python graph_net/test/vision_model_test.py
# Validate the extracted graph (e.g. /home/yourname/graphnet_workspace/resnet18)
python -m graph_net.torch.validate \
--model-path $GRAPH_NET_EXTRACT_WORKSPACE/resnet18
graph_net.torch.extract
import graph_net
# Instantiate the model (e.g. a torchvision model)
model = ...
# Wrap the model to extract its computation graph
model = graph_net.torch.extract(name="model_name")(model)
# After running, the extracted graph will be saved to:
# $GRAPH_NET_EXTRACT_WORKSPACE/model_name
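As a concrete sketch, the wrapper can be applied to a torchvision model such as the resnet18 used in the demo above; it is assumed here, per the "after running" comment, that executing a forward pass on the wrapped model is what triggers capture and saving.

import torch
import torchvision
import graph_net

model = torchvision.models.resnet18()                    # the model to extract
model = graph_net.torch.extract(name="resnet18")(model)
model(torch.randn(1, 3, 224, 224))                       # assumption: a forward pass triggers extraction
# The graph should now be under $GRAPH_NET_EXTRACT_WORKSPACE/resnet18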
graph_net.torch.validate
# Verify that the extracted model meets requirements
python -m graph_net.torch.validate \
--model-path $GRAPH_NET_EXTRACT_WORKSPACE/model_name
graph_net.pack
# Create a ZIP archive of $GRAPH_NET_EXTRACT_WORKSPACE.
# The --clear-after-pack flag (True|False) determines whether to delete the workspace after packing.
python -m graph_net.pack \
--output /path/to/output.zip \
--clear-after-pack True
Note: To configure your user details (username and email) for GraphNet, run:
python -m graph_net.config --global \
--username "your-name" \
--email "your-email"
Once you have packaged your extracted computation graphs, submit the archive to the GraphNet community through the project's group chats; Discord is also available.
This project is released under the MIT License.