This repository contains the implementation of our paper: "Learning Interpretable Differentiable Logic Networks," Chang Yue and Niraj K. Jha, IEEE Transactions on Circuits and Systems for Artificial Intelligence, 2024.
- Install the required dependencies:

  ```bash
  pip install -r requirements.txt
  ```

  We conducted experiments using Python 3.12.
- Prepare datasets following the expected conventions (i.e., the data path and a specific column order: categorical features, then continuous features, then the target). An example notebook can be found in `quickstart/prepare_dataset.ipynb`. (A minimal formatting sketch appears after this list.)
- Train and evaluate DLNs. An example notebook can be found in `quickstart/train_eval_viz.ipynb`. An example command is:

  ```bash
  python experiments/main.py \
      --train_model True \
      --evaluate_model True \
      --dataset Heart \
      --seed 0 \
      --num_epochs 1000 \
      --batch_size 64 \
      --learning_rate 0.2 \
      --tau_out 3 \
      --grad_factor 1.2 \
      --first_hl_size 50 \
      --last_hl_size_wrt_first 0.25 \
      --num_hidden_layers 4 \
      --discretize_strategy tree \
      --continuous_resolution 4 \
      --concat_input True \
      --save_model
  ```

  When wandb is enabled, you can add `--log_training` to log the training process. There are many other options, and we encourage you to explore the arguments in our `experiments/main.py` file.
- Visualize the model using the SymPy code saved in the previous step:

  ```bash
  python experiments/DLN_viz.py results/Heart/seed_0/sympy_code.py quickstart/example/viz
  ```

  A graph named `viz.png` will be generated. If the generation takes too long, check the parts marked with "NOTE" in the `experiments/simplify_model.py` file. (A toy SymPy simplification example appears after this list.)
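The column layout expected by the dataset-preparation step can be produced with ordinary pandas operations. The sketch below is illustrative only: the file paths and column names are hypothetical, and `quickstart/prepare_dataset.ipynb` remains the authoritative reference.

```python
# Hypothetical sketch: arrange a table into the expected column order
# (categorical features, then continuous features, then the target).
import pandas as pd

df = pd.read_csv("data/heart_raw.csv")  # hypothetical raw file

categorical = ["sex", "chest_pain_type"]    # hypothetical categorical columns
continuous = ["age", "resting_bp", "chol"]  # hypothetical continuous columns
target = "disease"                          # hypothetical target column

# Reorder columns: categorical first, continuous next, target last.
df = df[categorical + continuous + [target]]
df.to_csv("data/Heart/data.csv", index=False)  # hypothetical data path
```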
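For intuition about the saved SymPy code and why visualization can be slow, here is a toy Boolean simplification with SymPy. The symbols and expression are made up and unrelated to any generated file.

```python
# Toy example: simplify a redundant Boolean expression with SymPy.
from sympy import symbols
from sympy.logic.boolalg import simplify_logic

x1, x2, x3 = symbols("x1 x2 x3")
expr = (x1 & x2) | (x1 & ~x2) | (x2 & x3)  # redundant form
print(simplify_logic(expr))  # expected: x1 | (x2 & x3)
```

Simplification time grows quickly with expression size, which is why large models can take a while to visualize.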
We use Ray to run experiments in parallel (a generic Ray sketch appears after the steps below). Here are the steps:
- Specify datasets, seed sets, computing resource settings, etc., in `experiments/settings.py`. (A hypothetical sketch of such settings appears after this list.)
- Run hyperparameter optimization (HPO) with Ray Tune:

  ```bash
  TORCH_NUM_THREADS=1 python experiments/run_experiments.py --params-search
  ```

  The HPO results and the selected hyperparameters will be saved in the `model_params` directory. You can use `experiments/params_search_analysis.ipynb` to study how different factors affect performance. In general, the learning rate (and its scaling factor, the output temperature) is the most important factor; DLNs need a large initial learning rate because they contain layers of Softmax functions. (A minimal Ray Tune sketch appears after this list.)
- Train models using the selected hyperparameters:

  ```bash
  TORCH_NUM_THREADS=1 python experiments/run_experiments.py --train
  ```

  The trained models, training logs, and corresponding SymPy code will be saved in the `results` folder.
- Evaluate models:

  ```bash
  python experiments/run_experiments.py --evaluate
  ```

  A summary of the evaluation results (accuracies and model sizes) will be saved in `results/evaluation.csv`. (A short sketch for inspecting this file appears after this list.)
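For context on the parallelization mentioned above, the snippet below shows the generic Ray pattern of launching independent runs as remote tasks. It is not the repository's actual orchestration code; the function and its arguments are placeholders.

```python
# Generic Ray pattern (illustrative only); the repository's actual
# orchestration lives in experiments/run_experiments.py.
import ray

ray.init()  # starts a local Ray instance if none is running

@ray.remote(num_cpus=1)
def run_one(dataset, seed):
    # Placeholder body; a real task would call the training entry point.
    return f"{dataset}-seed{seed}: done"

# Launch all runs in parallel, then block until every task finishes.
futures = [run_one.remote(d, s) for d in ["Heart"] for s in range(3)]
print(ray.get(futures))
```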
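The kind of configuration `experiments/settings.py` holds might look like the sketch below; every name here is an assumption, so check the file itself for the real variables.

```python
# Hypothetical settings sketch; actual names in experiments/settings.py may differ.
DATASETS = ["Heart", "Adult"]             # datasets to experiment on
SEEDS = list(range(5))                    # seed set per dataset
RESOURCES_PER_RUN = {"cpu": 1, "gpu": 0}  # Ray resources requested per run
```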
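The `--params-search` internals are repository-specific, but the general Ray Tune pattern of sampling learning rates on a log scale looks roughly like this sketch; the objective function is a stand-in, not a DLN trial.

```python
# Generic Ray Tune sketch; not the repository's actual search code.
from ray import tune

def trial(config):
    # Stand-in objective; a real trial would train a DLN and report accuracy.
    return {"accuracy": 1.0 - abs(config["learning_rate"] - 0.2)}

tuner = tune.Tuner(
    trial,
    param_space={"learning_rate": tune.loguniform(1e-2, 1.0)},
    tune_config=tune.TuneConfig(metric="accuracy", mode="max", num_samples=20),
)
best = tuner.fit().get_best_result()
print(best.config)  # configuration with the highest stand-in accuracy
```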
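Once `results/evaluation.csv` exists, it can be inspected with pandas. The column names in the commented line are assumptions; check the file for the actual schema.

```python
# Inspect the evaluation summary.
import pandas as pd

df = pd.read_csv("results/evaluation.csv")
print(df.head())
# If the file has, e.g., "dataset" and "accuracy" columns:
# print(df.groupby("dataset")["accuracy"].mean())
```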
If you find this repository useful, please cite our paper:

```bibtex
@ARTICLE{10681646,
  author={Yue, Chang and Jha, Niraj K.},
  journal={IEEE Transactions on Circuits and Systems for Artificial Intelligence},
  title={Learning Interpretable Differentiable Logic Networks},
  year={2024},
  volume={1},
  number={1},
  pages={69-82},
  doi={10.1109/TCASAI.2024.3462303}
}
```