
Feature Distillation Improves Zero-Shot Transfer from Synthetic Images

TMLR paper: https://openreview.net/forum?id=SP8DLl6jgb

This is the code release for the TMLR publication "Feature Distillation Improves Zero-Shot Transfer from Synthetic Images" by Niclas Popp, Jan Hendrik Metzen and Matthias Hein.


The codebase consists of three components:

  1. Domain-agnostic Distillation
  2. Synthetic Data Generation
  3. Domain-specific Distillation

The required packages are listed in requirements.txt. The code was tested on NVIDIA V100 and H100 GPUs.

Step 1: Domain-agnostic Distillation

An example script for running domain-agnostic distillation, together with the available hyperparameters, is provided in example_domain_agnostic.sh. The code is built around the webdataset dataloader and expects the training data as .tar shards. For details on how to set up data for this dataloader, see the webdataset documentation. A minimal loading sketch is shown below.
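As a rough illustration of the expected input pipeline, here is a minimal webdataset loading sketch. The shard pattern and the per-sample keys ("jpg", "txt") are placeholder assumptions, not necessarily the repository's actual configuration:

```python
# Minimal webdataset pipeline sketch: image/caption pairs stored as
# "jpg"/"txt" entries inside .tar shards. The shard pattern and keys
# are hypothetical placeholders.
import torch
import webdataset as wds
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize(224),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])

dataset = (
    wds.WebDataset("shards/train-{000000..000099}.tar")  # hypothetical shard names
    .shuffle(1000)                       # sample-level shuffling within a buffer
    .decode("pil")                       # decode images to PIL
    .to_tuple("jpg", "txt")              # (image, caption) per sample
    .map_tuple(preprocess, lambda t: t)
    .batched(64)
)

# webdataset yields ready-made batches, so the DataLoader uses batch_size=None
loader = torch.utils.data.DataLoader(dataset, batch_size=None, num_workers=4)
```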

Step 2: Synthetic Data Generation

The synthetic data generation process can be started as shown in example_data_generation.sh. Select the dataset option corresponding to your target domain. A hedged illustration of the overall idea is sketched below.
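The generation scripts are the authoritative reference; as a rough illustration of class-prompted image generation, the following sketch uses Stable Diffusion through the diffusers library. The model ID, prompt template, class names, and output layout are illustrative assumptions, not the settings used in the paper:

```python
# Illustrative class-prompted synthetic image generation with diffusers.
# Model ID, prompt template, and class list are hypothetical placeholders.
import os
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

os.makedirs("synthetic", exist_ok=True)
class_names = ["golden retriever", "tabby cat"]  # hypothetical domain classes
for name in class_names:
    for i in range(4):  # images per class, illustrative only
        image = pipe(f"a photo of a {name}").images[0]
        image.save(f"synthetic/{name.replace(' ', '_')}_{i:03d}.jpg")
```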

Step 3: Domain-specific Distillation

The final step of our framework is domain-specific distillation. An example with the available options is given in example_domain_specific.sh. This step requires the final model checkpoint from Step 1 and the synthetic data from Step 2. A sketch of the basic training step follows below.
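For intuition, here is a minimal sketch of a single feature-distillation training step, assuming a frozen teacher image encoder and a student that outputs features of the same dimension; the mean-squared-error feature-matching loss is an illustrative choice, see the paper and the scripts for the actual objective:

```python
# Sketch of one feature-distillation step: the student is trained to
# match the frozen teacher's image features on (synthetic) images.
# Matching feature dimensions and an MSE objective are assumptions.
import torch
import torch.nn.functional as F

def distillation_step(student, teacher, images, optimizer):
    teacher.eval()
    with torch.no_grad():
        target = teacher(images)        # frozen teacher features
    pred = student(images)              # student features (same dim assumed)
    loss = F.mse_loss(pred, target)     # feature-matching loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```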

Citation

```bibtex
@article{popp2024feature,
  title={Feature Distillation Improves Zero-Shot Transfer from Synthetic Images},
  author={Niclas Popp and Jan Hendrik Metzen and Matthias Hein},
  journal={Transactions on Machine Learning Research},
  issn={2835-8856},
  year={2024},
  url={https://openreview.net/forum?id=SP8DLl6jgb}
}
```
