Getting Started
Greg Pauloski edited this page Mar 6, 2023 · 4 revisions
- Create an environment.

  I recommend installing PyTorch and KFAC into a virtual environment of your choice to avoid global package installations. E.g.,

  ```shell
  $ python -m venv venv
  $ . venv/bin/activate
  ```
- Install PyTorch: https://pytorch.org/get-started/locally/
- Clone and install KFAC.

  ```shell
  $ git clone https://github.com/gpauloski/kfac-pytorch.git
  $ cd kfac-pytorch
  $ pip install .
  ```
KFAC is designed to drop into your existing training scripts. You must 1) import the `KFACPreconditioner`, 2) initialize the preconditioner, and 3) call the preconditioner before each optimizer step.
```python
import torch
from torch import optim

from kfac.preconditioner import KFACPreconditioner

model = torch.nn.parallel.DistributedDataParallel(...)
optimizer = optim.SGD(model.parameters(), ...)

# Initialize KFAC
preconditioner = KFACPreconditioner(model, ...)

for data, target in train_loader:
    optimizer.zero_grad()
    output = model(data)
    loss = criterion(output, target)
    loss.backward()
    # Perform preconditioning before each optimizer step
    preconditioner.step()
    optimizer.step()
```
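To make the ordering concrete, here is a framework-free sketch of why `preconditioner.step()` must run between `loss.backward()` and `optimizer.step()`: the preconditioner rewrites the gradients in place, and the optimizer then consumes the rewritten values. The names `ToyPreconditioner` and `sgd_step` are purely illustrative stand-ins, not part of the kfac-pytorch API.

```python
class ToyPreconditioner:
    """Stand-in for KFACPreconditioner: rewrites gradients in place.

    Real KFAC multiplies gradients by a Kronecker-factored approximation
    of the inverse Fisher; here we just scale them to show the hook point.
    """

    def __init__(self, params, scale=0.5):
        self.params = params
        self.scale = scale

    def step(self):
        for p in self.params:
            p["grad"] *= self.scale


def sgd_step(params, lr=0.1):
    # Plain SGD update: consumes whatever gradients are stored now.
    for p in params:
        p["value"] -= lr * p["grad"]


params = [{"value": 1.0, "grad": 2.0}]
precond = ToyPreconditioner(params)

precond.step()   # grad: 2.0 -> 1.0 (preconditioning happens first)
sgd_step(params) # value: 1.0 - 0.1 * 1.0 = 0.9
```

If the two calls were reversed, the optimizer would update with the raw gradient (2.0) and the preconditioning would have no effect on that step.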
The `KFACPreconditioner` implements the adaptable distributed strategy referred to as KAISA.
Copyright © 2021—Present by Greg Pauloski