Getting Started

Greg Pauloski edited this page Mar 6, 2023 · 4 revisions

Installation

  1. Create an environment. I recommend installing PyTorch and KFAC into a virtual environment of your choice to avoid global package installations. E.g.,
    $ python -m venv venv
    $ . venv/bin/activate
  2. Install PyTorch: https://pytorch.org/get-started/locally/
  3. Clone and install KFAC.
    $ git clone https://github.com/gpauloski/kfac-pytorch.git
    $ cd kfac-pytorch
    $ pip install .

Update your Training Script

KFAC is designed to be a drop-in addition to your existing training scripts. You must 1) import the KFACPreconditioner, 2) initialize the preconditioner after the model and optimizer have been created, and 3) call the preconditioner before each optimization step.

import torch
from torch import optim

from kfac.preconditioner import KFACPreconditioner

model = torch.nn.parallel.DistributedDataParallel(...)
optimizer = optim.SGD(model.parameters(), ...)

# Initialize KFAC
preconditioner = KFACPreconditioner(model, ...)

for data, target in train_loader:
    optimizer.zero_grad()
    output = model(data)

    loss = criterion(output, target)
    loss.backward()

    # Perform preconditioning before each optimizer step
    preconditioner.step()
    optimizer.step()

The KFACPreconditioner implements the adaptable distributed strategy referred to as KAISA.
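The ordering above is the important part: `preconditioner.step()` rewrites the gradients in place after `loss.backward()` has populated them, and `optimizer.step()` then consumes the modified gradients. The pattern can be sketched in plain Python with toy stand-ins (the `ToyPreconditioner` and `ToySGD` classes below are illustrative only, not the kfac or PyTorch APIs):

```python
class Param:
    """Stand-in for a trainable parameter with a value and a gradient."""
    def __init__(self, value):
        self.value = value
        self.grad = 0.0

class ToyPreconditioner:
    """Rescales gradients in place, mimicking how a preconditioner
    transforms each gradient before the optimizer uses it."""
    def __init__(self, params, scale=0.5):
        self.params = params
        self.scale = scale

    def step(self):
        for p in self.params:
            p.grad *= self.scale

class ToySGD:
    """Minimal SGD: value <- value - lr * grad."""
    def __init__(self, params, lr=0.1):
        self.params = params
        self.lr = lr

    def zero_grad(self):
        for p in self.params:
            p.grad = 0.0

    def step(self):
        for p in self.params:
            p.value -= self.lr * p.grad

p = Param(1.0)
optimizer = ToySGD([p], lr=0.1)
preconditioner = ToyPreconditioner([p], scale=0.5)

optimizer.zero_grad()
p.grad = 2.0            # pretend loss.backward() produced this gradient
preconditioner.step()   # gradient rescaled in place: 2.0 -> 1.0
optimizer.step()        # update uses the preconditioned gradient
print(p.value)          # 1.0 - 0.1 * 1.0 = 0.9
```

If the two `step()` calls were swapped, the optimizer would update with the raw gradient and the preconditioning would be wasted, which is why the preconditioner must run between `loss.backward()` and `optimizer.step()`.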
