
Commit 27c5a9c

some edits
1 parent f6aa752 commit 27c5a9c

File tree

1 file changed: +1 -1 lines changed


train.py (+1 -1)
@@ -82,7 +82,7 @@ def main():
     # Epochs
     for epoch in range(start_epoch, epochs):
         # Paper describes decaying the learning rate at the 80000th, 100000th, 120000th 'iteration', i.e. model update or batch
-        # The paper uses a batch size of 32 (regardless of what we use), which means there are about 517 batches in an epoch
+        # The paper uses a batch size of 32 (regardless of what we use), which means there are about 517 iterations in an epoch
         # Therefore, you could do,
         # if epoch in {80000 // 517, 100000 // 517, 120000 // 517}:
         #     adjust_learning_rate(optimizer, 0.1)
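For reference, a minimal sketch of the epoch-based decay the comment suggests, assuming the paper's batch size of 32 and a training set of roughly 16,551 images (so about 517 model updates per epoch), with an adjust_learning_rate(optimizer, scale) helper that multiplies every parameter group's learning rate by scale. The model, optimizer, and epoch count below are illustrative placeholders, not code taken from this commit.

import torch

# Illustrative stand-ins for objects that train.py would set up earlier in main()
model = torch.nn.Linear(10, 10)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
start_epoch, epochs = 0, 233

# Assuming ~16,551 training images at the paper's batch size of 32 -> ~517 updates per epoch
iterations_per_epoch = 16551 // 32  # = 517

# Convert the paper's iteration milestones into epoch indices: {154, 193, 232}
decay_epochs = {it // iterations_per_epoch for it in (80000, 100000, 120000)}

def adjust_learning_rate(optimizer, scale):
    """Multiply the learning rate of every parameter group by `scale`."""
    for param_group in optimizer.param_groups:
        param_group['lr'] = param_group['lr'] * scale

for epoch in range(start_epoch, epochs):
    if epoch in decay_epochs:
        adjust_learning_rate(optimizer, 0.1)  # decay the learning rate by a factor of 10
    # ... train for one epoch here ...

Because of the integer division, the decay fires at the start of epochs 154, 193, and 232, which only approximates the paper's iteration-level schedule.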
