Commit a0716d5 ("some edits"), 1 parent 27c5a9c

1 file changed: +2 −2 lines changed

Diff for: train.py (+2 −2)

@@ -82,8 +82,8 @@ def main():
     # Epochs
     for epoch in range(start_epoch, epochs):
         # Paper describes decaying the learning rate at the 80000th, 100000th, 120000th 'iteration', i.e. model update or batch
-        # The paper uses a batch size of 32 (regardless of what we use), which means there are about 517 iterations in an epoch
-        # Therefore, you could do,
+        # The paper uses a batch size of 32, which means there were about 517 iterations in an epoch
+        # Therefore, to find the epochs to decay at, you could do,
         # if epoch in {80000 // 517, 100000 // 517, 120000 // 517}:
         #     adjust_learning_rate(optimizer, 0.1)
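The arithmetic behind those comments can be sketched as follows. This is a minimal illustration, not code from the repository; the dataset size of 16,551 images is an assumption chosen only to reproduce the roughly 517 iterations per epoch the comments mention (16551 // 32 = 517).

```python
# Convert the paper's iteration-based LR decay schedule into epoch indices.
# Assumed (hypothetical) numbers: ~16,551 training images, paper batch size 32.
paper_batch_size = 32
dataset_size = 16551  # assumption; gives 16551 // 32 = 517 iterations per epoch
iters_per_epoch = dataset_size // paper_batch_size

# The paper decays the learning rate at these model-update counts
decay_iterations = [80000, 100000, 120000]

# Integer division maps each iteration milestone to the epoch it falls in
decay_epochs = {it // iters_per_epoch for it in decay_iterations}
print(sorted(decay_epochs))
```

With these assumed numbers, the decay would land at epochs 154, 193, and 232, which is the kind of `if epoch in {...}` check the diff's comment describes.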
