
Commit 0c9d90f

Fix: number of batches calculation is incorrect
1 parent aac33a6 commit 0c9d90f

1 file changed: +2 −1

src/unitxt/inference.py

Lines changed: 2 additions & 1 deletion
@@ -6,6 +6,7 @@
 import io
 import json
 import logging
+import math
 import os
 import re
 import sys
@@ -239,7 +240,7 @@ def infer(
             result = self._mock_infer(dataset)
         else:
             if self.use_cache:
-                number_of_batches = len(dataset) // self.cache_batch_size + 1
+                number_of_batches = math.ceil(len(dataset) / self.cache_batch_size)
                 result = []
                 for batch_index, batch in enumerate(
                     batched(dataset, self.cache_batch_size)
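For context on the fix: the old formula `len(dataset) // self.cache_batch_size + 1` over-counts by one whenever the dataset size is an exact multiple of the batch size, while `math.ceil` matches the number of batches that `batched()` actually yields. The following is a minimal standalone sketch (with made-up sizes and a simplified stand-in for the `batched()` helper, not unitxt code) illustrating the difference:

```python
import math
from itertools import islice


def batched(iterable, n):
    # Simplified stand-in for the batched() helper used in the diff:
    # yields successive chunks of at most n items.
    it = iter(iterable)
    while chunk := list(islice(it, n)):
        yield chunk


dataset = list(range(100))  # 100 items
cache_batch_size = 50       # dataset size is an exact multiple of the batch size

old = len(dataset) // cache_batch_size + 1        # 3 -> one phantom empty batch
new = math.ceil(len(dataset) / cache_batch_size)  # 2 -> matches batched()

actual = sum(1 for _ in batched(dataset, cache_batch_size))
print(old, new, actual)  # 3 2 2
```

When the dataset size is not an exact multiple (e.g. 101 items with batch size 50), both formulas agree on 3 batches; the discrepancy only appears on exact multiples, which is what the one-line change addresses.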
