Running the `example/auto_compression/nlp/run_uie.py` script from PaddleSlim 2.6 fails with the following errors:
…oved in a future version, please use max_length instead.
/appslog/miniconda3/envs/pEnv39/lib/python3.9/site-packages/paddlenlp/transformers/tokenizer_utils_base.py:1912: FutureWarning: The pad_to_max_length argument is deprecated and will be removed in a future version, use padding=True or padding='longest' to pad to the longest sequence in the batch, or use padding='max_length' to pad to a max length. In this case, you can give a specific length with max_length (e.g. max_length=45) or leave max_length to None to pad to the maximal input size of the model (e.g. 512 for Bert).
Exception in thread Thread-1:
Traceback (most recent call last):
File "/appslog/miniconda3/envs/pEnv39/lib/python3.9/threading.py", line 980, in _bootstrap_inner
self.run()
File "/appslog/miniconda3/envs/pEnv39/lib/python3.9/threading.py", line 917, in run
self._target(*self._args, **self._kwargs)
File "/appslog/miniconda3/envs/pEnv39/lib/python3.9/site-packages/paddle/io/dataloader/dataloader_iter.py", line 235, in _thread_loop
batch = self._dataset_fetcher.fetch(
File "/appslog/miniconda3/envs/pEnv39/lib/python3.9/site-packages/paddle/io/dataloader/fetcher.py", line 77, in fetch
data.append(self.dataset[idx])
File "/appslog/miniconda3/envs/pEnv39/lib/python3.9/site-packages/paddlenlp/datasets/dataset.py", line 263, in __getitem__
return self._transform(self.new_data[idx]) if self._transform_pipline else self.new_data[idx]
File "/appslog/miniconda3/envs/pEnv39/lib/python3.9/site-packages/paddlenlp/datasets/dataset.py", line 255, in _transform
data = fn(data)
File "/appslog/homezwj/pycharmProject/clsLab/UIE/scriptTrain/compresslim/run_uie_org.py", line 105, in convert_example
"token_type_ids": encoded_inputs["token_type_ids"],
KeyError: 'token_type_ids'
Environment:
paddlenlp 2.8.1
paddlepaddle-gpu 2.6.1
paddleslim 2.6
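The `KeyError` is raised at line 105 of `run_uie_org.py`, where `convert_example` indexes `encoded_inputs["token_type_ids"]` directly; with paddlenlp 2.8.1 the tokenizer's output dict may simply not contain that key (some tokenizers only emit it when explicitly requested, e.g. via `return_token_type_ids=True`). A minimal defensive sketch of the idea, assuming the script can fall back to all-zero segment ids when the key is absent (the `build_inputs` helper name is hypothetical, not from the script):

```python
# Hedged workaround sketch for the KeyError above.
# Assumption: encoded_inputs is the dict returned by the tokenizer, and an
# all-zero token_type_ids sequence is an acceptable default for this model.

def build_inputs(encoded_inputs):
    input_ids = encoded_inputs["input_ids"]
    # Some tokenizers omit "token_type_ids"; default to zeros of equal length
    # instead of indexing the key directly and crashing.
    token_type_ids = encoded_inputs.get("token_type_ids", [0] * len(input_ids))
    return {"input_ids": input_ids, "token_type_ids": token_type_ids}
```

Alternatively, passing `return_token_type_ids=True` to the tokenizer call in `convert_example` may make the key appear again; which fix is appropriate depends on whether the compressed model actually consumes segment ids.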