ChronosTokenizer Wrapper for Hugging Face T5 Integration? #129
Hi! I am trying to integrate the Chronos tokenizer with the Hugging Face `AutoTokenizer` API. Loading a standard T5 tokenizer works fine:

```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-small")
```

I am basically trying to maintain all the same tokenizer functionality in the chronos-forecasting codebase, while also creating a new Chronos tokenizer that inherits from the Hugging Face tokenizer class. When I try to do the same thing with

```python
tokenizer = AutoTokenizer.from_pretrained("amazon/chronos-t5-small")
```

I get this error:
I hope this request makes sense. Please let me know if I can clarify anything. Thanks!
Replies: 1 comment
`ChronosTokenizer` has no relation to any language-model tokenizer such as `T5Tokenizer`. You could potentially create one that follows the same API as HF tokenizers, but we haven't tried it.
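To illustrate why the two are unrelated: a text tokenizer maps strings to vocabulary IDs, while Chronos tokenizes real-valued time series by scaling and quantizing them into bins. Below is a rough, self-contained sketch of that mean-scale-plus-uniform-binning idea; it is NOT the actual `ChronosTokenizer` implementation, and the bin count, bin range, and function name here are illustrative assumptions.

```python
def tokenize_series(values, n_bins=10, low=-5.0, high=5.0):
    """Map a list of floats to integer 'token' ids.

    Rough sketch of the Chronos-style approach: scale the context by
    its mean absolute value, then bucket each scaled value into one of
    `n_bins` uniform bins. Hypothetical parameters, not the real ones.
    """
    # mean-absolute scaling; guard against an all-zero context
    scale = sum(abs(v) for v in values) / len(values) or 1.0
    ids = []
    for v in values:
        v = v / scale
        # clamp to the supported range, then bucket uniformly
        v = min(max(v, low), high)
        bucket = int((v - low) / (high - low) * (n_bins - 1) + 0.5)
        ids.append(bucket)
    return ids, scale
```

A wrapper exposing this through the HF tokenizer interface (`__call__`, `decode`, etc.) would have to reinterpret those methods for numeric inputs rather than strings, which is why it is possible in principle but untested.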