-
I wrote this grammar: https://gist.github.com/dstoc/ab58a1829b3f504c64f08bee5e8c6ea6, but there are some problems:
Is there a better way to structure this? Is there a way to incorporate token ids into the grammar? e.g. something like:
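(A hypothetical sketch, not the actual gist: it assumes llguidance's Lark-style syntax, where a grammar can reference special tokens by name, the same <|...|> form that triggers the error further down in this thread; the <[id]> token-id spelling is an assumption to verify against the llguidance syntax docs.)

```
// Hypothetical sketch; rule and terminal names are illustrative.
start: analysis_turn? final_turn

// Special tokens referenced by name, e.g. <|channel|> / <|message|> / <|end|>.
analysis_turn: <|channel|> "analysis" <|message|> TEXT <|end|>
final_turn: <|channel|> "final" <|message|> TEXT

// Assumption: a raw token id might be written as <[12345]>; check the
// llguidance docs before relying on this form.
TEXT: /[^<]*/
```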
-
It seems like this is possible in llguidance! I will test the integration in llama.cpp.
-
It seems something is not wired up correctly with gpt-oss and special tokens?

llg error: at 8(11): unknown special token: "<|channel|>"; following special tokens are available: ⟦<|return|>‧<|reserved_200017|>‧<|reserved_200016|>‧<|reserved_200015|>‧<|reserved_200014|>‧<|reserved_200013|>‧<|reserved_200011|>‧<|reserved_200010|>‧<|reserved_200009|>‧<|reserved_200004|>‧<|reserved_200001|>‧<|reserved_200000|>‧≺EOS≻‧<|endofprompt|>‧<|call|>⟧
-
It looks like only tokens with size == 0 are treated as special? See llama.cpp/common/llguidance.cpp, lines 172 to 175 in 61bdfd5.
-
Sent a fix: #15837