
Restrict generation to semantically valid tokens? #1257

Answered by hudson-ai
ahelwer asked this question in Q&A

Hi @ahelwer!

While llguidance works on context-free grammars only, you can "break out" of context-free generation by manually keeping track of state in ordinary Python control flow. See a very simple example (from the readme) here:

from guidance import gen, select  # llama2 is a guidance model object, e.g. models.LlamaCpp(...)

lm = llama2 + f"Do you want a joke or a poem? A {select(['joke', 'poem'], name='answer')}.\n"

# make a choice based on the model's previous selection
if lm["answer"] == "joke":
    lm += "Here is a one-line joke about cats: " + gen('output', stop='\n')
else:
    lm += "Here is a one-line poem about dogs: " + gen('output', stop='\n')

Making it more natural to express (and efficient to run) context-sensitive languages is an open research topic, but there is a lot of activity on this, especially in the …
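To make the state-tracking idea concrete, here is a minimal self-contained sketch. The `constrained_gen` helper is a hypothetical stand-in for a constrained decoding call (such as guidance's `gen(regex=...)` backed by a real model); it is stubbed with canned outputs so the sketch runs on its own. Each individual call is restricted by a regular (hence context-free) pattern, but Python variables thread information between calls, so the language of the whole transcript ("a count n followed by exactly n items") is context-sensitive:

```python
import re

# Hypothetical stand-in for a constrained LLM call: returns a string
# matching the given regex. A real backend (e.g. guidance's gen with a
# regex constraint) would sample from the model under the same pattern.
def constrained_gen(pattern: str) -> str:
    # Deterministic stub so the sketch runs without a model.
    canned = {r"[1-3]": "3", r"[a-z]+": "cat"}
    out = canned[pattern]
    assert re.fullmatch(pattern, out)  # the constraint each step enforces
    return out

# Step 1: generate a count, constrained to a single digit 1-3.
n = int(constrained_gen(r"[1-3]"))

# Step 2: ordinary Python state (n) decides how many constrained calls
# follow -- this dependency is what a single CFG cannot express.
items = [constrained_gen(r"[a-z]+") for _ in range(n)]
print(n, items)
```

The per-call constraints stay context-free and cheap to enforce token-by-token; the context-sensitivity lives entirely in the host-language control flow, which is the same pattern as the joke/poem example above.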

Answer selected by ahelwer