lexer error: too many states: 10000 >= 10000; stopping #1169
Comments
Thanks for reporting this @slice-harshit! I assume it's OK to slow down inference slightly if you really need this to work? @hudson-ai and @mmoskal, I wonder if we should expose a new param for "fuel"-like settings at the Python level that users have to opt into.
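For concreteness, a purely hypothetical sketch of what such an opt-in knob could look like from Python; the `parser_limits` name and its keys are made up for illustration and do not exist in guidance today:

```python
# Hypothetical sketch of the proposal above -- none of these parameter
# names exist in guidance today; shown only to make the idea concrete.
from guidance import models

lm = models.Transformers(
    "gpt2",  # any Hugging Face model id
    # hypothetical opt-in knob, off by default:
    # parser_limits={"max_lexer_states": 50_000, "lexer_fuel": 1_000_000},
)
```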
The lexer state limit was raised to 50k in llguidance v0.6.0 and 250k in v0.6.28. @slice-harshit, which version of guidance are you using? @Harsha-Nori, I'm sure we can expose some knobs, but I would rather have the defaults work!
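A quick, standard-library way to check which versions are actually installed (nothing here is specific to this issue beyond the package names):

```python
# Print the installed versions of guidance and llguidance.
from importlib.metadata import version

for pkg in ("guidance", "llguidance"):
    print(pkg, version(pkg))
```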
I was using guidance==0.2.0 and llguidance==0.5.1. I have now updated to guidance==0.2.1 and llguidance==0.6.31.
@Harsha-Nori, I updated guidance; now, for the same batch_size, I no longer get this error. @Harsha-Nori @mmoskal @VincentToups @hudson-ai Sorry for going off-topic, but any suggestions would be helpful.
Code:
When giving a batch_size of 10, I get the error:
The error "lexer error: too many states: 10000 >= 10000; stopping" occurs because the guidance library's parser has a maximum limit of 10,000 states, and your structured prompt with multiple select and gen calls for each SMS in the batch exceeds this limit.
It's important for me to run over large batches of SMS messages. If someone knows a way to tweak the limit so that I can run on large batches, that would be helpful.
Thank you!