I'm currently having to manually set `stop_token_ids` in vLLM in order to get this model to terminate properly. I believe this change will solve that issue.

This change adjusts the `eos_token_id` to match an update made to the original model. For reference, here's the commit that updated the original model:

https://huggingface.co/openchat/openchat-3.5-0106/commit/aa20a27ebc320797c96dd18ff596fe74df4b6e1d
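To sketch the shape of the change: the upstream commit updates the model's generation config so the correct end-of-turn token is treated as EOS. The fragment below is illustrative only; the actual token id and file are whatever the linked commit shows, and `32000` here is an assumed value, not verified against this repo.

```
// generation_config.json (illustrative fragment; see the linked commit for the real values)
{
  "eos_token_id": 32000
}
```

With the EOS id set correctly in the model config, vLLM picks it up automatically and the manual `stop_token_ids` workaround should no longer be needed.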
