What is the context size of this model? Also, it does not appear to handle JSON or function calling well.

#15
by BigDeeper - opened

Has it been trained for function calling?

Context length appears to be 32K tokens, according to config.json.
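
If you want to double-check without opening the file, here's a quick sketch using transformers. It assumes the context window is stored in the usual `max_position_embeddings` field (some configs use a different name), and `"model-id"` is a placeholder for the actual repo:

```python
from transformers import AutoConfig

# "model-id" is a placeholder; substitute the actual repo id.
config = AutoConfig.from_pretrained("model-id")

# Most decoder-only configs expose the context window as max_position_embeddings.
print(config.max_position_embeddings)  # expect 32768 for a 32K context
```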

The tokenizer config doesn't include any special function-calling tokens, though, so I'm fairly sure it wasn't fine-tuned for that.
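
One way to verify is to inspect the tokenizer's special and added tokens. A rough sketch (again with a placeholder repo id; the "tool"/"call" substrings are just my guess at what dedicated function-calling tokens might look like):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("model-id")  # placeholder repo id

# Standard special tokens (bos/eos/pad/etc.).
print(tokenizer.special_tokens_map)

# Any extra tokens added on top of the base vocabulary; tool/function-call
# markers would typically show up here if the model were trained for them.
added = tokenizer.get_added_vocab()
print([tok for tok in added if "tool" in tok.lower() or "call" in tok.lower()])
```

If that last list comes back empty, it supports the conclusion that no dedicated function-calling tokens were trained in.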

That said, there might be an issue with the tokenizer config; there's an ongoing discussion about it, so I'm not certain.
