"longer input lengths"?

#1
by tensiondriven - opened

> This is a fork of Pygmalion allowing longer input lengths for text-generation tasks using the Inference API.

Can you elaborate on what is meant by "longer input lengths"? Does this model support a context longer than 2048 tokens? If so, that would be quite unusual, and quite interesting.
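For what it's worth, one way to check is to inspect the model's `config.json` on the Hub: GPT-J-style models (Pygmalion-6B is a GPT-J fine-tune) store the context window under `n_positions`, while many newer architectures use `max_position_embeddings`. A rough sketch of that lookup (the helper function and example dict below are hypothetical, just for illustration):

```python
def max_context(config: dict) -> int:
    """Return the context window advertised in a Hub config.json dict.

    GPT-J-style configs use "n_positions"; many other architectures
    use "max_position_embeddings". Raises KeyError if neither is present.
    """
    for key in ("n_positions", "max_position_embeddings"):
        if key in config:
            return config[key]
    raise KeyError("no context-length field found in config")


# Example values mirroring a GPT-J-style config.json (hypothetical dict):
print(max_context({"n_positions": 2048}))  # 2048
```

If the fork's config still reports 2048, "longer input lengths" presumably refers to an Inference API request limit rather than an actual change to the model's context window.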
