Separate model serving from the chat interface to fix memory overflow issues

#1
by Tonic - opened

Issue

The accumulation of messages in memory prevents the model from chatting correctly.

Fix

  • Separate model serving from the chat interface, and use a secure Gradio client to make requests from a "thin client".
  • Enforce basic character limits on the second prompt and subsequent inputs to bound memory use.
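The two suggestions above can be sketched as follows. The history-truncation helper is self-contained; the `gradio_client` call shown in the comments is a hypothetical usage (the Space name and `api_name` are placeholders, and the character limit is illustrative):

```python
MAX_CHARS = 2000  # illustrative limit for accumulated history


def truncate_history(history: list[tuple[str, str]], limit: int = MAX_CHARS) -> list[tuple[str, str]]:
    """Keep only the most recent (user, bot) turns whose total length fits the limit."""
    kept: list[tuple[str, str]] = []
    total = 0
    for user_msg, bot_msg in reversed(history):
        total += len(user_msg) + len(bot_msg)
        if total > limit:
            break
        kept.append((user_msg, bot_msg))
    return list(reversed(kept))


# In the thin client, the model (served in a separate Space or process)
# would be queried remotely, e.g. with gradio_client -- names here are
# placeholders, not the actual deployment:
#
#   from gradio_client import Client
#   client = Client("org/model-space")             # hypothetical Space id
#   reply = client.predict(prompt, api_name="/chat")  # hypothetical endpoint
```

Keeping the chat UI as a thin client means only the bounded, truncated history lives in the interface process, while the model's memory footprint is isolated in its own server.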
Multimodal Art Projection org

Thank you for your attention and suggestions. We acknowledge that handling multiple rounds of dialogue and open-ended questions are limitations of our current work. We will strive to improve these issues in future iterations.
