chansung committed on
Commit
e23f987
1 Parent(s): ce67b54

Update app/main.py

Files changed (1)
  1. app/main.py +7 -3
app/main.py CHANGED
@@ -10,9 +10,13 @@ def main(args):
         description="This space is a template that you can fork/duplicate for your own usage. "
         "This space lets you build an LLM-powered idea on top of [Gradio](https://www.gradio.app/) "
         "and an open LLM served locally by [TGI (Text Generation Inference)](https://huggingface.co/docs/text-generation-inference/en/index). "
-        "To use this space, [duplicate]() this space, set which model you want to use (e.g. mistralai/Mistral-7B-Instruct-v0.2), then "
-        "you are all good to go. Just focus on the implementation of your idea 💡. For your convenience, this space also provides "
-        "some handy [utility functions](https://huggingface.co/spaces/chansung/gradio_together_tgi/blob/main/app/gen/openllm.py) to asynchronously generate text by interacting with the locally served LLM.",
+        "Below is a placeholder Gradio ChatInterface for you to try out Mistral-7B, backed by the power of TGI's efficiency.\n\n"
+        "To use this space for your own use case, follow the simple steps below:\n"
+        "1. [Duplicate](https://huggingface.co/spaces/chansung/gradio_together_tgi/blob/main/app/main.py?duplicate=true) this space.\n"
+        "2. Set which LLM you wish to use (e.g. mistralai/Mistral-7B-Instruct-v0.2).\n"
+        "3. Write your Gradio application inside [app/main.py](https://huggingface.co/spaces/chansung/gradio_together_tgi/blob/main/app/main.py).\n"
+        "4. (Bonus ➕) [app/gen](https://huggingface.co/spaces/chansung/gradio_together_tgi/tree/main/app/gen) provides handy utility functions "
+        "to asynchronously generate text by interacting with the locally served LLM, letting you focus on implementing your idea as much as possible!",
         multimodal=False
     )
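For context, here is a minimal sketch of the kind of app the new description points at: a Gradio ChatInterface that streams tokens asynchronously from a locally served TGI endpoint. This is not the actual app/main.py or the app/gen utilities; the endpoint URL, generation parameters, and function names below are assumptions for illustration.

```python
# Minimal sketch, assuming TGI listens on a local port inside the space.
# Not the real app/main.py; URL and parameters are placeholders.
import gradio as gr
from huggingface_hub import AsyncInferenceClient

# Assumed local TGI address; the space's actual port may differ.
client = AsyncInferenceClient("http://0.0.0.0:8080")

async def chat(message, history):
    # Asynchronously stream tokens from the locally served LLM,
    # yielding the growing partial response to update the chat UI.
    partial = ""
    stream = await client.text_generation(
        message, max_new_tokens=256, stream=True
    )
    async for token in stream:
        partial += token
        yield partial

demo = gr.ChatInterface(
    fn=chat,
    description="Placeholder chat UI backed by a locally served LLM via TGI.",
    multimodal=False,
)

if __name__ == "__main__":
    demo.launch()
```

Gradio accepts async generator callbacks, so each yielded partial string updates the chat bubble in place while TGI streams tokens, which is the same streaming pattern the app/gen utility functions are described as wrapping.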