BINATAK committed
Commit d74f82d
1 Parent(s): d00fbb6

Update README.md


Can you translate this? To tagalog


Files changed (1)
  1. README.md +5 -5
README.md CHANGED
@@ -5,12 +5,12 @@ base_model: Xwin-LM/Xwin-LM-70b-V0.1
 inference: false
 model_creator: Xwin-LM
 model_type: llama
-prompt_template: 'A chat between a curious user and an artificial intelligence assistant.
-  The assistant gives helpful, detailed, and polite answers to the user''s questions.
+prompt_template: >
+  A chat between a curious user and an artificial intelligence assistant. The
+  assistant gives helpful, detailed, and polite answers to the user's questions.
   USER: {prompt} ASSISTANT:
-
-  '
 quantized_by: TheBloke
+pipeline_tag: translation
 ---
 
 <!-- header start -->
@@ -386,4 +386,4 @@ Please consider citing our work if you use the data or code in this repo.
 
 ## Acknowledgements
 
-Thanks to [Llama 2](https://ai.meta.com/llama/), [FastChat](https://github.com/lm-sys/FastChat), [AlpacaFarm](https://github.com/tatsu-lab/alpaca_farm), and [vllm](https://github.com/vllm-project/vllm).
+Thanks to [Llama 2](https://ai.meta.com/llama/), [FastChat](https://github.com/lm-sys/FastChat), [AlpacaFarm](https://github.com/tatsu-lab/alpaca_farm), and [vllm](https://github.com/vllm-project/vllm).
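
The diff above swaps the single-quoted YAML scalar (which needed a doubled `''` to escape the apostrophe in "user's") for a folded block scalar (`>`). A minimal sketch of what that change means for the resulting prompt string — the line values are copied from the diff, but the manual folding below is only an illustration of YAML's folding rule, not a YAML parser:

```python
# How YAML's folded block scalar (>) treats the wrapped lines in the
# new prompt_template frontmatter: line breaks become single spaces
# (YAML also clips to one trailing newline, omitted here for clarity).
folded_lines = [
    "A chat between a curious user and an artificial intelligence assistant. The",
    "assistant gives helpful, detailed, and polite answers to the user's questions.",
    "USER: {prompt} ASSISTANT:",
]
template = " ".join(folded_lines)

# The {prompt} placeholder is then filled per request, e.g.:
prompt = template.format(prompt="Can you translate this to Tagalog?")
print(prompt)
```

Because the folded scalar is not quote-delimited, the apostrophe needs no escaping, which is why the `''` from the old template disappears in the new version.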