xwinxu committed on
Commit
824b15a
1 Parent(s): e5603c1

Upload README.md with huggingface_hub

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -40,7 +40,7 @@ Chocolate cake.
 ```
 Note that a beginning-of-sequence (BOS) token is automatically added by all Archangel models during tokenization and does not have to be added by you. No end-of-sequence (EOS) token is added to the prompt.
 
-For models trained with our conditional variant, the tokenizers have additional tokens `<|good|>` and `<|bad|>` included in the embeddings.
+For models trained with our conditional SFT model, the tokenizers have additional tokens `<|good|>` and `<|bad|>` included in the embeddings.
 To generate with these control tokens in the context, append either one to the prompt.
 
 Please refer to our [code repository](https://github.com/ContextualAI/HALOs) or [blog](https://contextual.ai/better-cheaper-faster-llm-alignment-with-kto/), which contain instructions for training your own HALOs and links to our model cards.
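The usage described in the hunk above can be sketched in a few lines; `add_control_token` and the example prompt are illustrative names, not part of the Archangel codebase, and the exact prompt template should be taken from the model card:

```python
# Minimal sketch (assumption: the conditional models expect the literal
# control token <|good|> or <|bad|> appended to the end of the prompt
# string before tokenization, since the tokenizer already includes these
# tokens in its vocabulary).
def add_control_token(prompt: str, label: str = "good") -> str:
    """Append the <|good|> or <|bad|> control token to a prompt."""
    if label not in ("good", "bad"):
        raise ValueError("label must be 'good' or 'bad'")
    return f"{prompt}<|{label}|>"

# The conditioned prompt is then tokenized and passed to the model as
# usual; the BOS token is still added automatically during tokenization.
print(add_control_token("Give me a dessert idea.\n", "good"))
```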