chaoscodes committed on
Commit
5c9e70d
1 Parent(s): 56c5a3c

Update README.md

Files changed (1): README.md (+2 -2)
@@ -20,7 +20,7 @@ The TinyLlama project aims to **pretrain** a **1.1B Llama model on 3 trillion to
 We adopted exactly the same architecture and tokenizer as Llama 2. This means TinyLlama can be plugged and played in many open-source projects built upon Llama. Besides, TinyLlama is compact with only 1.1B parameters. This compactness allows it to cater to a multitude of applications demanding a restricted computation and memory footprint.
 
 #### This Model
-This is the chat model finetuned on top of [TinyLlama/TinyLlama-1.1B-intermediate-step-715k-1.5T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-715k-1.5T).
+This is the chat model finetuned on top of [TinyLlama/TinyLlama-1.1B-intermediate-step-955k-2T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-955k-token-2T).
 The dataset used is [OpenAssistant/oasst_top1_2023-08-25](https://huggingface.co/datasets/OpenAssistant/oasst_top1_2023-08-25) following the [chatml](https://github.com/openai/openai-python/blob/main/chatml.md) format.
 #### How to use
 You will need the transformers>=4.31
@@ -29,7 +29,7 @@ Do check the [TinyLlama](https://github.com/jzhang38/TinyLlama) github page for
 from transformers import AutoTokenizer
 import transformers
 import torch
-model = "PY007/TinyLlama-1.1B-Chat-v0.4"
+model = "PY007/TinyLlama-1.1B-Chat-v0.5"
 tokenizer = AutoTokenizer.from_pretrained(model)
 pipeline = transformers.pipeline(
     "text-generation",
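
The diff notes that the fine-tuning data follows the chatml format. As a minimal sketch (not part of this commit), here is one way a conversation could be rendered into a chatml-style prompt string before handing it to the pipeline above; the `<|im_start|>` / `<|im_end|>` markers are taken from the linked chatml spec, and the `to_chatml` helper name is hypothetical:

```python
def to_chatml(messages):
    """Render a list of {"role", "content"} dicts as a chatml prompt string.

    Each turn becomes "<|im_start|>{role}\n{content}<|im_end|>", and an
    open "<|im_start|>assistant" turn is appended so the model generates
    the assistant's reply as a continuation.
    """
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    parts.append("<|im_start|>assistant")
    return "\n".join(parts)


prompt = to_chatml([
    {"role": "user", "content": "What is TinyLlama?"},
])
print(prompt)
```

A string built this way could then be passed as the first argument to the `transformers.pipeline` call shown in the diff, though the exact prompt template the model expects should be confirmed against the full model card.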