starmpcc committed on
Commit 1dbd5c0 (parent: 464ffda)

Update README.md

Files changed (1): README.md (+5 -5)
README.md CHANGED
@@ -12,8 +12,8 @@ tags:
 
 <!-- Provide a quick summary of what the model is/does. -->
 
- This is an official model checkpoint for Asclepius-Mixtral-8B-v0.3 [(arxiv)](https://arxiv.org/abs/2309.00237).
- This model is an enhanced version of Asclepius-7B, by replacing the base model with Mixtral-8B-v0.3 and increasing the max sequence length to 8192.
+ This is an official model checkpoint for Asclepius-Mistral-7B-v0.3 [(arxiv)](https://arxiv.org/abs/2309.00237).
+ This model is an enhanced version of Asclepius-7B, by replacing the base model with Mistral-7B-v0.3 and increasing the max sequence length to 8192.
 
 ## UPDATE
 ### 2024.01.10
@@ -30,7 +30,7 @@ This model is an enhanced version of Asclepius-7B, by replacing the base model w
 - **Model type:** Clinical LLM (Large Language Model)
 - **Language(s) (NLP):** English
 - **License:** CC-BY-NC-SA 4.0
- - **Finetuned from model [optional]:** Mixtral-8B-v0.3
+ - **Finetuned from model [optional]:** Mistral-7B-v0.3
 
 ### Model Sources [optional]
 
@@ -89,8 +89,8 @@ The response should provide the accurate answer to the instruction, while being
 """
 
 from transformers import AutoTokenizer, AutoModelForCausalLM
- tokenizer = AutoTokenizer.from_pretrained("starmpcc/Asclepius-Mixtral-8B-v0.3", use_fast=False)
- model = AutoModelForCausalLM.from_pretrained("starmpcc/Asclepius-Mixtral-8B-v0.3")
+ tokenizer = AutoTokenizer.from_pretrained("starmpcc/Asclepius-Mistral-7B-v0.3", use_fast=False)
+ model = AutoModelForCausalLM.from_pretrained("starmpcc/Asclepius-Mistral-7B-v0.3")
 
 note = "This is a sample note"
 question = "What is the diagnosis?"
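The updated README snippet can be completed into a minimal runnable sketch. The repo id, `use_fast=False`, `note`, and `question` come from the diff; the left-truncation helper, the placeholder prompt string, and the `generate` call are assumptions not shown in the diff (the README's actual prompt template is elided in the hunk above).

```python
def truncate_left(input_ids, max_len=8192):
    """Hypothetical helper (not in the README snippet): keep only the last
    max_len token ids so the question at the end of the prompt survives
    truncation at the model's 8192-token limit."""
    return input_ids[-max_len:]


if __name__ == "__main__":
    # Imported here so the pure helper above can be used without transformers.
    from transformers import AutoTokenizer, AutoModelForCausalLM

    model_id = "starmpcc/Asclepius-Mistral-7B-v0.3"  # repo id from the diff
    tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    note = "This is a sample note"
    question = "What is the diagnosis?"
    # Placeholder only: substitute note/question into the README's real
    # prompt template (elided in the diff) before tokenizing.
    prompt = f"{note}\n\n{question}"

    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    input_ids = input_ids[:, -8192:]  # same left-truncation, batched form
    output = model.generate(input_ids, max_new_tokens=256)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Truncating from the left rather than the right is a deliberate choice here: with long discharge summaries, the instruction sits at the end of the prompt and would otherwise be the first thing cut.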