change python code to use 13b instead of 7b

#2
by xzuyn - opened
Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -60,7 +60,7 @@ You can use the model for inference tasks like question-answering and medical di
 
  from transformers import pipeline
 
- qa_pipeline = pipeline("question-answering", model="medalpaca/medalpaca-7b", tokenizer="medalpaca/medalpaca-7b")
+ qa_pipeline = pipeline("question-answering", model="medalpaca/medalpaca-13b", tokenizer="medalpaca/medalpaca-13b")
  question = "What are the symptoms of diabetes?"
  context = "Diabetes is a metabolic disease that causes high blood sugar. The symptoms include increased thirst, frequent urination, and unexplained weight loss."
  answer = qa_pipeline({"question": question, "context": context})
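For reference, the full snippet after this change would read roughly as follows. This is a minimal sketch based on the README's usage example; it assumes the medalpaca/medalpaca-13b checkpoint works with the question-answering pipeline as the model card suggests, and the final print is added here only for illustration.

```python
from transformers import pipeline

# Load the question-answering pipeline with the 13B MedAlpaca checkpoint
# (assumption: the 13b repo exposes the same tokenizer/model layout as 7b).
qa_pipeline = pipeline(
    "question-answering",
    model="medalpaca/medalpaca-13b",
    tokenizer="medalpaca/medalpaca-13b",
)

question = "What are the symptoms of diabetes?"
context = (
    "Diabetes is a metabolic disease that causes high blood sugar. The symptoms "
    "include increased thirst, frequent urination, and unexplained weight loss."
)

# The QA pipeline accepts a dict with "question" and "context" keys.
answer = qa_pipeline({"question": question, "context": context})
print(answer)  # illustrative only; not part of the README change
```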