---
inference: false
datasets:
- medalpaca/medical_meadow_medqa
language:
- en
library_name: transformers
tags:
- biology
- medical
- QA
- healthcare
license: mit
---

# Galen

Galen is fine-tuned from [Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2) on a [medical question answering dataset](https://huggingface.co/datasets/medalpaca/medical_meadow_medqa).

### Galen's view about the future of medicine and AI

![Galen's view about the future of medicine and AI](1.png "Galen's view about the future of medicine and AI")

# Get Started

Install `accelerate` to run the model on a CUDA GPU:

```bash
pip install accelerate
```

```py
from transformers import AutoTokenizer, pipeline
```

```py
tokenizer = AutoTokenizer.from_pretrained('ahmed-ai/galen')
model_pipeline = pipeline(
    task="text-generation",
    model='ahmed-ai/galen',
    tokenizer=tokenizer,
    max_length=256,
    temperature=0.5,
    top_p=0.6,
)
```

```py
prompt = 'What is squamous carcinoma'
result = model_pipeline(prompt)
# Print only the newly generated text, stripping the echoed prompt
print(result[0]['generated_text'][len(prompt):])
```
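
The slicing in the last step works because the `text-generation` pipeline includes the prompt at the start of `generated_text`. A small helper (hypothetical, not part of this model card) makes that stripping step explicit and safe even when the output does not begin with the prompt:

```python
def strip_prompt(generated_text: str, prompt: str) -> str:
    """Remove the echoed prompt from a text-generation pipeline output.

    The pipeline prepends the prompt to its output; this returns only
    the model's continuation. If the output does not start with the
    prompt, it is returned unchanged.
    """
    if generated_text.startswith(prompt):
        return generated_text[len(prompt):].lstrip()
    return generated_text


# Example with a mock pipeline output:
output = "What is squamous carcinoma Squamous carcinoma is a malignant tumor..."
print(strip_prompt(output, "What is squamous carcinoma"))
```

Alternatively, passing `return_full_text=False` to the pipeline call asks `transformers` to return only the continuation, making manual stripping unnecessary.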