Apply for community grant: Academic project (GPU and storage)

#1
by wangrongsheng - opened

General large language models (LLMs) such as ChatGPT have shown remarkable success. However, such LLMs have not been widely adopted in medicine because of their limited accuracy and their inability to provide reliable medical advice. We propose IvyGPT, an LLM based on LLaMA that is trained and fine-tuned on high-quality medical question-answer (QA) instances and with Reinforcement Learning from Human Feedback (RLHF). During training, we used QLoRA to fine-tune 33 billion parameters on a small number of NVIDIA A100 (80 GB) GPUs. Experimental results show that IvyGPT outperforms other medical GPT models.
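For context on what that QLoRA setup looks like in practice, here is a minimal sketch using the transformers, peft, and bitsandbytes libraries; the base model id and LoRA hyperparameters below are illustrative placeholders, not the exact IvyGPT training recipe.

```python
# Minimal QLoRA setup sketch (placeholder model id and hyperparameters).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

base_model = "huggyllama/llama-30b"  # placeholder for the ~33B LLaMA base

# Load the base model in 4-bit NF4 so a ~33B model fits on a few A100 (80 GB) GPUs.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    base_model, quantization_config=bnb_config, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(base_model)

# Attach low-rank adapters; only these small matrices are trained.
lora_config = LoraConfig(
    r=64,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # show how few parameters QLoRA actually updates
```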

So that more people can benefit from IvyGPT, we have developed an open-source version and would like to deploy it on Hugging Face Spaces to make it accessible to everyone. We hope this work contributes to academic progress and to making AI more useful in everyday life.
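As a rough picture of what the Space could serve, below is a minimal sketch of a Gradio chat demo; the model repo id is a hypothetical placeholder and the generation settings are illustrative, not the deployed configuration.

```python
# Sketch of a Gradio chat demo for the Space (placeholder repo id and settings).
import torch
import gradio as gr
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "wangrongsheng/IvyGPT"  # hypothetical repo id for illustration

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

def answer(message, history):
    # Tokenize the user question and generate a reply from the fine-tuned model.
    inputs = tokenizer(message, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs, max_new_tokens=256, do_sample=True, temperature=0.7
    )
    # Strip the prompt tokens and return only the newly generated text.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

gr.ChatInterface(answer, title="IvyGPT medical QA demo").launch()
```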

Hi @wangrongsheng , we have assigned a GPU to this Space. Note that GPU grants are provided temporarily and might be removed after some time if usage is very low.

To learn more about GPUs in Spaces, please check out https://huggingface.co/docs/hub/spaces-gpus
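As a quick sanity check once the grant is active, a snippet like the following (a minimal PyTorch sketch) can confirm from inside the Space that the assigned GPU is visible:

```python
# Check whether the Space's runtime can see the granted GPU.
import torch

if torch.cuda.is_available():
    print(f"GPU available: {torch.cuda.get_device_name(0)}")
else:
    print("No GPU detected; the Space is running on CPU.")
```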

Thank you!
