---
license: apache-2.0
datasets:
- BI55/MedText
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- medical
---

# Model Card for Phi-Med-V1

Microsoft Phi-2 finetuned on medical text data.

## Model Details

### Model Description

- **Developed by:** JJ
- **Model type:** SLM (small language model)
- **Finetuned from model:** [microsoft/Phi-2](https://huggingface.co/microsoft/phi-2)

## Uses

Testing the effectiveness of finetuning SLMs.

### Direct Use

Direct use is not allowed; this model is for research only.

## Bias, Risks, and Limitations

The model can still hallucinate.

## Training Details

### Training Data

The [BI55/MedText](https://huggingface.co/datasets/BI55/MedText) dataset from Hugging Face.

### Training Procedure

Supervised fine-tuning (SFT) using HF Transformers.

## Environmental Impact

- **Hardware Type:** A10 GPU VMs (2x 24 GB A10)
- **Hours used:** 3
- **Cloud Provider:** Azure
- **Compute Region:** North Europe (Dublin)

Experiments were conducted on Azure in the northeurope region, which has a carbon efficiency of 0.62 kgCO$_2$eq/kWh. A cumulative 100 hours of computation was performed on A10 hardware (TDP of 350 W). Total emissions are estimated at 21.7 kgCO$_2$eq, of which 100 percent was directly offset by the cloud provider. Estimates were made using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute).

## Technical Specifications

### Compute Infrastructure

Azure

#### Hardware

NV72ads A10 GPU VMs

#### Software

Axolotl

## Model Card Authors

JJ

## Model Card Contact

JJ
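The emissions estimate above follows the Machine Learning Impact calculator's formula (power draw × hours × regional carbon intensity). A minimal sketch checking the figure, using only the numbers reported on this card:

```python
# Emissions estimate per the ML Impact calculator:
# emissions (kgCO2eq) = power draw (kW) x hours x carbon efficiency (kgCO2eq/kWh)
tdp_kw = 350 / 1000        # A10 TDP of 350 W, converted to kW
hours = 100                # cumulative compute hours reported above
carbon_efficiency = 0.62   # kgCO2eq/kWh for Azure northeurope

emissions = tdp_kw * hours * carbon_efficiency
print(round(emissions, 1))  # → 21.7
```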