---
license: apache-2.0
datasets:
  - qiaojin/PubMedQA
language:
  - en
metrics:
  - accuracy
base_model:
  - google-bert/bert-base-uncased
---

# GEM_PubMedQA Model Card

This model card provides an overview of GEM_PubMedQA, a fine-tuned implementation of the GEM architecture adapted to the PubMedQA dataset.

## Purpose

The GEM_PubMedQA model was developed to assess the performance of the GEM architecture on domain-specific datasets, with a focus on healthcare. The PubMedQA dataset, a key benchmark in this field, was selected to evaluate its effectiveness.

## Key Details

- License: Apache-2.0
- Dataset: qiaojin/PubMedQA
- Language: English
- Metric: accuracy (92.5%)
- Base model: google-bert/bert-base-uncased
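
To illustrate how these details come together at inference time, the following is a minimal sketch of loading the checkpoint with the `transformers` library and classifying a PubMedQA-style question/abstract pair. The repository id `Firoj112/GEM_PubMedQA` and the yes/no/maybe label order are assumptions, not values confirmed by this card.

```python
# Sketch: load the fine-tuned checkpoint and classify a question/context pair.
# The repository id and label order below are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "Firoj112/GEM_PubMedQA"  # hypothetical repository id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

question = "Do mitochondria play a role in remodelling lace plant leaves during programmed cell death?"
context = "Programmed cell death (PCD) is the regulated death of cells within an organism..."

# PubMedQA pairs a question with abstract text; truncate to the 128-token
# limit used during training.
inputs = tokenizer(question, context, truncation=True, max_length=128, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

labels = ["yes", "no", "maybe"]  # assumed label order
print(labels[logits.argmax(dim=-1).item()])
```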

## Model Details

The GEM_PubMedQA model is built on the GEM architecture and fine-tuned from google-bert/bert-base-uncased on the PubMedQA dataset. Training used the following parameters (an illustrative mapping to a standard training setup follows the list):

- Number of epochs: 5
- Batch size: 128
- Learning rate: 2e-5
- Maximum sequence length: 128
- Gradient accumulation steps: 2
- Cluster size: 256
- Threshold: 0.65
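
The standard hyperparameters above map onto a Hugging Face `Trainer` configuration roughly as sketched below. This is an illustrative reconstruction, not the authors' training script: the `pqa_labeled` configuration and the label mapping are assumptions, and the GEM-specific cluster size and threshold have no `TrainingArguments` equivalent, so they appear only as named constants.

```python
# Sketch: how the listed hyperparameters could map onto a transformers
# Trainer setup. Illustrative only; the label mapping and the pqa_labeled
# configuration are assumptions, and the GEM-specific cluster size and
# threshold are not Trainer arguments.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

GEM_CLUSTER_SIZE = 256  # GEM-specific setting, not a TrainingArguments field
GEM_THRESHOLD = 0.65    # GEM-specific setting, not a TrainingArguments field

tokenizer = AutoTokenizer.from_pretrained("google-bert/bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "google-bert/bert-base-uncased", num_labels=3
)

dataset = load_dataset("qiaojin/PubMedQA", "pqa_labeled")
label2id = {"yes": 0, "no": 1, "maybe": 2}  # assumed label order

def preprocess(batch):
    # Pair each question with its long answer, padded/truncated to 128 tokens.
    enc = tokenizer(
        batch["question"],
        batch["long_answer"],
        truncation=True,
        padding="max_length",
        max_length=128,
    )
    enc["labels"] = [label2id[d] for d in batch["final_decision"]]
    return enc

tokenized = dataset.map(preprocess, batched=True)

args = TrainingArguments(
    output_dir="gem_pubmedqa",
    num_train_epochs=5,
    per_device_train_batch_size=128,
    learning_rate=2e-5,
    gradient_accumulation_steps=2,
)

trainer = Trainer(model=model, args=args, train_dataset=tokenized["train"])
trainer.train()
```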