
Multi-Label-Classification-of-Pubmed-Articles

Traditional machine learning models struggle when we do not have sufficient labeled data for the specific task or domain we care about to train a reliable model. Transfer learning lets us deal with these scenarios by leveraging already existing labeled data from some related task or domain: the knowledge gained while solving the source task in the source domain is stored and applied to the problem of interest. In this work, I applied transfer learning by fine-tuning the BioBERT model on the PubMed multi-label classification dataset.
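The core of multi-label fine-tuning is the objective: each label gets an independent sigmoid trained with binary cross-entropy, rather than a single softmax. The sketch below shows that objective in plain PyTorch; the batch size of 2 and the 14 labels are illustrative assumptions, not this card's exact training settings.

```python
import torch
import torch.nn as nn

# Illustrative multi-label setup: 14 MeSH-style labels per article
# (an assumption for this sketch, not the card's exact configuration).
num_labels = 14
logits = torch.randn(2, num_labels)   # stand-in for the classifier-head output
targets = torch.zeros(2, num_labels)
targets[0, [1, 3]] = 1.0              # article 0 carries labels 1 and 3
targets[1, [0]] = 1.0                 # article 1 carries label 0

# BCEWithLogitsLoss fuses the sigmoid with binary cross-entropy,
# treating every label as an independent yes/no decision.
loss_fn = nn.BCEWithLogitsLoss()
loss = loss_fn(logits, targets)

# At inference time, each label is thresholded independently at 0.5.
preds = (torch.sigmoid(logits) > 0.5).long()
```

This is why reference 5 below points at PyTorch's BCEWithLogitsLoss: it is the standard loss for multi-label heads.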

I also tried RobertaForSequenceClassification and XLNetForSequenceClassification, fine-tuning them on the same PubMed multi-label dataset.
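All three encoders can share one loading path via the Auto classes. A minimal sketch, assuming the usual public checkpoints (the exact checkpoint names and the label count of 14 are assumptions, not confirmed by this card); `problem_type="multi_label_classification"` makes the model use BCEWithLogitsLoss internally.

```python
from transformers import AutoModelForSequenceClassification

# Assumed checkpoints for the three encoders tried in this card.
CHECKPOINTS = {
    "biobert": "dmis-lab/biobert-base-cased-v1.1",
    "roberta": "roberta-base",
    "xlnet": "xlnet-base-cased",
}

def load_for_multilabel(name: str, num_labels: int = 14):
    """Load one of the tried encoders with a multi-label head."""
    return AutoModelForSequenceClassification.from_pretrained(
        CHECKPOINTS[name],
        num_labels=num_labels,
        # Switches the training loss to BCEWithLogitsLoss.
        problem_type="multi_label_classification",
    )
```

Keeping the loading path identical makes the model comparison a one-line swap of the checkpoint name.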

I have also integrated Weights & Biases for visualization, artifact logging, and comparison of the different models!

Multi Label Classification of PubMed Articles (Paper Night Event)

  • To get the API key, create an account on the website.
  • Use Kaggle secrets to store and access API keys securely inside Kaggle.
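The two steps above can be sketched as follows. On Kaggle, the key is read from the secret store; elsewhere, the snippet falls back to an environment variable. The secret name `wandb_api_key` is an assumption for illustration, not a value from this card.

```python
import os

def get_wandb_key() -> str:
    """Fetch the W&B API key from Kaggle secrets, or the environment."""
    try:
        # kaggle_secrets is only available inside Kaggle notebooks.
        from kaggle_secrets import UserSecretsClient
        return UserSecretsClient().get_secret("wandb_api_key")
    except Exception:
        # Fallback for local runs.
        return os.environ.get("WANDB_API_KEY", "")

# With the key in hand, logging in is one call (commented out here to
# avoid a network round-trip):
# import wandb
# wandb.login(key=get_wandb_key())
```

Keeping the key out of the notebook source is the whole point of the secrets store: the notebook can be shared publicly without leaking credentials.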

For more information on the dataset attributes, visit the Kaggle Dataset Description here.

To get a full grasp of the steps I have taken with this dataset, have a look at the information in the Kaggle Notebook Link and the Kaggle version of the same Dataset Link.

References

  1. Attention Is All You Need
  2. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  3. https://github.com/google-research/bert
  4. https://github.com/huggingface/transformers
  5. BCEWithLogitsLoss (PyTorch documentation)
  6. Transformers for Multi-Label Classification Made Simple, by Ronak Patel