
RoBERTa base Fine-Tuned for Proposal Sentence Classification

Overview

  • Language: English
  • Model Name: oeg/BERT-Repository-Proposal

Description

This model is a fine-tuned BERT base uncased model trained to classify sentences into two classes: proposal and non-proposal. The training data contains sentences that propose a software or data repository (for example, a sentence announcing where a project's code or data is made available) alongside sentences that do not, so the model learns to recognize repository-proposal sentences accurately.

How to use

To use this model in Python:

from transformers import AutoModelForSequenceClassification, AutoTokenizer
import torch

# Load the fine-tuned model and its tokenizer from the Hugging Face Hub
model_name = "oeg/BERT-Repository-Proposal"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

sentence = "Your input sentence here."
inputs = tokenizer(sentence, return_tensors="pt")

# Run inference and convert the logits to class probabilities
with torch.no_grad():
    outputs = model(**inputs)
probabilities = torch.nn.functional.softmax(outputs.logits, dim=1)
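
To obtain a predicted label, take the argmax over the class probabilities. The sketch below assumes an index-to-label mapping of 0 = non-proposal and 1 = proposal, which is not specified on this card; prefer the model's id2label configuration when it provides meaningful names.

# Pick the most probable class for the input sentence
predicted_index = int(torch.argmax(probabilities, dim=1).item())

# Assumed label order (not stated on this card): 0 = non-proposal, 1 = proposal.
# If model.config.id2label contains meaningful names, use it instead.
labels = {0: "non-proposal", 1: "proposal"}
print(f"Predicted class: {labels[predicted_index]} "
      f"(probability {probabilities[0, predicted_index].item():.3f})")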