ynie / roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli

Text Classification · PyTorch · roberta · License: mit
Datasets: snli, anli, multi_nli, multi_nli_mismatch, fever

Model card | Files and versions

  How to use from the 🤗/transformers library:

    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained("ynie/roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli")

    model = AutoModelForSequenceClassification.from_pretrained("ynie/roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli")
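  The loaded model classifies a premise/hypothesis pair into NLI labels. A minimal inference sketch: the example sentences are illustrative only, and the label names and their order are read from the model config rather than assumed, so verify them against `model.config.id2label` for your use case.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "ynie/roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

# Illustrative premise/hypothesis pair (not from the model card).
premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# Encode the pair as a single sequence with the tokenizer's pair handling.
inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Convert logits to a probability distribution over the NLI labels.
probs = torch.softmax(logits, dim=-1)[0]
labels = [model.config.id2label[i] for i in range(probs.shape[-1])]
print(dict(zip(labels, probs.tolist())))
```

  The highest-probability label is the model's predicted relation for the pair; batching multiple pairs works the same way by passing lists of premises and hypotheses to the tokenizer.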

  Or just clone the model repo:

    git lfs install
    git clone https://huggingface.co/ynie/roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli

  If you want to clone without large files (just their pointers), prepend your git clone with the following env var:

    GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/ynie/roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli
    Branch: main
    ynie/roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli
    History: 13 commits, latest by julien-c: "Migrate model card from transformers-repo" (2a130d4, 2 months ago)
    • .gitattributes 345.0B initial commit 4 months ago
    • README.md 3.3KB Migrate model card from transformers-repo 2 months ago
    • config.json 703.0B Update config.json 4 months ago
    • merges.txt 445.6KB Update merges.txt 4 months ago
    • pytorch_model.bin 1.3GB Update pytorch_model.bin 4 months ago
    • special_tokens_map.json 772.0B Update special_tokens_map.json 4 months ago
    • tokenizer_config.json 49.0B Update tokenizer_config.json 4 months ago
    • vocab.json 877.8KB Update vocab.json 4 months ago