    allenai/led-base-16384

    Text2Text Generation · PyTorch · TensorFlow · en · arxiv:2004.05150 · apache-2.0 · led · seq2seq

    How to serve this model with the Accelerated Inference API

    Try the Inference API for free, and get an organization plan to use it in your apps.
    import json
    import requests

    API_TOKEN = "hf_..."  # replace with your Hugging Face API token
    API_URL = "https://api-inference.huggingface.co/models/allenai/led-base-16384"
    headers = {"Authorization": f"Bearer {API_TOKEN}"}

    def query(payload):
        data = json.dumps(payload)
        response = requests.request("POST", API_URL, headers=headers, data=data)
        return json.loads(response.content.decode("utf-8"))

    data = query({"inputs": "The answer to the universe is"})
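    The request body is plain JSON, so generation options can be sent alongside `inputs`. A minimal sketch of building such a payload — the `parameters` field names used here (`min_length`, `max_length`) are assumptions based on the general Inference API documentation, not this model card, so check the docs before relying on them:

    ```python
    import json

    # Sketch: a request body with generation options added to "inputs".
    # The parameter names below are assumptions from the Inference API
    # docs, not guarantees from this model card.
    payload = {
        "inputs": "A very long document to summarize ...",
        "parameters": {"min_length": 20, "max_length": 120},
    }
    body = json.dumps(payload)  # this string is what query() would POST
    print(body)
    ```

    The same `query()` helper above would accept this dictionary unchanged, since it serializes whatever payload it is given.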

    Quick Links

    • Inference API Documentation
    • How to get started

    How to use from the 🤗/transformers library

    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained("allenai/led-base-16384")
    model = AutoModelForSeq2SeqLM.from_pretrained("allenai/led-base-16384")
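    As the model name indicates, led-base-16384 accepts inputs of up to 16,384 tokens; longer texts must be truncated or split before encoding. A minimal sketch of splitting a token-id sequence into consecutive windows — the helper name and this windowing strategy are illustrative, not part of the transformers library:

    ```python
    MAX_TOKENS = 16384  # maximum input length, encoded in the model name

    def split_into_windows(token_ids, max_len=MAX_TOKENS):
        """Split a token-id sequence into consecutive windows of at most max_len."""
        return [token_ids[i:i + max_len] for i in range(0, len(token_ids), max_len)]

    # Example: 40,000 token ids yield windows of 16384, 16384, and 7232 tokens.
    windows = split_into_windows(list(range(40_000)))
    print([len(w) for w in windows])  # [16384, 16384, 7232]
    ```

    Each window could then be passed to the tokenizer/model pair loaded above; how to combine per-window outputs (e.g. concatenating summaries) is left to the application.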

    Or just clone the model repo

    git lfs install
    git clone https://huggingface.co/allenai/led-base-16384
    # to clone without the large files (just their pointers),
    # prepend the env var to the clone command:
    GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/allenai/led-base-16384
    Branch: main · led-base-16384 · History: 8 commits
    Latest commit: Create README.md (25756ed) by patrickvonplaten, 3 months ago

      • .gitattributes          345.0B   initial commit    last year
      • README.md               862.0B   Create README.md  3 months ago
      • config.json             1.1KB    correct naming    3 months ago
      • merges.txt              445.6KB  add files         last year
      • pytorch_model.bin       617.7MB  correct naming    3 months ago
      • special_tokens_map.json 772.0B   add files         last year
      • tf_model.h5             617.7MB  correct naming    3 months ago
      • tokenizer_config.json   27.0B    add files         last year
      • vocab.json              877.8KB  add files         last year