google / tapas-small-finetuned-wtq

Table Question Answering
PyTorch · wtq · en · arxiv:2004.02349 · arxiv:2010.00571 · arxiv:1508.00305 · apache-2.0 · tapas

How to use from the 🤗/transformers library:

    from transformers import AutoTokenizer, AutoModelForTableQuestionAnswering

    tokenizer = AutoTokenizer.from_pretrained("google/tapas-small-finetuned-wtq")
    model = AutoModelForTableQuestionAnswering.from_pretrained("google/tapas-small-finetuned-wtq")
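For end-to-end inference, the checkpoint can also be wrapped in the table-question-answering pipeline. A minimal sketch (the table below is made up for illustration; TAPAS expects every cell value as a string):

    from transformers import pipeline
    import pandas as pd

    # Build a pipeline around the checkpoint; this downloads the model
    # and tokenizer if they are not already cached locally.
    tqa = pipeline("table-question-answering", model="google/tapas-small-finetuned-wtq")

    # TAPAS requires all cells to be strings, including numbers.
    table = pd.DataFrame({
        "City": ["Paris", "London", "Berlin"],
        "Population": ["2100000", "8900000", "3700000"],
    })

    result = tqa(table=table, query="Which city has the largest population?")
    print(result["answer"])  # e.g. "London"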

Or just clone the model repo:

    git lfs install
    git clone https://huggingface.co/google/tapas-small-finetuned-wtq

    # If you want to clone without large files (just their pointers),
    # prepend your git clone with the following env var:
    GIT_LFS_SKIP_SMUDGE=1 git clone https://huggingface.co/google/tapas-small-finetuned-wtq
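Once cloned, the checkpoint can be loaded from the local directory instead of the Hub. A minimal sketch, assuming the repo sits in the current working directory (note that a clone made with GIT_LFS_SKIP_SMUDGE=1 contains only an LFS pointer for pytorch_model.bin, so run git lfs pull before loading):

    from transformers import AutoTokenizer, AutoModelForTableQuestionAnswering

    # from_pretrained accepts a local path in place of a Hub model ID.
    local_path = "./tapas-small-finetuned-wtq"
    tokenizer = AutoTokenizer.from_pretrained(local_path)
    model = AutoModelForTableQuestionAnswering.from_pretrained(local_path)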
Branches:
    • main
    • no_reset
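The no_reset branch can be used without a local checkout by passing the standard revision argument of from_pretrained. A short sketch, assuming that branch holds a compatible checkpoint:

    from transformers import AutoModelForTableQuestionAnswering

    # Load the weights from the no_reset branch instead of main.
    model = AutoModelForTableQuestionAnswering.from_pretrained(
        "google/tapas-small-finetuned-wtq", revision="no_reset"
    )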
google/tapas-small-finetuned-wtq
History: 4 commits
Latest commit: nielsr, "Add results table" (d97635d, 2 months ago)
    • .gitattributes 345.0B initial commit 2 months ago
    • README.md 7.0KB Add results table 2 months ago
    • config.json 1.6KB First commit 2 months ago
    • pytorch_model.bin 111.8MB First commit 2 months ago
    • special_tokens_map.json 154.0B First commit 2 months ago
    • tokenizer_config.json 490.0B First commit 2 months ago
    • vocab.txt 255.9KB First commit 2 months ago