
Run Request

The endpoint expects the image to be sent as binary data. Below are a cURL and a Python example.

cURL

1. Get the image:

wget https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg -O test.jpg

2. Send the cURL request:

curl --request POST \
  --url https://{ENDPOINT}/ \
  --header 'Content-Type: image/jpeg' \
  --header 'Authorization: Bearer {HF_TOKEN}' \
  --data-binary '@test.jpg'

3. The expected output:

{"text": "INDLUS THE"}

Python

1. Get the image:

wget https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg -O test.jpg

2. Run the request:
import requests as r

ENDPOINT_URL = ""  # URL of your Inference Endpoint
HF_TOKEN = ""      # your Hugging Face access token

def predict(path_to_image: str):
    # read the image as raw bytes
    with open(path_to_image, "rb") as i:
        b = i.read()
    headers = {
        "Authorization": f"Bearer {HF_TOKEN}",
        "Content-Type": "image/jpeg",  # content type of the image
    }
    # send the binary image to the endpoint
    response = r.post(ENDPOINT_URL, headers=headers, data=b)
    return response.json()

prediction = predict(path_to_image="test.jpg")
print(prediction)

The expected output:

{"text": "INDLUS THE"}