---
tags:
  - document-understanding
  - endpoints-template
library_name: generic
---

Deploy a Space as an Inference Endpoint

This is a fork of the naver-clova-ix/donut-base-finetuned-cord-v2 Space.

This repository implements a custom container for 🤗 Inference Endpoints using a Gradio Space.

To deploy this model as an Inference Endpoint, select Custom as the task and provide a custom image: use philschmid/gradio-api:cpu for CPU instances or philschmid/gradio-api:gpu for GPU instances. Set the port to 7860.
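
As an alternative to the UI flow above, the `huggingface_hub` client can create an endpoint with a custom image programmatically. The sketch below is illustrative only: the endpoint name, vendor, region, and instance values are placeholders, and the `health_route` and `port` fields in `custom_image` are assumptions to double-check against the Inference Endpoints documentation.

```python
from huggingface_hub import create_inference_endpoint

# Illustrative sketch; names, vendor, region, and instance values are placeholders.
endpoint = create_inference_endpoint(
    "donut-gradio-api",  # hypothetical endpoint name
    repository="philschmid/donut-base-finetuned-cord-v2",
    framework="pytorch",
    task="custom",
    accelerator="cpu",
    vendor="aws",
    region="us-east-1",
    type="public",
    instance_size="x2",
    instance_type="intel-icl",
    custom_image={
        "url": "philschmid/gradio-api:cpu",  # use philschmid/gradio-api:gpu on GPU instances
        "health_route": "/",                 # assumption: Gradio serves its root route as health check
        "port": 7860,                        # assumption: mirrors the port selected in the UI
    },
)

endpoint.wait()      # block until the endpoint is running
print(endpoint.url)
```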

If you want to use the UI with the Inference Endpoint, select Public as the endpoint type and add authentication through Gradio.
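
Gradio supports basic authentication on the app itself via the `auth` argument to `launch()`. A minimal sketch of how the Space's app could enable it (the prediction function and credentials shown are placeholders, not the Space's actual code):

```python
import gradio as gr

def predict(image):
    # Placeholder for the Donut document-understanding inference call.
    return {"parsed": "..."}

demo = gr.Interface(fn=predict, inputs=gr.Image(type="pil"), outputs="json")

# `auth` accepts a (username, password) tuple, a list of tuples, or a callable.
# The credentials here are placeholders; since the endpoint type is public,
# Gradio's auth is the only access control in this setup.
demo.launch(server_name="0.0.0.0", server_port=7860, auth=("user", "change-me"))
```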

Example API Request Payload
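
A minimal sketch of a request against the container's Gradio API, assuming the standard Gradio 3.x `/api/predict` route with a single image input; the endpoint URL and file name are placeholders, not values from this repository. If Gradio authentication is enabled as described above, the client first needs to log in against Gradio's `/login` route to obtain a session cookie.

```python
import base64
import requests

# Placeholder URL; replace with the URL shown in the Inference Endpoints UI.
ENDPOINT_URL = "https://<your-endpoint>.endpoints.huggingface.cloud"

# Gradio 3.x expects inputs as a list under "data"; image inputs are typically
# sent as base64 data URLs. This shows the request shape, not a verbatim payload.
with open("receipt.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {"data": [f"data:image/png;base64,{image_b64}"]}

response = requests.post(f"{ENDPOINT_URL}/api/predict", json=payload)
print(response.json())
```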