---
tags:
  - document-understading
  - endpoints-template
library_name: generic
---

# Deploy a Space as an Inference Endpoint

This is a fork of the [naver-clova-ix/donut-base-finetuned-cord-v2](https://huggingface.co/spaces/naver-clova-ix/donut-base-finetuned-cord-v2) Space.

This repository implements a custom container for 🤗 Inference Endpoints using a Gradio Space.

To deploy this model as an Inference Endpoint, you have to select Custom as the task and provide a custom image:

- CPU image: `philschmi/gradio-api:cpu`
- GPU image: `philschmi/gradio-api:gpu`
- PORT: `7860`
- Health Route: `/` (the default; see the health-check sketch below)
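
Once the endpoint is running, the health route can be used to verify that the container came up correctly. The snippet below is a minimal sketch; the endpoint URL and token are placeholders, not values from this repository.

```python
import requests

# Placeholders: replace with your own Inference Endpoint URL and HF token.
ENDPOINT_URL = "https://<your-endpoint>.endpoints.huggingface.cloud"
HF_TOKEN = "hf_xxx"

# The container answers on "/" (the configured health route).
response = requests.get(ENDPOINT_URL + "/", headers={"Authorization": f"Bearer {HF_TOKEN}"})
print(response.status_code)  # 200 indicates the Gradio container is up
```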

If you want to use the Gradio UI with the Inference Endpoint, you have to select public as the endpoint type and add authentication through Gradio.
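
For reference, the sketch below shows how authentication can be added to a Gradio app via the `auth` argument of `launch()`. The handler and credentials are illustrative placeholders, not the code this Space actually runs.

```python
import gradio as gr

def predict(image):
    # Placeholder handler; the real Space runs Donut (CORD-v2) inference here.
    return {"note": "replace with model inference"}

demo = gr.Interface(fn=predict, inputs=gr.Image(type="pil"), outputs="json")

# `auth` accepts a (username, password) tuple or a callable; values below are examples.
demo.launch(server_name="0.0.0.0", server_port=7860, auth=("user", "change-me"))
```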

## Example API Request Payload
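
A minimal sketch of a request against the deployed endpoint, assuming the standard Gradio REST route (`/api/predict`) and a JSON body of the form `{"data": [<base64 image>]}`; the URL, token, and file name are placeholders, not values taken from this repository.

```python
import base64
import requests

# Placeholders: replace with your own Inference Endpoint URL and HF token.
ENDPOINT_URL = "https://<your-endpoint>.endpoints.huggingface.cloud"
HF_TOKEN = "hf_xxx"

# Encode a local document image (e.g. a receipt) as a base64 data URI.
with open("receipt.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# Assumed Gradio-style payload: a "data" list holding the app's inputs.
payload = {"data": [f"data:image/png;base64,{image_b64}"]}

response = requests.post(
    f"{ENDPOINT_URL}/api/predict",
    headers={"Authorization": f"Bearer {HF_TOKEN}"},
    json=payload,
)
print(response.json())
```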