---
tags:
- document-understanding
- endpoints-template
library_name: generic
---
# Deploy a Space as an Inference Endpoint

_This is a fork of the [naver-clova-ix/donut-base-finetuned-cord-v2](https://huggingface.co/spaces/naver-clova-ix/donut-base-finetuned-cord-v2) Space._

This repository implements a custom container for 🤗 Inference Endpoints using a Gradio Space.

To deploy this model as an Inference Endpoint, select `Custom` as the task and provide a custom image: use `philschmid/gradio-api:cpu` for CPU or `philschmid/gradio-api:gpu` for GPU. Set the port to `7860`.

If you want to use the UI with the Inference Endpoint, select `public` as the endpoint type and add [auth through Gradio](https://gradio.app/docs/#launch-header), for example via the `auth` argument of `launch()` (see the sketch below).
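A minimal sketch of gating the UI with Gradio's built-in authentication. The `predict` function, `demo` interface, and credentials are placeholders standing in for the Space's actual app; only the `auth`, `server_name`, and `server_port` arguments of `launch()` come from the Gradio docs linked above:

```python
import gradio as gr

# Placeholder prediction function standing in for the Donut Space's actual inference code.
def predict(image):
    return {"status": "stub"}  # hypothetical output

demo = gr.Interface(fn=predict, inputs=gr.Image(), outputs="json")

# `auth` takes a (username, password) tuple (or a list of them) and gates the UI.
# server_port=7860 matches the port configured for the Inference Endpoint.
demo.launch(auth=("user", "change-me"), server_name="0.0.0.0", server_port=7860)
```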
### Example API Request Payload

A minimal sketch of a request body, assuming the container exposes the standard Gradio `/api/predict` route, which wraps inputs in a `data` array; the single input here is a document image serialized as a base64 data URI (truncated placeholder shown):

```json
{
  "data": ["data:image/png;base64,iVBORw0KGgo..."]
}
```
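A short usage sketch for sending such a payload to a deployed endpoint. The endpoint URL, token, and file name are placeholders; the `Authorization` header is only needed if the endpoint is not public:

```python
import base64
import requests

# Placeholder values: replace with your endpoint URL and, if needed, a valid HF token.
ENDPOINT_URL = "https://<your-endpoint>.endpoints.huggingface.cloud/api/predict"
HF_TOKEN = "hf_..."

# Read a local document image and encode it as a base64 data URI.
with open("receipt.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {"data": [f"data:image/png;base64,{image_b64}"]}

response = requests.post(
    ENDPOINT_URL,
    headers={"Authorization": f"Bearer {HF_TOKEN}"},
    json=payload,
)
print(response.json())
```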