---
tags:
- document-understanding
- endpoints-template
library_name: generic
---
# Deploy a Space as an Inference Endpoint
_This is a fork of the [naver-clova-ix/donut-base-finetuned-cord-v2](https://huggingface.co/spaces/naver-clova-ix/donut-base-finetuned-cord-v2) Space_
This repository implements a custom container for 🤗 Inference Endpoints using a Gradio Space.
To deploy this model as an Inference Endpoint, you have to select `Custom` as the task and configure a custom image:
* CPU image: `philschmi/gradio-api:cpu`
* GPU image: `philschmi/gradio-api:gpu`
* PORT: `7860`
* ~~Health Route: `/`~~ (already the default, no change needed)
Also make sure to add `server_name="0.0.0.0"` to your `launch()` call so that requests are correctly proxied.
If you want to use the UI with the Inference Endpoint, you have to select `public` as the endpoint type and add [auth through gradio](https://gradio.app/docs/#launch-header).
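A minimal sketch of what the `launch()` call in the Space's entry file could look like; the `predict` function and the credentials below are placeholders, not the actual code of this repository:

```python
import gradio as gr

def predict(image):
    # Placeholder for the Donut document-parsing function used by the Space.
    return {"text": "..."}

demo = gr.Interface(fn=predict, inputs="image", outputs="json")

# Bind to all interfaces so the Inference Endpoints proxy can reach the app on port 7860.
# `auth` is only needed when exposing the UI through a public endpoint.
demo.launch(server_name="0.0.0.0", server_port=7860, auth=("user", "password"))
```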
### Example API Request Payload
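Gradio typically serves predictions under the `/api/predict` route and expects the inputs wrapped in a `data` list; for this Space the single input would be the document image as a base64 data URI (the value below is a truncated placeholder):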
```json
{
  "data": ["data:image/png;base64,<base64-encoded document image>"]
}
```
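A hedged sketch of querying the deployed endpoint with `requests`; the endpoint URL, token, and image path are placeholders you need to replace with your own values:

```python
import base64
import requests

# Placeholders: taken from the Inference Endpoints UI, not from this repository.
ENDPOINT_URL = "https://<your-endpoint>.endpoints.huggingface.cloud/api/predict"  # assumes the default Gradio route
HF_TOKEN = "hf_..."  # a token with access to the (protected) endpoint

# Encode a sample document image as a base64 data URI.
with open("sample_invoice.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {"data": [f"data:image/png;base64,{image_b64}"]}

response = requests.post(
    ENDPOINT_URL,
    headers={"Authorization": f"Bearer {HF_TOKEN}"},
    json=payload,
)
print(response.json())
```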