# API Documentation for `Lenylvt/SRT_Translation-API`

This documentation covers how to interact with the SRT_Translation API using both Python and JavaScript.

## API Endpoint

To use this API, you can use either the `gradio_client` Python library ([docs](https://www.gradio.app/guides/getting-started-with-the-python-client)) or the `@gradio/client` JavaScript package ([docs](https://www.gradio.app/guides/getting-started-with-the-js-client)).

## Python Usage

### Step 1: Installation

First, install the `gradio_client` package if it is not already installed:

```bash
pip install gradio_client
```

### Step 2: Making a Request

Locate the API endpoint for the function you wish to use. Replace the placeholder values in the snippet below with your actual input data. For accessing private Spaces, you might need to include your Hugging Face token as well.

**API Name**: `/predict`

```python
from gradio_client import Client

client = Client("Lenylvt/SRT_Translation-API")
result = client.predict(
    "https://github.com/gradio-app/gradio/raw/main/test/test_files/sample_file.pdf",  # filepath in 'Upload SRT File' File component (replace this placeholder URL with your own .srt file)
    "en",  # Source Language (ISO 639-1 code, e.g., 'en' for English) in 'Source Language' Dropdown component
    "es",  # Target Language (ISO 639-1 code, e.g., 'es' for Spanish) in 'Target Language' Dropdown component
    api_name="/predict"
)
print(result)
```

**Return Type(s):**

- A `filepath` representing the output in the '*Translated SRT*' File component.
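Since `result` is just a local filepath, you can read the translated subtitles like any text file. A minimal sketch (the sample content below is simulated so the snippet runs on its own; in real use, `result` comes from `client.predict(...)`):

```python
from pathlib import Path

# In real use, `result` is the filepath returned by client.predict(...).
# Here we write a tiny sample SRT so the snippet is self-contained.
result = "translated_sample.srt"
Path(result).write_text(
    "1\n00:00:01,000 --> 00:00:03,000\nHola, mundo\n",
    encoding="utf-8",
)

# Read the translated SRT back as plain text.
translated = Path(result).read_text(encoding="utf-8")
print(translated.splitlines()[2])  # → Hola, mundo
```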

πŸ”΄ **If you see this error**: `Failed to load model for aa to ab: Helsinki-NLP/opus-mt-aa-ab is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>`, **it's because that language pair is not available.**
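As the error message shows, the Space loads Helsinki-NLP MarianMT models, which are published per language pair under the name `opus-mt-<src>-<tgt>`; the error means no model exists on the Hub for the pair you picked. A hedged sketch of how you could build and pre-check the model id before calling the API (the `repo_exists` call is commented out because it needs network access):

```python
def opus_model_id(src: str, tgt: str) -> str:
    """Build the Helsinki-NLP MarianMT repo id for a language pair."""
    return f"Helsinki-NLP/opus-mt-{src}-{tgt}"

# To verify a pair before calling the API, you could query the Hub, e.g.:
#   from huggingface_hub import HfApi
#   HfApi().repo_exists(opus_model_id("en", "es"))
print(opus_model_id("en", "es"))  # → Helsinki-NLP/opus-mt-en-es
```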

## JavaScript Usage

### Step 1: Installation

For JavaScript, ensure the `@gradio/client` package is installed in your project.

```bash
npm i -D @gradio/client
```

### Step 2: Making a Request

As with Python, find the API endpoint that suits your needs. Replace the placeholders with your own data. If accessing a private Space, include your Hugging Face token.

**API Name**: `/predict`

```javascript
import { client } from "@gradio/client";

const response_0 = await fetch("https://github.com/gradio-app/gradio/raw/main/test/test_files/sample_file.pdf");
const exampleFile = await response_0.blob();

const app = await client("Lenylvt/SRT_Translation-API");
const result = await app.predict("/predict", [
    exampleFile,  // blob in 'Upload SRT File' File component (replace this placeholder URL with your own .srt file)
    "en",         // string in 'Source Language' Dropdown component
    "es",         // string in 'Target Language' Dropdown component
]);

console.log(result.data);
```

**Return Type(s):**

- A file reference (returned in `result.data`) representing the output in the '*Translated SRT*' File component.

πŸ”΄ **If you see this error**: `Failed to load model for aa to ab: Helsinki-NLP/opus-mt-aa-ab is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>`, **it's because that language pair is not available.**