Commit `15ebc22` by Lenylvt (parent: `980dab0`): Upload docs_SRT_Translation.md

Added file: `docs/Translator/docs_SRT_Translation.md`
# API Documentation for `Lenylvt/SRT_Translation-API`

This documentation covers how to interact with the SRT_Translation API from both Python and JavaScript.

## API Endpoint

To call this API, you can use either the `gradio_client` Python library ([docs](https://www.gradio.app/guides/getting-started-with-the-python-client)) or the `@gradio/client` JavaScript package ([docs](https://www.gradio.app/guides/getting-started-with-the-js-client)).

## Python Usage

### Step 1: Installation

First, install the `gradio_client` package if it is not already installed.

```bash
pip install gradio_client
```

### Step 2: Making a Request

Locate the API endpoint for the function you wish to use. Replace the placeholder values in the snippet below with your actual input data. For accessing private Spaces, you might need to include your Hugging Face token as well.

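If the Space is private, the token can be supplied when the client is created. A minimal sketch, assuming the `hf_token` argument of `gradio_client.Client` and a placeholder token value:

```python
from gradio_client import Client

# "hf_xxx" is a placeholder; use a token that has read access to the Space.
client = Client("Lenylvt/SRT_Translation-API", hf_token="hf_xxx")
```
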
**API Name**: `/predict`

```python
from gradio_client import Client

client = Client("Lenylvt/SRT_Translation-API")
result = client.predict(
    "https://github.com/gradio-app/gradio/raw/main/test/test_files/sample_file.pdf",  # filepath in 'Upload SRT File' File component
    "en",  # Source Language (ISO 639-1 code, e.g., 'en' for English) in 'Source Language' Dropdown component
    "es",  # Target Language (ISO 639-1 code, e.g., 'es' for Spanish) in 'Target Language' Dropdown component
    api_name="/predict"
)
print(result)
```

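The example above passes a sample file URL straight through; to translate your own subtitles, point the first argument at a local `.srt` file instead. A minimal sketch, assuming a hypothetical local file named `my_subtitles.srt` and a `gradio_client` version recent enough to provide `handle_file` (older versions accept a plain path string in its place):

```python
from gradio_client import Client, handle_file

client = Client("Lenylvt/SRT_Translation-API")
result = client.predict(
    handle_file("my_subtitles.srt"),  # local SRT file instead of the sample URL
    "en",  # Source Language (ISO 639-1 code)
    "es",  # Target Language (ISO 639-1 code)
    api_name="/predict"
)
print(result)  # local path to the translated SRT
```
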
**Return Type(s):**

- A `filepath` representing the output in the '*Translated SRT*' File component.

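Because the result is a filepath to a local copy of the translated file, you will usually want to save or inspect it right away. A minimal sketch, assuming the `result` variable from the snippet above and an arbitrary destination name `translated.srt`:

```python
import shutil

# `result` is the filepath returned by client.predict(...) above.
shutil.copy(result, "translated.srt")

# Optionally peek at the beginning of the translated file.
with open("translated.srt", encoding="utf-8") as f:
    print(f.read(500))
```
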
🔴 **If you see this error**: 'Failed to load model for aa to ab: Helsinki-NLP/opus-mt-aa-ab is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`', **it means that language pair is not available.**

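The error above comes from the Space trying to load a `Helsinki-NLP/opus-mt-{source}-{target}` model, so a language pair only works when that model exists on the Hugging Face Hub. A small sketch for checking a pair before calling the API, assuming the `huggingface_hub` library is installed (it is not required by the API itself):

```python
from huggingface_hub import model_info
from huggingface_hub.utils import RepositoryNotFoundError

def pair_is_supported(source: str, target: str) -> bool:
    """Return True if a Helsinki-NLP/opus-mt model exists for this language pair."""
    try:
        model_info(f"Helsinki-NLP/opus-mt-{source}-{target}")
        return True
    except RepositoryNotFoundError:
        return False

print(pair_is_supported("en", "es"))  # True: the model exists
print(pair_is_supported("aa", "ab"))  # False: no such model, so the API would fail
```
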
## JavaScript Usage

### Step 1: Installation

For JavaScript, ensure the `@gradio/client` package is installed in your project.

```bash
npm i -D @gradio/client
```

### Step 2: Making a Request

As with Python, find the API endpoint that suits your needs. Replace the placeholders with your own data. If accessing a private Space, include your Hugging Face token.

**API Name**: `/predict`

```javascript
import { client } from "@gradio/client";

const response_0 = await fetch("https://github.com/gradio-app/gradio/raw/main/test/test_files/sample_file.pdf");
const exampleFile = await response_0.blob();

const app = await client("Lenylvt/SRT_Translation-API");
const result = await app.predict("/predict", [
    exampleFile,  // blob in 'Upload SRT File' File component
    "en",         // string in 'Source Language' Dropdown component
    "es",         // string in 'Target Language' Dropdown component
]);

console.log(result.data);
```

**Return Type(s):**

- `undefined` representing the output in the '*Translated SRT*' File component (the translated file information is available on `result.data`).

🔴 **If you see this error**: 'Failed to load model for aa to ab: Helsinki-NLP/opus-mt-aa-ab is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`', **it means that language pair is not available.**