
# API Documentation for Lenylvt/SRT_Translation-API

This documentation covers how to interact with the SRT_Translation API using both Python and JavaScript.

## API Endpoint

To use this API, you can use either the `gradio_client` Python library or the `@gradio/client` JavaScript package (see their respective docs for details).

## Python Usage

### Step 1: Installation

First, install the `gradio_client` package if it is not already installed:

```bash
pip install gradio_client
```

### Step 2: Making a Request

Locate the API endpoint for the function you wish to use, and replace the placeholder values in the snippet below with your actual input data. For private Spaces, you may also need to pass your Hugging Face token.

**API Name:** `/predict`

```python
from gradio_client import Client

client = Client("Lenylvt/SRT_Translation-API")
result = client.predict(
    "https://github.com/gradio-app/gradio/raw/main/test/test_files/sample_file.pdf",  # filepath for the 'Upload SRT File' File component (replace with your own .srt file)
    "en",  # Source Language (ISO 639-1 code, e.g. 'en' for English) in the 'Source Language' Dropdown component
    "es",  # Target Language (ISO 639-1 code, e.g. 'es' for Spanish) in the 'Target Language' Dropdown component
    api_name="/predict"
)
print(result)
```

**Return Type(s):**

- A filepath representing the output in the 'Translated SRT' File component.

🔴 If you get the error `Failed to load model for aa to ab: Helsinki-NLP/opus-mt-aa-ab is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>`, it means the requested language pair is not available.
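The error message suggests the Space loads a Helsinki-NLP OPUS-MT model named after the language pair, following the pattern `Helsinki-NLP/opus-mt-<src>-<tgt>`. A hedged sketch (the Space's exact model-selection logic is an assumption inferred from that message): build the implied model id yourself, then check whether it exists on https://huggingface.co/models before calling the API.

```python
# Hedged sketch: construct the OPUS-MT model id the error message implies,
# so a language pair can be sanity-checked before calling the API.
def opus_mt_model_id(source: str, target: str) -> str:
    """Return the Helsinki-NLP model id implied by two ISO 639-1 codes."""
    src, tgt = source.lower().strip(), target.lower().strip()
    if src == tgt:
        raise ValueError("Source and target languages must differ")
    return f"Helsinki-NLP/opus-mt-{src}-{tgt}"

print(opus_mt_model_id("en", "es"))  # Helsinki-NLP/opus-mt-en-es
```

If `https://huggingface.co/<model id>` returns a 404, that pair has no OPUS-MT model and the API call will fail with the error above.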

## JavaScript Usage

### Step 1: Installation

For JavaScript, ensure the `@gradio/client` package is installed in your project:

```bash
npm i -D @gradio/client
```

### Step 2: Making a Request

As with Python, find the API endpoint that suits your needs. Replace the placeholders with your own data. If accessing a private Space, include your Hugging Face token.

**API Name:** `/predict`

```js
import { client } from "@gradio/client";

// Fetch an example file (replace with your own .srt file)
const response_0 = await fetch("https://github.com/gradio-app/gradio/raw/main/test/test_files/sample_file.pdf");
const exampleFile = await response_0.blob();

const app = await client("Lenylvt/SRT_Translation-API");
const result = await app.predict("/predict", [
    exampleFile,  // blob in 'Upload SRT File' File component
    "en",         // string in 'Source Language' Dropdown component
    "es",         // string in 'Target Language' Dropdown component
]);

console.log(result.data);
```

**Return Type(s):**

- A file reference (available in `result.data`) representing the output in the 'Translated SRT' File component.

🔴 If you get the error `Failed to load model for aa to ab: Helsinki-NLP/opus-mt-aa-ab is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>`, it means the requested language pair is not available.