# API Documentation for `Lenylvt/Translator-API`

This documentation explains how to interact with the Translator API using both Python and JavaScript.

## API Endpoint

To interact with this API, you can use either the `gradio_client` Python library or the `@gradio/client` JavaScript package.

## Python Usage

### Step 1: Installation

First, install the `gradio_client` library if it's not already installed.

```bash
pip install gradio_client
```

### Step 2: Making a Request

Locate the API endpoint for the function you intend to use. Replace the placeholder values in the snippet below with your actual input data. If you are accessing a private Space, you may need to include your Hugging Face token.

**API Name**: `/predict`

```python
from gradio_client import Client

client = Client("Lenylvt/Translator-API")
result = client.predict(
    "Hello!!",  # str in 'text' Textbox component
    "en",       # source language (ISO 639-1 code, e.g., 'en' for English) in 'Source Language' Dropdown component
    "es",       # target language (ISO 639-1 code, e.g., 'es' for Spanish) in 'Target Language' Dropdown component
    api_name="/predict"
)
print(result)
```

**Return Type(s):**

- A `str` representing the translated text output in the 'output' Textbox component.

🔴 **If you see this error**: 'Failed to load model for aa to ab: Helsinki-NLP/opus-mt-aa-ab is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=`', **it's because the requested language pair is not available.**

## JavaScript Usage

### Step 1: Installation

Install the `@gradio/client` package if it's not already in your project.

```bash
npm i -D @gradio/client
```

### Step 2: Making a Request

As with Python, identify the API endpoint that matches your requirement. Replace the placeholders with your data. If this is a private Space, don't forget to include your Hugging Face token.

**API Name**: `/predict`

```javascript
import { client } from "@gradio/client";

const app = await client("Lenylvt/Translator-API");
const result = await app.predict("/predict", [
  "Hello!!", // string in 'text' Textbox component
  "en",      // ISO 639-1 code for the source language in 'Source Language' Dropdown component
  "es",      // ISO 639-1 code for the target language in 'Target Language' Dropdown component
]);

console.log(result.data);
```

**Return Type(s):**

- A `string` representing the translated text output in the 'output' Textbox component.

🔴 **If you see this error**: 'Failed to load model for aa to ab: Helsinki-NLP/opus-mt-aa-ab is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=`', **it's because the requested language pair is not available.**
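
## Handling Unsupported Language Pairs

If you want to guard against the error above in your own code, one option is to wrap the call in a `try`/`except`. The sketch below is not part of the Space itself; it assumes that the "Failed to load model for ..." message surfaces on the client side as an exception raised by `client.predict`.

```python
from gradio_client import Client

client = Client("Lenylvt/Translator-API")

def translate(text, source_lang, target_lang):
    """Return the translated text, or None if the language pair is unsupported."""
    try:
        return client.predict(text, source_lang, target_lang, api_name="/predict")
    except Exception as err:
        # Assumption: an unsupported pair (no Helsinki-NLP/opus-mt-<src>-<tgt> model)
        # makes the Space error out, which gradio_client re-raises here.
        print(f"Translation {source_lang} -> {target_lang} failed: {err}")
        return None

print(translate("Hello!!", "en", "es"))  # supported pair: prints the translation
print(translate("Hello!!", "aa", "ab"))  # unsupported pair: prints the error and None
```

Returning `None` instead of letting the exception propagate is just one possible design; you could equally re-raise a custom error or fall back to the original text.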