---
license: cc-by-nc-4.0
language:
- en
- de
- zh
- ru
- he
- fr
- pt
- nl
- ko
- it
- es
metrics:
- comet
pipeline_tag: translation
datasets:
- Unbabel/TowerBlocks-v0.1
---

# Model Card for xTower13B

## Model Details

xTower13B is a language model that results from fine-tuning [TowerBase](https://huggingface.co/Unbabel/TowerBase-13B-v0.1) for **explaining** and **correcting translation errors**.

xTower was fine-tuned on a dataset that includes explanations generated by GPT-4 (prompted with and without references), along with machine translation data from [TowerBlocks](https://huggingface.co/datasets/Unbabel/TowerBlocks-v0.1). We combined all available data to train a single, multilingual model, employing a mixed prompt setting (zero-shot, few-shot) during training. As a result, xTower can handle both referenceless and reference-based k-shot prompts. Our training hyperparameters and configuration follow those used to train [TowerInstruct](https://huggingface.co/Unbabel/TowerInstruct-13B-v0.1).

- **Developed by:** Unbabel, Instituto Superior Técnico, CentraleSupélec University of Paris-Saclay
- **Model type:** A 13B parameter model fine-tuned on a mix of publicly available, synthetic datasets on translation-related tasks, as well as conversational datasets and code instructions.
- **Language(s) (NLP):** English, Portuguese, Spanish, French, German, Dutch, Italian, Korean, Chinese, Russian
- **License:** CC-BY-NC-4.0; Llama 2 is licensed under the [LLAMA 2 Community License](https://ai.meta.com/llama/license/), Copyright © Meta Platforms, Inc. All Rights Reserved.
- **Finetuned from model:** [TowerBase](https://huggingface.co/Unbabel/TowerBase-13B-v0.1)

## Intended uses & limitations

The intended use of the model is two-fold:

- Explain translation errors in natural language
- Correct translation errors by providing a translation suggestion

The model was trained on explanations for the **en-de**, **en-ru**, and **zh-en** language pairs.
Although it builds on a multilingual base model, xTower's performance on language pairs unseen during fine-tuning has not been evaluated and may vary.

### Usage

For a minimal example, check out this [notebook](https://gist.github.com/mtreviso/b618b499bc6de0414a3e11157e91cf02), which prompts xTower with vLLM.

Additionally, here's how you can run the model using the `pipeline()` function from 🤗 Transformers:

```python
# Install transformers from source - only needed for versions <= v4.34
# pip install git+https://github.com/huggingface/transformers.git
# pip install accelerate

import torch
from transformers import pipeline

pipe = pipeline("text-generation", model="sardinelab/xTower13B", device_map="auto")
messages = [
    {"role": "user", "content": "your_prompt"},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=1024, do_sample=False)
```

## Bias, Risks, and Limitations

xTower has not been aligned to human preferences, so the model may generate problematic outputs (e.g., hallucinations, harmful content, or false statements).

## Prompt Format

xTower was trained using the ChatML prompt template without any system prompt. An example follows below:

```
<|im_start|>user
You are provided with a Source, Translation, Translation quality analysis, and Translation quality score (weak, moderate, good, excellent, best). The Translation quality analysis contains a translation with marked error spans with different levels of severity (minor or major). Additionally, we may provide a **reference translation**. Given this information, generate an explanation for each error and a fully correct translation.

English source: All were wearing avalanche beacons.
German translation: Alle trugen Lawinenschilder.
Translation quality analysis: Alle trugen Lawinenschilder.
Translation quality score: weak
<|im_end|>
<|im_start|>assistant
```

After that, the model should generate something like this:

```
Explanation for error1: The term “Lawinenschilder” translates to “avalanche signs”, which is a significant mistranslation of “avalanche beacons”. Avalanche beacons are specific devices used to locate people trapped under snow after an avalanche, not signs. This error changes the meaning of the sentence and introduces confusion about the context.
Translation correction: Alle trugen Lawinensuchgeräte.
<|im_end|>
```

That is, the output consists of a list of explanations, one per error, followed by a final translation correction.

## Citation

```bibtex
todo
```
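For convenience, the user prompt above can be assembled programmatically and the model's output parsed back into explanations and a correction. The helpers below (`build_prompt`, `parse_output`) are an illustrative sketch, not part of any official xTower tooling; they simply follow the field layout and output format shown in the examples above.

```python
def build_prompt(src_lang, tgt_lang, source, translation, analysis, score):
    """Assemble the xTower user message from its individual fields."""
    instruction = (
        "You are provided with a Source, Translation, Translation quality analysis, "
        "and Translation quality score (weak, moderate, good, excellent, best). "
        "The Translation quality analysis contains a translation with marked error "
        "spans with different levels of severity (minor or major). "
        "Given this information, generate an explanation for each error and a "
        "fully correct translation."
    )
    return (
        f"{instruction}\n\n"
        f"{src_lang} source: {source}\n"
        f"{tgt_lang} translation: {translation}\n"
        f"Translation quality analysis: {analysis}\n"
        f"Translation quality score: {score}"
    )


def parse_output(text):
    """Split a generation into per-error explanations and the corrected translation."""
    explanations, correction = [], None
    for line in text.strip().splitlines():
        line = line.strip()
        if line.lower().startswith("explanation for error"):
            explanations.append(line.split(":", 1)[1].strip())
        elif line.lower().startswith("translation correction:"):
            correction = line.split(":", 1)[1].strip()
    return explanations, correction
```

The prompt string returned by `build_prompt` would go into the `"content"` field of the user message in the `pipeline()` example above.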