{"cells":[{"cell_type":"markdown","metadata":{"id":"8CWh7AqSH7ug"},"source":["![JohnSnowLabs](https://nlp.johnsnowlabs.com/assets/images/logo.png)\n","\n","[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/JohnSnowLabs/nlu/blob/master/examples/colab/component_examples/sequence2sequence/NLU_M2M100Transformer.ipynb)\n","\n","# M2M100Transformer\n","\n","M2M100 is a many-to-many multilingual translation model that can translate directly between any pair of 100 languages, covering 9,900 translation directions. Unlike earlier systems that pivoted through English as an intermediary, M2M100 translates directly between non-English languages, avoiding English-centric bias. This is made possible by a training dataset, built through large-scale mining, that provides supervised data for thousands of language directions.\n","\n","Building on M2M100, the M2M100Transformer annotator in Spark NLP is designed for efficient, high-quality multilingual translation tasks, enabling direct and fluent translation across a vast array of languages.\n","\n","Key features of M2M100 include true many-to-many translation, training data that moves beyond English-centric corpora, and increased model capacity through dense scaling and language-specific sparse parameters. The model delivers significant improvements in translation quality, particularly for non-English language pairs, with gains of more than 10 BLEU points.\n","\n","Integrated into Spark NLP as the M2M100Transformer annotator, this model supports scalable, efficient, and accurate translation, making it a valuable tool for global communication. 
Its open-source nature encourages further research and development in the field of multilingual translation, aiming to provide inclusive and high-quality translation solutions.\n","\n","**Read More**: [Paper](https://arxiv.org/pdf/2010.11125.pdf)"]},{"cell_type":"code","execution_count":null,"metadata":{"id":"bWKJ_Mz9CYlp"},"outputs":[],"source":["! pip install johnsnowlabs"]},{"cell_type":"code","execution_count":null,"metadata":{"executionInfo":{"elapsed":10,"status":"ok","timestamp":1711209672884,"user":{"displayName":"Samed Koçer","userId":"16161902236051002702"},"user_tz":240},"id":"xXU0VJl1AGsg"},"outputs":[],"source":["from johnsnowlabs import nlp\n","\n","spark = nlp.start()"]},{"cell_type":"code","execution_count":2,"metadata":{},"outputs":[{"data":{"text/html":["\n","            <div>\n","                <p><b>SparkSession - in-memory</b></p>\n","                \n","        <div>\n","            <p><b>SparkContext</b></p>\n","\n","            <p><a href=\"http://10.0.0.17:4040\">Spark UI</a></p>\n","\n","            <dl>\n","              <dt>Version</dt>\n","                <dd><code>v3.5.1</code></dd>\n","              <dt>Master</dt>\n","                <dd><code>local[*]</code></dd>\n","              <dt>AppName</dt>\n","                <dd><code>John-Snow-Labs-Spark-Session 🚀 with Jars for: 🚀Spark-NLP==5.3.2, running on ⚡ PySpark==3.4.0</code></dd>\n","            </dl>\n","        </div>\n","        \n","            </div>\n","        "],"text/plain":["<pyspark.sql.session.SparkSession at 0x12f655bb0>"]},"execution_count":2,"metadata":{},"output_type":"execute_result"}],"source":["spark"]},{"cell_type":"code","execution_count":3,"metadata":{"colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"elapsed":382525,"status":"ok","timestamp":1711210065564,"user":{"displayName":"Samed 
Koçer","userId":"16161902236051002702"},"user_tz":240},"id":"8-Wrfea9BIbt","outputId":"c300d675-4e8d-452b-ad31-a5682389efe1"},"outputs":[{"name":"stdout","output_type":"stream","text":["Warning::Spark Session already created, some configs may not take.\n","Warning::Spark Session already created, some configs may not take.\n","m2m100_418M download started this may take some time.\n"]},{"name":"stderr","output_type":"stream","text":["24/05/18 03:57:03 WARN SparkSession: Using an existing Spark session; only runtime SQL configurations will take effect.\n"]},{"name":"stdout","output_type":"stream","text":["Approximate size to download 2.8 GB\n","[ | ]m2m100_418M download started this may take some time.\n","Approximate size to download 2.8 GB\n","[ / ]Download done! Loading the resource.\n","[ | ]Using CPUs\n","[ / ]Using CPUs\n","[OK!]\n"]}],"source":["model = nlp.load(\"xx.m2m100_418M\")"]},{"cell_type":"code","execution_count":4,"metadata":{"executionInfo":{"elapsed":409,"status":"ok","timestamp":1711210071261,"user":{"displayName":"Samed Koçer","userId":"16161902236051002702"},"user_tz":240},"id":"7kfGLm4eC5U8"},"outputs":[],"source":["text = [\"\"\"生活就像一盒巧克力。\"\"\"]"]},{"cell_type":"code","execution_count":5,"metadata":{"colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"elapsed":19861,"status":"ok","timestamp":1711210099698,"user":{"displayName":"Samed Koçer","userId":"16161902236051002702"},"user_tz":240},"id":"R05G4z7QD_Pp","outputId":"7368203d-6ea6-4a32-cc71-1efe7f1062a4"},"outputs":[{"name":"stdout","output_type":"stream","text":["sentence_detector_dl download started this may take some time.\n","Approximate size to download 354.6 KB\n","[ | ]sentence_detector_dl download started this may take some time.\n","Approximate size to download 354.6 KB\n","Download done! 
Loading the resource.\n","[ / ]"]},{"name":"stderr","output_type":"stream","text":["2024-05-18 03:57:31.976061: W external/org_tensorflow/tensorflow/core/platform/profile_utils/cpu_utils.cc:128] Failed to get CPU frequency: 0 Hz\n","WARNING: An illegal reflective access operation has occurred\n","WARNING: Illegal reflective access by org.apache.spark.util.SizeEstimator$ (file:/opt/homebrew/Cellar/apache-spark/3.5.1/libexec/jars/spark-core_2.12-3.5.1.jar) to field java.lang.ref.Reference.referent\n","WARNING: Please consider reporting this to the maintainers of org.apache.spark.util.SizeEstimator$\n","WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations\n","WARNING: All illegal access operations will be denied in a future release\n"]},{"name":"stdout","output_type":"stream","text":["[OK!]\n","Warning::Spark Session already created, some configs may not take.\n"]},{"name":"stderr","output_type":"stream","text":["                                                                                \r"]}],"source":["translation_df = model.predict(text)"]},{"cell_type":"code","execution_count":6,"metadata":{"colab":{"base_uri":"https://localhost:8080/","height":89},"executionInfo":{"elapsed":695,"status":"ok","timestamp":1711210111138,"user":{"displayName":"Samed Koçer","userId":"16161902236051002702"},"user_tz":240},"id":"l0o0NannEaNU","outputId":"da18f5ca-b68c-4bc4-b869-9f183be24b12"},"outputs":[{"data":{"text/html":["<div>\n","<style scoped>\n","    .dataframe tbody tr th:only-of-type {\n","        vertical-align: middle;\n","    }\n","\n","    .dataframe tbody tr th {\n","        vertical-align: top;\n","    }\n","\n","    .dataframe thead th {\n","        text-align: right;\n","    }\n","</style>\n","<table border=\"1\" class=\"dataframe\">\n","  <thead>\n","    <tr style=\"text-align: right;\">\n","      <th></th>\n","      <th>generated</th>\n","      <th>sentence</th>\n","    </tr>\n","  </thead>\n","  <tbody>\n","    <tr>\n","      <th>0</th>\n","      <td>La vie est comme une boîte de chocolat.</td>\n","      <td>生活就像一盒巧克力。</td>\n","    </tr>\n","  </tbody>\n","</table>\n","</div>"],"text/plain":["                                 generated    sentence\n","0  La vie est comme une boîte de chocolat.  生活就像一盒巧克力。"]},"execution_count":6,"metadata":{},"output_type":"execute_result"}],"source":["translation_df"]}],"metadata":{"accelerator":"GPU","colab":{"authorship_tag":"ABX9TyOx1i+8KuAGrD/Flw84HYQm","gpuType":"T4","machine_shape":"hm","provenance":[]},"kernelspec":{"display_name":"Python 3","name":"python3"},"language_info":{"codemirror_mode":{"name":"ipython","version":3},"file_extension":".py","mimetype":"text/x-python","name":"python","nbconvert_exporter":"python","pygments_lexer":"ipython3","version":"3.9.19"}},"nbformat":4,"nbformat_minor":0}
