{ "cells": [ { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "3kZD2vXl4MXU", "outputId": "6c74a96a-46fe-4b49-ceb9-cac5582c45e5" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "/content/drive/MyDrive/QA_pt\n", "Get:1 http://security.ubuntu.com/ubuntu bionic-security InRelease [88.7 kB]\n", "Get:2 https://cloud.r-project.org/bin/linux/ubuntu bionic-cran40/ InRelease [3,626 B]\n", "Ign:3 https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 InRelease\n", "Hit:4 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 InRelease\n", "Hit:5 https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 Release\n", "Get:6 http://ppa.launchpad.net/c2d4u.team/c2d4u4.0+/ubuntu bionic InRelease [15.9 kB]\n", "Hit:7 http://archive.ubuntu.com/ubuntu bionic InRelease\n", "Get:8 http://archive.ubuntu.com/ubuntu bionic-updates InRelease [88.7 kB]\n", "Hit:9 http://ppa.launchpad.net/cran/libgit2/ubuntu bionic InRelease\n", "Hit:10 http://ppa.launchpad.net/deadsnakes/ppa/ubuntu bionic InRelease\n", "Get:11 http://archive.ubuntu.com/ubuntu bionic-backports InRelease [83.3 kB]\n", "Hit:12 http://ppa.launchpad.net/graphics-drivers/ppa/ubuntu bionic InRelease\n", "Get:13 http://security.ubuntu.com/ubuntu bionic-security/restricted amd64 Packages [1,226 kB]\n", "Get:14 http://security.ubuntu.com/ubuntu bionic-security/main amd64 Packages [3,040 kB]\n", "Get:16 http://ppa.launchpad.net/c2d4u.team/c2d4u4.0+/ubuntu bionic/main Sources [2,213 kB]\n", "Get:17 http://ppa.launchpad.net/c2d4u.team/c2d4u4.0+/ubuntu bionic/main amd64 Packages [1,132 kB]\n", "Get:18 http://archive.ubuntu.com/ubuntu bionic-updates/main amd64 Packages [3,472 kB]\n", "Get:19 http://archive.ubuntu.com/ubuntu bionic-updates/restricted amd64 Packages [1,267 kB]\n", "Fetched 12.6 MB in 5s (2,341 kB/s)\n", "Reading package lists... Done\n", "Building dependency tree \n", "Reading state information... Done\n", "8 packages can be upgraded. Run 'apt list --upgradable' to see them.\n", "Reading package lists... Done\n", "Building dependency tree \n", "Reading state information... 
Done\n", "The following package was automatically installed and is no longer required:\n", " libnvidia-common-460\n", "Use 'apt autoremove' to remove it.\n", "The following additional packages will be installed:\n", " aspell aspell-en dictionaries-common emacsen-common libaspell15\n", " libenchant1c2a libhunspell-1.6-0 libtext-iconv-perl\n", "Suggested packages:\n", " aspell-doc spellutils wordlist hunspell libreoffice-writer libenchant-voikko\n", "The following NEW packages will be installed:\n", " aspell aspell-en dictionaries-common emacsen-common enchant hunspell-pt-br\n", " libaspell15 libenchant-dev libenchant1c2a libhunspell-1.6-0\n", " libtext-iconv-perl\n", "0 upgraded, 11 newly installed, 0 to remove and 8 not upgraded.\n", "Need to get 2,359 kB of archives.\n", "After this operation, 11.0 MB of additional disk space will be used.\n", "Get:1 http://archive.ubuntu.com/ubuntu bionic/main amd64 libtext-iconv-perl amd64 1.7-5build6 [13.0 kB]\n", "Get:2 http://archive.ubuntu.com/ubuntu bionic-updates/main amd64 libaspell15 amd64 0.60.7~20110707-4ubuntu0.2 [310 kB]\n", "Get:3 http://archive.ubuntu.com/ubuntu bionic/main amd64 emacsen-common all 2.0.8 [17.6 kB]\n", "Get:4 http://archive.ubuntu.com/ubuntu bionic/main amd64 dictionaries-common all 1.27.2 [186 kB]\n", "Get:5 http://archive.ubuntu.com/ubuntu bionic-updates/main amd64 aspell amd64 0.60.7~20110707-4ubuntu0.2 [87.7 kB]\n", "Get:6 http://archive.ubuntu.com/ubuntu bionic/main amd64 aspell-en all 2017.08.24-0-0.1 [298 kB]\n", "Get:7 http://archive.ubuntu.com/ubuntu bionic/main amd64 hunspell-pt-br all 1:6.0.3-3 [1,163 kB]\n", "Get:8 http://archive.ubuntu.com/ubuntu bionic/main amd64 libhunspell-1.6-0 amd64 1.6.2-1 [154 kB]\n", "Get:9 http://archive.ubuntu.com/ubuntu bionic/main amd64 libenchant1c2a amd64 1.6.0-11.1 [64.4 kB]\n", "Get:10 http://archive.ubuntu.com/ubuntu bionic/main amd64 enchant amd64 1.6.0-11.1 [12.2 kB]\n", "Get:11 http://archive.ubuntu.com/ubuntu bionic/main amd64 libenchant-dev amd64 1.6.0-11.1 [52.2 kB]\n", "Fetched 2,359 kB in 1s (2,208 kB/s)\n", "Preconfiguring packages ...\n", "Selecting previously unselected package libtext-iconv-perl.\n", "(Reading database ... 
123941 files and directories currently installed.)\n", "Preparing to unpack .../00-libtext-iconv-perl_1.7-5build6_amd64.deb ...\n", "Unpacking libtext-iconv-perl (1.7-5build6) ...\n", "Selecting previously unselected package libaspell15:amd64.\n", "Preparing to unpack .../01-libaspell15_0.60.7~20110707-4ubuntu0.2_amd64.deb ...\n", "Unpacking libaspell15:amd64 (0.60.7~20110707-4ubuntu0.2) ...\n", "Selecting previously unselected package emacsen-common.\n", "Preparing to unpack .../02-emacsen-common_2.0.8_all.deb ...\n", "Unpacking emacsen-common (2.0.8) ...\n", "Selecting previously unselected package dictionaries-common.\n", "Preparing to unpack .../03-dictionaries-common_1.27.2_all.deb ...\n", "Adding 'diversion of /usr/share/dict/words to /usr/share/dict/words.pre-dictionaries-common by dictionaries-common'\n", "Unpacking dictionaries-common (1.27.2) ...\n", "Selecting previously unselected package aspell.\n", "Preparing to unpack .../04-aspell_0.60.7~20110707-4ubuntu0.2_amd64.deb ...\n", "Unpacking aspell (0.60.7~20110707-4ubuntu0.2) ...\n", "Selecting previously unselected package aspell-en.\n", "Preparing to unpack .../05-aspell-en_2017.08.24-0-0.1_all.deb ...\n", "Unpacking aspell-en (2017.08.24-0-0.1) ...\n", "Selecting previously unselected package hunspell-pt-br.\n", "Preparing to unpack .../06-hunspell-pt-br_1%3a6.0.3-3_all.deb ...\n", "Unpacking hunspell-pt-br (1:6.0.3-3) ...\n", "Selecting previously unselected package libhunspell-1.6-0:amd64.\n", "Preparing to unpack .../07-libhunspell-1.6-0_1.6.2-1_amd64.deb ...\n", "Unpacking libhunspell-1.6-0:amd64 (1.6.2-1) ...\n", "Selecting previously unselected package libenchant1c2a:amd64.\n", "Preparing to unpack .../08-libenchant1c2a_1.6.0-11.1_amd64.deb ...\n", "Unpacking libenchant1c2a:amd64 (1.6.0-11.1) ...\n", "Selecting previously unselected package enchant.\n", "Preparing to unpack .../09-enchant_1.6.0-11.1_amd64.deb ...\n", "Unpacking enchant (1.6.0-11.1) ...\n", "Selecting previously unselected package libenchant-dev.\n", "Preparing to unpack .../10-libenchant-dev_1.6.0-11.1_amd64.deb ...\n", "Unpacking libenchant-dev (1.6.0-11.1) ...\n", "Setting up libhunspell-1.6-0:amd64 (1.6.2-1) ...\n", "Setting up libaspell15:amd64 (0.60.7~20110707-4ubuntu0.2) ...\n", "Setting up emacsen-common (2.0.8) ...\n", "Setting up libtext-iconv-perl (1.7-5build6) ...\n", "Setting up dictionaries-common (1.27.2) ...\n", "Setting up hunspell-pt-br (1:6.0.3-3) ...\n", "Setting up aspell (0.60.7~20110707-4ubuntu0.2) ...\n", "Setting up libenchant1c2a:amd64 (1.6.0-11.1) ...\n", "Setting up aspell-en (2017.08.24-0-0.1) ...\n", "Setting up enchant (1.6.0-11.1) ...\n", "Setting up libenchant-dev (1.6.0-11.1) ...\n", "Processing triggers for man-db (2.8.3-2ubuntu0.1) ...\n", "Processing triggers for libc-bin (2.27-3ubuntu1.6) ...\n", "Processing triggers for dictionaries-common (1.27.2) ...\n", "aspell-autobuildhash: processing: en [en-common].\n", "aspell-autobuildhash: processing: en [en-variant_0].\n", "aspell-autobuildhash: processing: en [en-variant_1].\n", "aspell-autobuildhash: processing: en [en-variant_2].\n", "aspell-autobuildhash: processing: en [en-w_accents-only].\n", "aspell-autobuildhash: processing: en [en-wo_accents-only].\n", "aspell-autobuildhash: processing: en [en_AU-variant_0].\n", "aspell-autobuildhash: processing: en [en_AU-variant_1].\n", "aspell-autobuildhash: processing: en [en_AU-w_accents-only].\n", "aspell-autobuildhash: processing: en [en_AU-wo_accents-only].\n", "aspell-autobuildhash: processing: en [en_CA-variant_0].\n", 
"aspell-autobuildhash: processing: en [en_CA-variant_1].\n", "aspell-autobuildhash: processing: en [en_CA-w_accents-only].\n", "aspell-autobuildhash: processing: en [en_CA-wo_accents-only].\n", "aspell-autobuildhash: processing: en [en_GB-ise-w_accents-only].\n", "aspell-autobuildhash: processing: en [en_GB-ise-wo_accents-only].\n", "aspell-autobuildhash: processing: en [en_GB-ize-w_accents-only].\n", "aspell-autobuildhash: processing: en [en_GB-ize-wo_accents-only].\n", "aspell-autobuildhash: processing: en [en_GB-variant_0].\n", "aspell-autobuildhash: processing: en [en_GB-variant_1].\n", "aspell-autobuildhash: processing: en [en_US-w_accents-only].\n", "aspell-autobuildhash: processing: en [en_US-wo_accents-only].\n", "Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/\n", "Collecting pyenchant\n", " Downloading pyenchant-3.2.2-py3-none-any.whl (55 kB)\n", "\u001b[K |████████████████████████████████| 55 kB 2.2 MB/s \n", "\u001b[?25hCollecting datasets\n", " Downloading datasets-2.6.1-py3-none-any.whl (441 kB)\n", "\u001b[K |████████████████████████████████| 441 kB 9.4 MB/s \n", "\u001b[?25hRequirement already satisfied: numpy>=1.17 in /usr/local/lib/python3.7/dist-packages (from datasets) (1.21.6)\n", "Requirement already satisfied: pyyaml>=5.1 in /usr/local/lib/python3.7/dist-packages (from datasets) (6.0)\n", "Requirement already satisfied: tqdm>=4.62.1 in /usr/local/lib/python3.7/dist-packages (from datasets) (4.64.1)\n", "Requirement already satisfied: aiohttp in /usr/local/lib/python3.7/dist-packages (from datasets) (3.8.3)\n", "Requirement already satisfied: pandas in /usr/local/lib/python3.7/dist-packages (from datasets) (1.3.5)\n", "Requirement already satisfied: packaging in /usr/local/lib/python3.7/dist-packages (from datasets) (21.3)\n", "Requirement already satisfied: pyarrow>=6.0.0 in /usr/local/lib/python3.7/dist-packages (from datasets) (6.0.1)\n", "Collecting xxhash\n", " Downloading xxhash-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (212 kB)\n", "\u001b[K |████████████████████████████████| 212 kB 47.8 MB/s \n", "\u001b[?25hCollecting huggingface-hub<1.0.0,>=0.2.0\n", " Downloading huggingface_hub-0.10.1-py3-none-any.whl (163 kB)\n", "\u001b[K |████████████████████████████████| 163 kB 55.7 MB/s \n", "\u001b[?25hRequirement already satisfied: fsspec[http]>=2021.11.1 in /usr/local/lib/python3.7/dist-packages (from datasets) (2022.10.0)\n", "Requirement already satisfied: importlib-metadata in /usr/local/lib/python3.7/dist-packages (from datasets) (4.13.0)\n", "Collecting responses<0.19\n", " Downloading responses-0.18.0-py3-none-any.whl (38 kB)\n", "Collecting multiprocess\n", " Downloading multiprocess-0.70.14-py37-none-any.whl (115 kB)\n", "\u001b[K |████████████████████████████████| 115 kB 54.3 MB/s \n", "\u001b[?25hCollecting dill<0.3.6\n", " Downloading dill-0.3.5.1-py2.py3-none-any.whl (95 kB)\n", "\u001b[K |████████████████████████████████| 95 kB 5.1 MB/s \n", "\u001b[?25hRequirement already satisfied: requests>=2.19.0 in /usr/local/lib/python3.7/dist-packages (from datasets) (2.23.0)\n", "Requirement already satisfied: typing-extensions>=3.7.4 in /usr/local/lib/python3.7/dist-packages (from aiohttp->datasets) (4.1.1)\n", "Requirement already satisfied: multidict<7.0,>=4.5 in /usr/local/lib/python3.7/dist-packages (from aiohttp->datasets) (6.0.2)\n", "Requirement already satisfied: yarl<2.0,>=1.0 in /usr/local/lib/python3.7/dist-packages (from aiohttp->datasets) (1.8.1)\n", "Requirement already 
satisfied: charset-normalizer<3.0,>=2.0 in /usr/local/lib/python3.7/dist-packages (from aiohttp->datasets) (2.1.1)\n", "Requirement already satisfied: asynctest==0.13.0 in /usr/local/lib/python3.7/dist-packages (from aiohttp->datasets) (0.13.0)\n", "Requirement already satisfied: attrs>=17.3.0 in /usr/local/lib/python3.7/dist-packages (from aiohttp->datasets) (22.1.0)\n", "Requirement already satisfied: aiosignal>=1.1.2 in /usr/local/lib/python3.7/dist-packages (from aiohttp->datasets) (1.2.0)\n", "Requirement already satisfied: frozenlist>=1.1.1 in /usr/local/lib/python3.7/dist-packages (from aiohttp->datasets) (1.3.1)\n", "Requirement already satisfied: async-timeout<5.0,>=4.0.0a3 in /usr/local/lib/python3.7/dist-packages (from aiohttp->datasets) (4.0.2)\n", "Requirement already satisfied: filelock in /usr/local/lib/python3.7/dist-packages (from huggingface-hub<1.0.0,>=0.2.0->datasets) (3.8.0)\n", "Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.7/dist-packages (from packaging->datasets) (3.0.9)\n", "Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.7/dist-packages (from requests>=2.19.0->datasets) (3.0.4)\n", "Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.7/dist-packages (from requests>=2.19.0->datasets) (1.24.3)\n", "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.7/dist-packages (from requests>=2.19.0->datasets) (2022.9.24)\n", "Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.7/dist-packages (from requests>=2.19.0->datasets) (2.10)\n", "Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1\n", " Downloading urllib3-1.25.11-py2.py3-none-any.whl (127 kB)\n", "\u001b[K |████████████████████████████████| 127 kB 54.2 MB/s \n", "\u001b[?25hRequirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.7/dist-packages (from importlib-metadata->datasets) (3.10.0)\n", "Collecting multiprocess\n", " Downloading multiprocess-0.70.13-py37-none-any.whl (115 kB)\n", "\u001b[K |████████████████████████████████| 115 kB 55.2 MB/s \n", "\u001b[?25hRequirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.7/dist-packages (from pandas->datasets) (2022.5)\n", "Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.7/dist-packages (from pandas->datasets) (2.8.2)\n", "Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.7/dist-packages (from python-dateutil>=2.7.3->pandas->datasets) (1.15.0)\n", "Installing collected packages: urllib3, dill, xxhash, responses, multiprocess, huggingface-hub, pyenchant, datasets\n", " Attempting uninstall: urllib3\n", " Found existing installation: urllib3 1.24.3\n", " Uninstalling urllib3-1.24.3:\n", " Successfully uninstalled urllib3-1.24.3\n", " Attempting uninstall: dill\n", " Found existing installation: dill 0.3.6\n", " Uninstalling dill-0.3.6:\n", " Successfully uninstalled dill-0.3.6\n", "Successfully installed datasets-2.6.1 dill-0.3.5.1 huggingface-hub-0.10.1 multiprocess-0.70.13 pyenchant-3.2.2 responses-0.18.0 urllib3-1.25.11 xxhash-3.1.0\n" ] } ], "source": [ "import os\n", "\n", "workdir = \"/content/drive/MyDrive/QA_pt\"\n", "cache_dir = f\"{workdir}/cache_huggingface\"\n", "\n", "# a `!export` runs in a throwaway subshell and does not persist, so set the cache path from Python\n", "os.environ[\"HF_DATASETS_CACHE\"] = cache_dir\n", "\n", "!mkdir -p {workdir}\n", "%cd {workdir}\n", "\n", "!apt update -y\n", "!apt install -y enchant libenchant-dev hunspell-pt-br\n", "!pip install pyenchant datasets" ] },
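{ "cell_type": "markdown", "metadata": {}, "source": [ "Optional sanity check, as a minimal sketch (assuming the `apt` packages above installed cleanly): confirm that pyenchant can actually see the `pt_BR` hunspell dictionary before it is used to filter questions. `enchant.list_languages()` and `enchant.dict_exists()` are standard pyenchant calls." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import enchant\n", "\n", "# dictionaries visible to pyenchant; pt_BR should be listed after installing hunspell-pt-br\n", "print(enchant.list_languages())\n", "assert enchant.dict_exists(\"pt_BR\"), \"pt_BR dictionary not found by pyenchant\"" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": 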
"cl0mVatz7EIo" }, "outputs": [], "source": [ "from datasets import load_dataset\n", "import enchant\n", "import re\n", "\n", "def is_portuguese(text: str) -> bool:\n", " words = re.split(r'\\s', text)\n", " words = [w for w in words if w.isalpha()]\n", " pt_words = [w for w in words if spell.check(w)]\n", "\n", " return len(pt_words) >= 3\n", "\n", "spell = enchant.Dict(\"pt_BR\")" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 363, "referenced_widgets": [ "39519ff2453c440ba6e82b6eb79e16c5", "794b5bbd816f4d069981459e847fe251", "8d392d3b957b460ba744f4b58d233450", "5a58cd791fc2485dbbbe6237fa6d5fd6", "6c172d851a8f4a3a9724cd3857280678", "22d8500e60ef47c1a96530eb72d9cd61", "70d042af273c4ac6a5c02bec875bd69d", "87c1586197ff420dbafde5bd872d2459", "7a4cb950b4a842d6b9e8a325dfc777c3", "d20d2b8e4d334963826f86aec167b2cb", "ef313f6dc35f4fef97b38f8dbb032ceb", "da700f026a894b249a0a29cde51eeb5a", "47843bdedddd498db3d5074ef939dd83", "b9b92087f5b1405eba29551f7dc32a0f", "9650446ad3674ceeac52e0668f43053b", "a97981e290974c078e7ffae855cf952d", "1d7539b118884f7bb33f9bbf4af7ee5e", "ddbca05e6c394081959293618a7bc6d9", "346b724e17044100836e0e6f90b8a697", "13913324db1d48f9a35936dec2382937", "21bfd3d878614d6e93f8263e9cbe7146", "b0539a44e071452dbf6b3a2dd2c76247", "5e18bed5622e4a75b11583c6d445f2c7", "74136fb55b334458b283b87a2d6ed1a8", "f77e5deb9d434b0eb7ba65b8702ac1ad", "7e5e142ca2f041fa9b0493d4e34b0723", "a3f69bba949a4057bdd7bb37bdac4448", "d8c7a6775dbb43c69de391e6437b4a6d", "3bf46651b667460caa0a50a70f242d5d", "772667b7753046459edb66e41607adfa", "19039359d80840449890df08547f708e", "c3c74690294d4681bdd9bb5e3eb93b40", "560112fda53241b2a46cdeac7d2860d1", "b6a6ae9b1168400a9556747472e4969d", "9271d89ac1044b1c93f3c6adc0726d63", "1d5dc6da5d8f4a7f83f78487d3fe62ab", "c63eb4e2888e4273a170a20dd1c97e49", "c008c2ad4cca493eac6ed494568d063a", "ace6c743bccb4c869022fe051a9e87a5", "94c2e9f4174a4b75a4734daa70976f2a", "2f8f0dda26234ee38dcd6e56a11ab91e", "859ff0b578554530bb534bd1a892bf52", "683eb672a3d2471f887fa4fa2e854c49", "bd7d747c737a43ee8136f8290f329dbe", "15a3c38d5ce746218936bb7161b212f3", "8aaf9b807eeb4b059ee7456b3a3d6eff", "34072f77354f4a9bbf1dbcefded34cfa", "348c21e10bde4a6d83bbe8ebe9d6151d", "e3d718c7affd44deb2026b07203484e3", "b774f861aa0a44e49c6e9e136a89f3be", "e0e478ffdb7441da86a1f026f3fef6fd", "59051aa3327b428fae12ee47f0432427", "933c2bc9e68f431c95a0e7e40faee7dd", "64f04aaa8ff04d9f9955dd97cc0fbc03", "0647bacb379a42a0971375d29ab3c124", "17459dc7bc434fa982d9e72dbc2e0035", "342d5898644d4bd69350a8daead30ab3", "29ca22015ea74c788d10eab0cc48fef4", "24d17de8b0af401a8c965fe5b7c5471f", "6a1b1a24d1b74543bdd9bca802f4c73a", "51667258bba04cecb647bf4ceadda6d9", "efa5756d82704c8483d9732e967589e0", "81194bc0cf284738a74ce19f25ccfad1", "e3969e656fae4d55998491e732c6e03d", "cd99ace3cebd4fe2b96d0d41e6bbfa8f", "ad1e0908e02c4c9c99d7b3d278190a39", "578699de768744719bc4a2c577027442", "bd4bc55ec52a452a8c8d6f614db9b1c3", "253da471dfdc405f810a5d4e87ad09f5", "12d7e13c8b4a4cf28abded56f2af6857", "da0ea8191b69456f86637e7f95b262a5", "3de45dedbc7b494184390c4ef1faf89d", "acb2634405034d72bfbf2a76531c9b08", "97daa1fd120e4fbe8e1b4f67bfc36bbf", "65cc089a3ee74d7b9a213ab4a40fbe5f", "ea2670b5bfaa4d8bbebf3e22864e02ea", "0e226f5b3abb4d228ccf9d18a6ea1056", "04b5ab0c6c254d428cce47096ca8c6cf", "28b941b46df44de5b5831428ae175b5e", "57cab6070edf41adb1564b14eb294790", "ed8e421fb2a34e69862803b207f70af1", "8e8a3d0d25d54bc1bb5ab0612dae0a3b", "adce5b335a5746439c59d5fce8ef6292", 
"256aa79b119445e39788766fe83fe3d9", "3969e5cf1ce148b1b86d56de13c42a77", "e411398f66464abda327d17ecb8bbf24", "1c4de5e2a1ab4b8b828e53738fcb9e55", "887f98f8f933473e920ecc7e33f89d32" ] }, "id": "LiFfjiC94FeE", "outputId": "2492824c-caaa-4b82-ab06-8893f3371171" }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "WARNING:datasets.builder:Using custom data configuration pt-all-question-language=pt\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Downloading and preparing dataset mqa/pt-all-question to /content/drive/MyDrive/QA_pt/cache_huggingface/clips___mqa/pt-all-question-language=pt/0.0.0/7eda4cdcbd6f009259fc516f204d776915a5f54ea2ad414c3dcddfaacd4dfe0b...\n" ] }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "39519ff2453c440ba6e82b6eb79e16c5", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Downloading data: 0%| | 0.00/863M [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "da700f026a894b249a0a29cde51eeb5a", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Downloading data: 0%| | 0.00/191M [00:00, ?B/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "5e18bed5622e4a75b11583c6d445f2c7", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Generating train split: 0 examples [00:00, ? examples/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "Dataset mqa downloaded and prepared to /content/drive/MyDrive/QA_pt/cache_huggingface/clips___mqa/pt-all-question-language=pt/0.0.0/7eda4cdcbd6f009259fc516f204d776915a5f54ea2ad414c3dcddfaacd4dfe0b. Subsequent calls will reuse this data.\n" ] }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "b6a6ae9b1168400a9556747472e4969d", "version_major": 2, "version_minor": 0 }, "text/plain": [ " 0%| | 0/1 [00:00, ?it/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stderr", "output_type": "stream", "text": [ "WARNING:datasets.builder:Using custom data configuration pt-faq-question-language=pt,scope=faq\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Downloading and preparing dataset mqa/pt-faq-question to /content/drive/MyDrive/QA_pt/cache_huggingface/clips___mqa/pt-faq-question-language=pt,scope=faq/0.0.0/7eda4cdcbd6f009259fc516f204d776915a5f54ea2ad414c3dcddfaacd4dfe0b...\n" ] }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "15a3c38d5ce746218936bb7161b212f3", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Generating train split: 0 examples [00:00, ? examples/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "Dataset mqa downloaded and prepared to /content/drive/MyDrive/QA_pt/cache_huggingface/clips___mqa/pt-faq-question-language=pt,scope=faq/0.0.0/7eda4cdcbd6f009259fc516f204d776915a5f54ea2ad414c3dcddfaacd4dfe0b. 
Subsequent calls will reuse this data.\n" ] }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "17459dc7bc434fa982d9e72dbc2e0035", "version_major": 2, "version_minor": 0 }, "text/plain": [ " 0%| | 0/1 [00:00, ?it/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stderr", "output_type": "stream", "text": [ "WARNING:datasets.builder:Using custom data configuration pt-cqa-question-language=pt,scope=cqa\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Downloading and preparing dataset mqa/pt-cqa-question to /content/drive/MyDrive/QA_pt/cache_huggingface/clips___mqa/pt-cqa-question-language=pt,scope=cqa/0.0.0/7eda4cdcbd6f009259fc516f204d776915a5f54ea2ad414c3dcddfaacd4dfe0b...\n" ] }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "578699de768744719bc4a2c577027442", "version_major": 2, "version_minor": 0 }, "text/plain": [ "Generating train split: 0 examples [00:00, ? examples/s]" ] }, "metadata": {}, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "Dataset mqa downloaded and prepared to /content/drive/MyDrive/QA_pt/cache_huggingface/clips___mqa/pt-cqa-question-language=pt,scope=cqa/0.0.0/7eda4cdcbd6f009259fc516f204d776915a5f54ea2ad414c3dcddfaacd4dfe0b. Subsequent calls will reuse this data.\n" ] }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "04b5ab0c6c254d428cce47096ca8c6cf", "version_major": 2, "version_minor": 0 }, "text/plain": [ " 0%| | 0/1 [00:00, ?it/s]" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "all_data = load_dataset(\"clips/mqa\", language=\"pt\", cache_dir=cache_dir)\n", "filtered_data = all_data.filter(lambda example: '?' in example['name'] and is_portuguese(example['name']))" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/", "height": 49, "referenced_widgets": [ "8e6e5d2885c743f7bbe30211f5a83eb8", "23ad95fd1bd349a09b86464455143251", "4c956c5660e04cfa8da9c5a4dd0adb8d", "ee49b9eaff5a4c7c97903469e39b5a74", "48d1c37e9fca48e5b7f56486aed206d5", "83466bdcbb374d7fa2420b60866f4a72", "3772b28257ea40479334c461641865e7", "1a8e6da0b36c405aa02e04b72443c8d7", "d972daf6ce104293931285fafd157e84", "9963fd4eb202458d903d3010d60dac4d", "0bac84c4a05d4af3a91b13bf102025c4" ] }, "id": "6QWGmWErmiKF", "outputId": "8c230dfa-4b10-48da-e112-05c3c5f1e763" }, "outputs": [ { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "8e6e5d2885c743f7bbe30211f5a83eb8", "version_major": 2, "version_minor": 0 }, "text/plain": [ " 0%| | 0/5477 [00:00, ?ba/s]" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "COLUMNS = [\n", " 'id',\n", " 'domain',\n", " 'bucket',\n", " 'question_title',\n", " 'question_text',\n", " 'answer_title',\n", " 'answer_text',\n", " 'is_accepted',\n", "]\n", "\n", "def format(data):\n", " prepro_data = {column: list() for column in COLUMNS}\n", "\n", " for question_idx in range(len(data['id'])):\n", "\n", " for answer_idx, answer in enumerate(data['answers'][question_idx]):\n", " prepro_data['id'].append(data['id'][question_idx] + '-' + str(answer_idx))\n", "\n", "\n", " for column in ['domain', 'bucket']:\n", " prepro_data[column].append(data[column][question_idx])\n", "\n", " prepro_data['question_title'].append(data['name'][question_idx])\n", " prepro_data['question_text'].append(data['text'][question_idx])\n", " prepro_data['answer_title'].append(answer['name'])\n", " prepro_data['answer_text'].append(answer['text'])\n", " 
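# is_accepted records whether this answer was marked as the accepted one for its question\n", " 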
prepro_data['is_accepted'].append(answer['is_accepted'])\n", " \n", " return prepro_data\n", "\n", "# one flattened row per (question, answer) pair; the nested source columns are dropped\n", "prepro_data = filtered_data.map(\n", " format,\n", " batched=True,\n", " remove_columns=[\n", " 'text',\n", " 'name',\n", " 'answers',\n", " ]\n", ")" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "-wwtYLlR8vpH" }, "outputs": [], "source": [ "prepro_data.save_to_disk('prepro_data')" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "dqYa1-Z37Joo" }, "outputs": [], "source": [ "df = prepro_data['train'].to_pandas()\n", "df.to_csv('QA_PT.tsv', sep='\\t', index=False)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "jIQYY09nfGo2", "outputId": "99da26af-a218-4dec-dad7-5c65814c606e" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\u001b[0m\u001b[01;34mcache_huggingface\u001b[0m/ \u001b[01;34mprepro_data\u001b[0m/ QA_PT.tsv\n" ] } ], "source": [ "%ls" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "hG5SKkFJeokt", "outputId": "68d4eb12-7917-45ba-db9e-12eb964818a2" }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/usr/local/lib/python3.7/dist-packages/IPython/core/interactiveshell.py:3326: DtypeWarning: Columns (4,5) have mixed types.Specify dtype option on import or set low_memory=False.\n", " exec(code_obj, self.user_global_ns, self.user_ns)\n" ] } ], "source": [ "import pandas as pd\n", "\n", "df = pd.read_table('QA_PT.tsv', lineterminator='\\n')" ] },
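{ "cell_type": "markdown", "metadata": {}, "source": [ "The `DtypeWarning` above appears because pandas infers column types chunk by chunk, and with the layout written earlier (see `COLUMNS`) columns 4 and 5 are `question_text` and `answer_title`, which mix strings with missing values. A minimal sketch of a stricter reload, assuming the same `QA_PT.tsv` layout: read every column as a string and rebuild the boolean flag." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import pandas as pd\n", "\n", "# read all columns as strings to avoid chunked dtype inference, then restore the boolean flag\n", "df = pd.read_table('QA_PT.tsv', lineterminator='\\n', dtype=str)\n", "df['is_accepted'] = df['is_accepted'] == 'True'" ] },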
\n", " | question_title | \n", "question_text | \n", "answer_title | \n", "answer_text | \n", "is_accepted | \n", "
---|---|---|---|---|---|
0 | \n", "o que é um arranjo? | \n", "NaN | \n", "NaN | \n", "um arranjo é onde as pessoas são diretas umas ... | \n", "True | \n", "
1 | \n", "qual a quantidade de vezes que o shih tzu tem ... | \n", "NaN | \n", "NaN | \n", "a quantidade de vezes que um shih tzu deve com... | \n", "True | \n", "
2 | \n", "o que eu posso dar para o shih tzu comer além ... | \n", "NaN | \n", "NaN | \n", "É indicado que a alimentação de um cachorro se... | \n", "True | \n", "
3 | \n", "qual é a melhor marca de ração para shih tzu? | \n", "NaN | \n", "NaN | \n", "dentre quais as melhores rações para shih tzu ... | \n", "True | \n", "
4 | \n", "como dar ração para filhotes de shih tzu? | \n", "NaN | \n", "NaN | \n", "a oferta da ração para um shih tzu filhote var... | \n", "True | \n", "
... | \n", "... | \n", "... | \n", "... | \n", "... | \n", "... | \n", "
5753875 | \n", "No Stimulsoft o @CAMPO(campo) só pode ser util... | \n", "Seria possível criar uma variável qualquer par... | \n", "NaN | \n", "Para \"pegar\" o valor do campo (sem ser no SQL ... | \n", "False | \n", "
5753876 | \n", "Como alterar o valor do progresso de um proces... | \n", "Tenho um BusinessComponent que executa um proc... | \n", "NaN | \n", "A classe ProcessLog é indicado para realizar ... | \n", "True | \n", "
5753877 | \n", "No Builder, para que serve os campos \"Herdar d... | \n", "No Builder, para que serve os campos \"Herdar d... | \n", "NaN | \n", "Serve para herdar campos de uma tabela relacio... | \n", "True | \n", "
5753878 | \n", "Existe algum método que retorne o arquivo do B... | \n", "Preciso realizar uma rotina que irá ler vários... | \n", "NaN | \n", "Utilize a classe Benner.Tecnologia.Common.File... | \n", "True | \n", "
5753879 | \n", "Quais os package sources preciso incluir no Vi... | \n", "Quero compilar o BEF. Quais os servidores nuge... | \n", "NaN | \n", "Seriam esses:\\n\\n\\nNuget Benner: <http://desen... | \n", "True | \n", "
5753880 rows × 5 columns
\n", "