lbourdois committed
Commit e28abfb
Parent: 6aacd4a

Add multilingual to the language tag


Hi! A PR to add `multilingual` to the `language` tag to improve referencing.
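For context, the `language` field in a model card's YAML front matter feeds the Hub's search filters, so adding `multilingual` lets this model surface in multilingual queries. A minimal sketch of how the tag can be queried with the `huggingface_hub` client (this assumes a recent release in which `list_models` accepts `language` and `limit` keyword arguments; check your installed version):

```python
# Sketch: list Hub models whose card metadata declares the `multilingual` language tag.
from huggingface_hub import HfApi

api = HfApi()

# Each result is a ModelInfo; `id` is the repo id, e.g. "t5-base".
for model in api.list_models(language="multilingual", limit=5):
    print(model.id)
```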

Files changed (1): README.md (+6 -6)
README.md CHANGED

@@ -1,16 +1,16 @@
 ---
-language:
+language:
 - en
 - fr
 - ro
 - de
-datasets:
-- c4
+- multilingual
+license: apache-2.0
 tags:
 - summarization
 - translation
-
-license: apache-2.0
+datasets:
+- c4
 ---
 
 [Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html)
@@ -65,7 +65,7 @@ Authors: *Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang
 
 **Abstract**
 
-Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new “Colossal Clean Crawled Corpus”, we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our dataset, pre-trained models, and code.
+Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new “Colossal Clean Crawled Corpus”, we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our dataset, pre-trained models, and code.
 
 ![model image](https://camo.githubusercontent.com/623b4dea0b653f2ad3f36c71ebfe749a677ac0a1/68747470733a2f2f6d69726f2e6d656469756d2e636f6d2f6d61782f343030362f312a44304a31674e51663876727255704b657944387750412e706e67)
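For readability, here is the metadata block as it stands after this change, reassembled from the diff above:

```yaml
---
language:
- en
- fr
- ro
- de
- multilingual
license: apache-2.0
tags:
- summarization
- translation
datasets:
- c4
---
```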