Commit 80d39c5 (parent: 39d6806) by Abinaya Mahendiran: Updated README

---
language: ta
license: MIT
datasets:
- oscar
- IndicNLP
widget:
- text: 'ஒரு ஊரிலே ஒரு காக்கைக்கு'
---
# GPT2-Tamil

This repository was created as part of the Flax/JAX community week organized by Hugging Face. The aim of this project is to pretrain a GPT-2 language model specifically for Tamil.
 
## Setup:
To set up the project, run the following command:
```bash
pip install -r requirements.txt
```

## Model
A model pretrained on Tamil text using a causal language modeling (CLM) objective.
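Concretely, the CLM objective trains the model to predict each token from the tokens that precede it. A minimal sketch of how inputs and labels are derived for this objective (the token ids below are made up for illustration, not produced by this project's tokenizer):

```python
# Toy illustration of the causal language modeling (CLM) setup:
# the labels are the input ids shifted one position to the left,
# so each position is trained to predict the *next* token.
token_ids = [5, 12, 7, 99, 3]  # hypothetical token ids for one sentence

inputs = token_ids[:-1]   # model sees:     [5, 12, 7, 99]
labels = token_ids[1:]    # model predicts: [12, 7, 99, 3]

for inp, lab in zip(inputs, labels):
    print(f"given token {inp} -> predict token {lab}")
```

In practice the framework handles this shift internally; the sketch only shows what the objective optimizes.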
 
## Dataset Used:
The GPT-2 model is trained on the [oscar dataset - ta](https://huggingface.co/datasets/oscar) and the [IndicNLP dataset - ta](https://indicnlp.ai4bharat.org/corpora/).

## Intended uses & limitations
You can use the raw model for text generation, but it is mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=gpt) to look for fine-tuned versions on a task that interests you.

## How to pretrain the model:
To perform training, follow these steps:

- Export the model directory (where you want to store the model artifacts like config, tokenizer, etc.):
```bash
export MODEL_DIR=<model_dir>
```
- Create the config.json by running the following command:
```bash
python src/create_config.py
```
- Create the tokenizer by running the following command:
```bash
python src/train_tokenizer.py
```
- Once the config and tokenizer are created, run the following script to start training the flax model:
```bash
bash scripts/train_gpt2-oscar-tamil.sh
```
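For orientation, a script like `src/create_config.py` typically just writes a GPT-2 style `config.json` into `MODEL_DIR`. A minimal sketch of that idea in plain Python (the hyperparameter values below are illustrative GPT-2 "small" defaults, not necessarily the ones this project uses):

```python
import json
import os

# Hypothetical sketch of what a config-creation script might write.
# Values are GPT-2 "small" defaults, used here purely as an assumption.
model_dir = os.environ.get("MODEL_DIR", ".")

config = {
    "model_type": "gpt2",
    "vocab_size": 50257,   # assumption: replace with the Tamil tokenizer's vocab size
    "n_positions": 1024,   # maximum sequence length
    "n_embd": 768,         # hidden size
    "n_layer": 12,         # number of transformer blocks
    "n_head": 12,          # attention heads per block
}

with open(os.path.join(model_dir, "config.json"), "w") as f:
    json.dump(config, f, indent=2)
```

The actual script in `src/` remains the source of truth for the configuration used in training.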

## How to use:
To perform language generation with the model, the `pipeline` API can be used directly.

- First, convert the flax model to pytorch using the following command:
```bash
python src/convert_flax_to_pytorch.py
```
- Use the following snippet to perform language generation:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, pipeline, set_seed

model_name = 'abinayam/gpt-2-tamil'
model = AutoModelWithLMHead.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

set_seed(42)  # make generation reproducible

input_text = "ஒரு ஊரிலே ஒரு காக்கைக்கு"
max_len = 300
no_seq = 5

generator = pipeline('text-generation', model=model, tokenizer=tokenizer)
sequence = generator(input_text, max_length=max_len, num_return_sequences=no_seq)
```
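The pipeline call returns a list of dicts, one per returned sequence, each carrying the generated string under the `'generated_text'` key. A small sketch of unpacking that result (shown on a mocked output so it runs without downloading the model):

```python
# Mocked pipeline output, for illustration only: a real call to the
# text-generation pipeline returns a list shaped exactly like this.
sequence = [
    {'generated_text': 'ஒரு ஊரிலே ஒரு காக்கைக்கு (continuation 1)'},
    {'generated_text': 'ஒரு ஊரிலே ஒரு காக்கைக்கு (continuation 2)'},
]

# Each dict holds one generated sequence; collect just the text.
texts = [out['generated_text'] for out in sequence]

for i, text in enumerate(texts, start=1):
    print(f"--- sequence {i} ---")
    print(text)
```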