---
library_name: peft
base_model: t5-small
license: apache-2.0
datasets:
- opus100
tags:
- translation
- safetensors
- transformers
language:
- en
- fr
---

# Model Card for eng2french-t5-small

A T5-small model fine-tuned with a PEFT adapter on the **opus100** dataset for *English to French* translation.


## Model Description

- **Model type:** Language Model
- **Language(s) (NLP):** English, French
- **License:** Apache 2.0
- **Finetuned from model:** [T5-small](https://huggingface.co/t5-small)


## Uses

The model is intended for English-to-French translation tasks.


## How to Get Started with the Model


Install the necessary libraries:
```bash
pip install transformers peft accelerate
```
Use the code below to get started with the model.

```python
from peft import PeftModel, PeftConfig
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Load the tokenizer from the adapter repository and the t5-small base model,
# then apply the PEFT adapter weights on top of the base model.
tokenizer = AutoTokenizer.from_pretrained("dmedhi/eng2french-t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
model = PeftModel.from_pretrained(model, "dmedhi/eng2french-t5-small")

# Tokenize an English sentence and generate its French translation.
context = tokenizer(["Do you want coffee?"], return_tensors='pt')
output = model.generate(**context)
result = tokenizer.decode(output[0], skip_special_tokens=True)
print(result)

# Output
# Tu veux du café?
```
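
For repeated inference you can optionally fold the adapter weights into the base model so the PEFT wrapper is no longer needed. This is a minimal sketch using PEFT's `merge_and_unload()`, and it assumes the adapter is a mergeable (e.g. LoRA-style) adapter; the output directory name is illustrative:

```python
# Optional: merge the adapter weights into the base model for standalone use.
# Assumes a mergeable (e.g. LoRA) adapter; returns a plain transformers model.
merged_model = model.merge_and_unload()
merged_model.save_pretrained("eng2french-t5-small-merged")
tokenizer.save_pretrained("eng2french-t5-small-merged")
```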

## Training Details

### Training Data

- Dataset used: [Opus100](https://huggingface.co/datasets/opus100)
- Subset: "en-fr"
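
The card does not include the data-loading or preprocessing code; as a minimal sketch, the subset can be loaded with the `datasets` library (the config name and example layout below follow the public `opus100` dataset, not this card):

```python
from datasets import load_dataset

# Load the English-French subset of OPUS-100.
dataset = load_dataset("opus100", "en-fr")

# Each example is a dict: {"translation": {"en": "...", "fr": "..."}}
print(dataset["train"][0]["translation"].keys())
```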


## Evaluation

Final training metrics as reported by the Trainer (the card does not report held-out evaluation scores such as BLEU):

- global_step = 5000
- training_loss = 1.295289501953125

#### Metrics

- train_runtime = 1672.4371
- train_samples_per_second = 23.917
- train_steps_per_second = 2.99
- total_flos = 685071170273280.0
- train_loss = 1.295289501953125
- epoch = 20.0
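
These are `transformers` Trainer-style metrics, and together they pin down a few settings: samples_per_second ÷ steps_per_second ≈ 8 implies a batch size of 8, and 5000 steps over 20 epochs implies roughly 2,000 training pairs per epoch. Below is a hedged sketch of how such a PEFT fine-tune might be set up; the LoRA configuration, learning rate, and subset selection are assumptions, not taken from the card:

```python
from datasets import load_dataset
from peft import LoraConfig, TaskType, get_peft_model
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("t5-small")
base_model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# Hypothetical LoRA settings; the card does not state the adapter config.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=8,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q", "v"],  # T5 attention projections
)
model = get_peft_model(base_model, lora_config)

def preprocess(batch):
    # opus100 examples: {"translation": {"en": ..., "fr": ...}}
    inputs = [ex["en"] for ex in batch["translation"]]
    targets = [ex["fr"] for ex in batch["translation"]]
    return tokenizer(inputs, text_target=targets, max_length=128, truncation=True)

dataset = load_dataset("opus100", "en-fr")
# ~2,000 pairs per epoch is implied by the reported step counts (assumption).
train_data = dataset["train"].select(range(2000)).map(
    preprocess, batched=True, remove_columns=["translation"]
)

args = Seq2SeqTrainingArguments(
    output_dir="eng2french-t5-small",
    num_train_epochs=20,            # matches "epoch = 20.0" above
    per_device_train_batch_size=8,  # implied by samples/sec ÷ steps/sec ≈ 8
    learning_rate=1e-3,             # assumption
    logging_steps=100,
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_data,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)
trainer.train()
```

`trainer.train()` returns the aggregate run metrics (train_runtime, train_samples_per_second, train_loss, etc.) of the kind listed above.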


## Compute Instance

- Google Colab, NVIDIA T4 GPU (free tier)


### Framework versions

- PEFT 0.7.1