solwol committed on
Commit
787d71b
1 Parent(s): 8915537

Upload model

Files changed (3)
  1. README.md +11 -42
  2. pytorch_adapter.bin +1 -1
  3. pytorch_model_head.bin +1 -1
README.md CHANGED
@@ -1,27 +1,24 @@
 ---
 tags:
 - adapter-transformers
+- adapterhub:am/wikipedia-amharic-20240320
 - xlm-roberta
-- adapterhub:am/wikipedia-amharic-20240320"
 datasets:
 - wikipedia
-pipeline_tag: fill-mask
-language:
-- am
 ---
 
-# Adapter `solwol/xml-roberta-amharic` for xlm-roberta-base
+# Adapter `solwol/xml-roberta-base-adapter-amharic` for xlm-roberta-base
 
-An [adapter](https://adapterhub.ml) for the `xlm-roberta-base` model that was trained on the [fill-mask/wikipedia-amharic](https://adapterhub.ml/explore/fill-mask/wikipedia-amharic/) dataset and includes a prediction head for masked lm.
+An [adapter](https://adapterhub.ml) for the `xlm-roberta-base` model that was trained on the [am/wikipedia-amharic-20240320](https://adapterhub.ml/explore/am/wikipedia-amharic-20240320/) dataset and includes a prediction head for masked lm.
 
 This adapter was created for usage with the **[Adapters](https://github.com/Adapter-Hub/adapters)** library.
 
 ## Usage
 
-First, install `transformers`, `adapters`:
+First, install `adapters`:
 
 ```
-pip install -U transformers adapters
+pip install -U adapters
 ```
 
 Now, the adapter can be loaded and activated like this:
@@ -32,43 +29,15 @@ from adapters import AutoAdapterModel
 model = AutoAdapterModel.from_pretrained("xlm-roberta-base")
 adapter_name = model.load_adapter("solwol/xml-roberta-base-adapter-amharic", source="hf", set_active=True)
 ```
-Next, to perform fill mask tasks:
-
-```python
-
-from transformers import AutoTokenizer, FillMaskPipeline
-
-tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
-fillmask = FillMaskPipeline(model=model, tokenizer=tokenizer)
-
-inputs = ["መልካም አዲስ <mask> ይሁን",
- "የኢትዮጵያ ዋና <mask> አዲስ አበባ ነው",
- "ኬንያ የ ኢትዮጵያ አዋሳኝ <mask> አንዷ ናት",
- "አጼ ምኒሊክ የኢትዮጵያ <mask> ነበሩ"]
-
-outputs = fillmask(inputs)
-outputs[0]
-
-[{'score': 0.31237369775772095,
- 'token': 17733,
- 'token_str': 'ቀን',
- 'sequence': 'መልካም አዲስ ቀን ይሁን'},
- {'score': 0.17704728245735168,
- 'token': 19202,
- 'token_str': 'አበባ',
- 'sequence': 'መልካም አዲስ አበባ ይሁን'},
- {'score': 0.17629213631153107,
- 'token': 98040,
- 'token_str': 'አመት',
- 'sequence': 'መልካም አዲስ አመት ይሁን'},
- {'score': 0.08915291726589203,
- 'token': 25186,
- 'token_str': 'ዓመት',
- 'sequence': 'መልካም አዲስ ዓመት ይሁን'},
- {'score': 0.060819510370492935,
- 'token': 118502,
- 'token_str': 'ሳምንት',
- 'sequence': 'መልካም አዲስ ሳምንት ይሁን'}]
-```
-## Fine-tuning data
-Used some of wikipedia's amharic dataset; snapshot date="20240320"
+
+## Architecture & Training
+
+<!-- Add some description here -->
+
+## Evaluation results
+
+<!-- Add some description here -->
+
+## Citation
+
+<!-- Add some description here -->
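The fill-mask example removed from the README above still applies to the adapter uploaded in this commit; here is a minimal sketch, assuming the checkpoint names shown in the diff (the pipeline wiring mirrors the removed example and is illustrative, not part of the commit itself).

```python
# Minimal sketch (not part of this commit): reproduce the fill-mask usage
# from the old README against the adapter uploaded here.
# Checkpoint names come from the diff above; everything else is illustrative.
from adapters import AutoAdapterModel
from transformers import AutoTokenizer, FillMaskPipeline

# Load the base model and attach the adapter with its masked-LM head.
model = AutoAdapterModel.from_pretrained("xlm-roberta-base")
model.load_adapter("solwol/xml-roberta-base-adapter-amharic", source="hf", set_active=True)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
fill_mask = FillMaskPipeline(model=model, tokenizer=tokenizer)

# "Happy new <mask>" in Amharic, as in the removed README example;
# that example reported "ቀን" ("day") as the top prediction.
print(fill_mask("መልካም አዲስ <mask> ይሁን"))
```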
pytorch_adapter.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:9240986c3c0e72fedf932394d0a324e78c589e2dd7f824fbbb23aa66f36c1bba
+oid sha256:19b2102e3d5281e0b0edab0aa840238bce1a2f508d2432472e23fa0254dbc684
 size 3595942
pytorch_model_head.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:206c015e77a027a055ad4fadc17d41e93834541133eb79ef5e42c996845a00ed
+oid sha256:f5e691f1e0011e3b4002f7353a8c7c1f59365483822efbcb0bb95fea0a947c57
 size 771377398