yoh committed
Commit d30c6b8
1 parent: d0bf236

Update README.md
Files changed (1):
  1. README.md +13 -7
README.md CHANGED
@@ -17,10 +17,10 @@ This adapter was created for usage with the **[adapter-transformers](https://git
 
 ## Usage
 
-First, install `adapter-transformers`:
+First, install `adapter-transformers` and `sentence-transformers`:
 
 ```
-pip install -U adapter-transformers
+pip install -U adapter-transformers sentence-transformers
 ```
 _Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. [More](https://docs.adapterhub.ml/installation.html)_
 
@@ -28,18 +28,24 @@ Now, the adapter can be loaded and activated like this:
 
 ```python
 from transformers import AutoAdapterModel
-
-model = AutoAdapterModel.from_pretrained("distilroberta-base")
-adapter_name = model.load_adapter("yoh/distilroberta-base-sept-adapter", source="hf", set_active=True)
+from sentence_transformers import SentenceTransformer, models
+
+# Load pre-trained model
+word_embedding_model = models.Transformer("distilroberta-base")
+# Load and activate adapter
+word_embedding_model.auto_model.load_adapter("yoh/distilroberta-base-sept-adapter", source="hf", set_active=True)
+# Create sentence transformer
+pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension(), pooling_mode='mean')
+model = SentenceTransformer(modules=[word_embedding_model, pooling_model])
 ```
 
 ## Architecture & Training
 
-<!-- Add some description here -->
+See this [paper](https://arxiv.org/abs/2311.00408)
 
 ## Evaluation results
 
-<!-- Add some description here -->
+See this [paper](https://arxiv.org/abs/2311.00408)
 
 ## Citation
 ```bibtex
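
For context on the updated usage snippet, here is a minimal sketch of how the assembled `SentenceTransformer` might be used to embed and compare sentences. Only the model-construction lines come from the README change above; the example sentences and the use of `sentence_transformers.util.cos_sim` are illustrative assumptions, not part of the commit.

```python
from sentence_transformers import SentenceTransformer, models, util

# Rebuild the model exactly as in the updated README snippet
word_embedding_model = models.Transformer("distilroberta-base")
word_embedding_model.auto_model.load_adapter("yoh/distilroberta-base-sept-adapter", source="hf", set_active=True)
pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension(), pooling_mode='mean')
model = SentenceTransformer(modules=[word_embedding_model, pooling_model])

# Illustrative usage (assumed example sentences, not from the commit):
# encode two sentences and compare their pooled embeddings
embeddings = model.encode([
    "Adapters add small trainable modules to a frozen transformer.",
    "A frozen transformer can be extended with lightweight adapter modules.",
])
print(embeddings.shape)                             # (2, 768) for distilroberta-base
print(util.cos_sim(embeddings[0], embeddings[1]))   # cosine similarity of the two sentence embeddings
```

Mean pooling over the adapter-augmented token embeddings is what the README's `models.Pooling(..., pooling_mode='mean')` line configures, so comparing sentences reduces to cosine similarity between the pooled vectors.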