Commit 3654b93 — netapy committed
1 parent: 9676528

Update README.md

Files changed (1): README.md (+12 −15)

README.md CHANGED
@@ -9,12 +9,13 @@ language:
  - en
 ---
 
-# Solon Embeddings — base 0.1
+# Solon Embeddings — Base 0.1
 
 SOTA Open source french embedding model.
 
 | Model | Mean Score |
 | --- | --- |
+| **OrdalieTech/Solon-embeddings-large-0.1** | 0.7490 |
 | cohere/embed-multilingual-v3 | 0.7402 |
 | **OrdalieTech/Solon-embeddings-base-0.1** | 0.7306 |
 | openai/ada-002 | 0.7290 |
@@ -29,20 +30,16 @@ SOTA Open source french embedding model.
 | EuropeanParliament/eubert_embedding_v1 | 0.5063 |
 
 These results have been obtained through 9 french benchmarks on a variety of text similarity tasks (classification, reranking, STS) :
-- AmazonReviewsClassification
-- MassiveIntentClassification
-- MassiveScenarioClassification
-- MTOPDomainClassification
-- MTOPIntentClassification
-- STS22
-- MiraclFRRerank
-- OrdalieFRSTS
-- OrdalieFRReranking
+- AmazonReviewsClassification (MTEB)
+- MassiveIntentClassification (MTEB)
+- MassiveScenarioClassification (MTEB)
+- MTOPDomainClassification (MTEB)
+- MTOPIntentClassification (MTEB)
+- STS22 (MTEB)
+- MiraclFRRerank (Miracl)
+- OrdalieFRSTS (Ordalie)
+- OrdalieFRReranking (Ordalie)
 
 We created OrdalieFRSTS and OrdalieFRReranking to enhance the benchmarking capabilities of French STS and reranking assessments.
 
-(evaluation script currently available here : github.com/netapy/mteb)
-
---------
-
-**(Large version comming soon...)**
+(evaluation script available here : github.com/OrdalieTech/mteb)
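The STS-style benchmarks listed in the diff score a model by how well the similarity between its embedding vectors tracks human similarity judgements, typically via cosine similarity. A minimal sketch of that comparison step, using toy vectors rather than actual Solon model outputs:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-ins for sentence embeddings (illustrative values, not real model output).
emb_query = np.array([0.2, 0.8, 0.1])
emb_similar = np.array([0.25, 0.75, 0.05])   # hypothetical paraphrase of the query
emb_unrelated = np.array([0.9, -0.1, 0.4])   # hypothetical unrelated sentence

print(cosine_similarity(emb_query, emb_similar))    # close to 1.0
print(cosine_similarity(emb_query, emb_unrelated))  # noticeably lower
```

In an actual evaluation run, the vectors would come from encoding French sentence pairs with the model under test, and the benchmark would correlate these scores with gold similarity labels.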