Commit b507f9a by VityaVitalich (parent de574c9): Update README.md
|
|
# Model Card for TaxoLLaMA-bench

#### This model was not pretrained on the data used in the later benchmarks from the paper. It should be used to replicate or improve results on taxonomy test datasets. For other tasks, we strongly recommend using [TaxoLLaMA](https://huggingface.co/VityaVitalich/TaxoLLaMA).

TaxoLLaMA-bench is a lightweight fine-tune of the LLaMA2-7b model, aimed at solving multiple lexical semantics tasks with a focus on taxonomy-related tasks, achieving SoTA results on multiple benchmarks.
It was pretrained on an instruction dataset collected from WordNet 3.0 to generate hypernyms for a given hyponym.
The model can also be used to identify hypernymy via perplexity, which is useful for lexical entailment or taxonomy construction.
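Perplexity-based hypernymy identification can be sketched as follows. This is a minimal illustration, not the exact procedure from the paper: the prompt template, the `score_pair` helper, and its wiring to a Hugging Face causal LM are all assumptions for demonstration.

```python
import math

def sequence_perplexity(token_logprobs):
    # Perplexity = exp of the negative mean per-token log-probability.
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

def score_pair(model, tokenizer, hyponym, hypernym):
    """Score a candidate (hyponym, hypernym) pair by target perplexity.

    Lower perplexity of the hypernym continuation suggests a more
    plausible hypernymy relation. `model`/`tokenizer` are a causal LM
    pair from Hugging Face transformers; the prompt below is an
    illustrative template, not the model's training format.
    """
    import torch  # lazy import so the pure helper above has no heavy deps

    prompt = f"hyponym: {hyponym} | hypernyms:"
    target = f" {hypernym}"
    prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids
    full_ids = tokenizer(prompt + target, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(full_ids).logits
    logprobs = torch.log_softmax(logits, dim=-1)
    # Token at position i is predicted by the logits at position i - 1.
    start = prompt_ids.shape[1]
    picked = [
        logprobs[0, i - 1, full_ids[0, i]].item()
        for i in range(start, full_ids.shape[1])
    ]
    return sequence_perplexity(picked)
```

For taxonomy construction, such scores can then be thresholded or ranked to decide which candidate edges to keep.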
For more details, read the paper: [TaxoLLaMA: WordNet-based Model for Solving Multiple Lexical Semantic Tasks](google.com)
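A hypernym-generation call can be sketched as below. The instruction template in `build_prompt` and the helper names are illustrative assumptions, not the card's documented format; load the model and tokenizer with the usual `AutoModelForCausalLM` / `AutoTokenizer` pair.

```python
def build_prompt(hyponym: str) -> str:
    # Illustrative instruction format; the exact training template may differ.
    return (
        "Predict hypernyms for the given hyponym.\n"
        f"hyponym: {hyponym}\nhypernyms:"
    )

def generate_hypernyms(model, tokenizer, hyponym: str, max_new_tokens: int = 20) -> str:
    """Greedy-decode hypernym candidates for `hyponym`.

    `model`/`tokenizer` are a causal LM pair loaded with Hugging Face
    transformers (e.g. from the LLaMA2-7b fine-tune described above).
    """
    ids = tokenizer(build_prompt(hyponym), return_tensors="pt").input_ids
    out = model.generate(ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(out[0][ids.shape[1]:], skip_special_tokens=True)
```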