Shaltiel committed
Commit f8ab320
1 Parent(s): 55af0f5

Update README.md

Files changed (1):
  1. README.md +11 -3
README.md CHANGED
@@ -13,11 +13,11 @@ inference:
 
 [<img src="https://i.ibb.co/5Lbwyr1/dicta-logo.jpg" width="300px"/>](https://dicta.org.il)
 
-# Model Card for DictaLM-2.0
+# Adapting LLMs to Hebrew: Unveiling DictaLM 2.0 with Enhanced Vocabulary and Instruction Capabilities
 
 The DictaLM-2.0 Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters trained to specialize in Hebrew text.
 
-For full details of this model please read our [release blog post](https://dicta.org.il/dicta-lm).
+For full details of this model please read our [release blog post](https://dicta.org.il/dicta-lm) or the [technical report](https://arxiv.org/abs/2407.07080).
 
 This is the full-precision base model.
 You can view and access the full collection of base/instruct unquantized/quantized versions of `DictaLM-2.0` [here](https://huggingface.co/collections/dicta-il/dicta-lm-20-collection-661bbda397df671e4a430c27).
@@ -98,5 +98,13 @@ DictaLM 2.0 is a pretrained base model and therefore does not have any moderatio
 
 If you use this model, please cite:
 
 ```bibtex
-[Will be added soon]
+@misc{shmidman2024adaptingllmshebrewunveiling,
+      title={Adapting LLMs to Hebrew: Unveiling DictaLM 2.0 with Enhanced Vocabulary and Instruction Capabilities},
+      author={Shaltiel Shmidman and Avi Shmidman and Amir DN Cohen and Moshe Koppel},
+      year={2024},
+      eprint={2407.07080},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL},
+      url={https://arxiv.org/abs/2407.07080},
+}
 ```
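
For context, here is a minimal sketch of loading and prompting the base model this README describes, using the Hugging Face `transformers` library. The repository id `dicta-il/dictalm2.0` is an assumption inferred from the collection link above and should be verified against the hub page; the Hebrew prompt is an arbitrary illustration, not from the model card.

```python
# Minimal sketch (not from the README): loading the DictaLM-2.0 base model
# with Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "dicta-il/dictalm2.0"  # assumption: verify the id on the Hugging Face hub
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # the card ships full-precision weights; bf16 saves memory
    device_map="auto",           # requires the accelerate package
)

# This is the base (non-instruct) model, so use plain text completion rather
# than a chat template.
prompt = "עשר עובדות מעניינות על ירושלים:"  # "Ten interesting facts about Jerusalem:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```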