The model is a decoder-only transformer architecture with the following modifications:

* **ReLU Activation Function**: ReLU ([Glorot et al., 2011](https://proceedings.mlr.press/v15/glorot11a/glorot11a.pdf)) activation functions are adopted in the feed-forward networks (a minimal sketch follows this list).
* **Tokenizer**: We use the SmolLM ([Allal et al., 2024](https://huggingface.co/blog/smollm)) tokenizer with a vocabulary size of 49,152 (see the loading sketch below).
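
A minimal sketch of a position-wise feed-forward block with ReLU, illustrating the first bullet. The hidden sizes and the plain two-layer (non-gated) layout here are illustrative assumptions, not the released PhoneLM configuration.

```python
import torch
import torch.nn as nn

class FeedForward(nn.Module):
    """Transformer feed-forward block using ReLU.

    d_model and d_ff are placeholder values, not PhoneLM's actual sizes.
    """
    def __init__(self, d_model: int = 1024, d_ff: int = 4096):
        super().__init__()
        self.up = nn.Linear(d_model, d_ff)     # expand to hidden size
        self.act = nn.ReLU()                   # ReLU instead of GELU/SiLU
        self.down = nn.Linear(d_ff, d_model)   # project back to model size

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(self.act(self.up(x)))

x = torch.randn(2, 16, 1024)                   # (batch, seq_len, d_model)
print(FeedForward()(x).shape)                  # torch.Size([2, 16, 1024])
```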
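And a short sketch of loading the tokenizer with `transformers`, assuming the checkpoint on the Hub bundles the SmolLM tokenizer files:

```python
from transformers import AutoTokenizer

# Assumes the tokenizer files ship with this checkpoint on the Hub.
tok = AutoTokenizer.from_pretrained("mllmTeam/PhoneLM-0.5B-Instruct")

print(tok.vocab_size)   # expected: 49152
print(tok.tokenize("PhoneLM is a small language model."))
```
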
## License
* This repository is released under the [Apache-2.0](https://huggingface.co/mllmTeam/PhoneLM-0.5B-Instruct/blob/main/LICENSE) License.
## Citation
```bibtex
@misc{yi2024phonelmanefficientcapablesmall,
      title={PhoneLM: an Efficient and Capable Small Language Model Family through Principled Pre-training},
      author={Rongjie Yi and Xiang Li and Weikai Xie and Zhenyan Lu and Chenghua Wang and Ao Zhou and Shangguang Wang and Xiwen Zhang and Mengwei Xu},
      year={2024},
      eprint={2411.05046},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2411.05046},
}
```