task619_ohsumed_abstract_title_generation
README.md
CHANGED
@@ -80,7 +80,7 @@ Use the code below to get started with the model.
 
 <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
 
-
+https://huggingface.co/datasets/Lots-of-LoRAs/task619_ohsumed_abstract_title_generation sourced from https://github.com/allenai/natural-instructions
 
 ### Training Procedure
 
@@ -175,7 +175,15 @@ Carbon emissions can be estimated using the [Machine Learning Impact calculator]
 
 **BibTeX:**
 
-
+@misc{brüelgabrielsson2024compressserveservingthousands,
+      title={Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead},
+      author={Rickard Brüel-Gabrielsson and Jiacheng Zhu and Onkar Bhardwaj and Leshem Choshen and Kristjan Greenewald and Mikhail Yurochkin and Justin Solomon},
+      year={2024},
+      eprint={2407.00066},
+      archivePrefix={arXiv},
+      primaryClass={cs.DC},
+      url={https://arxiv.org/abs/2407.00066},
+}
 
 **APA:**
 