# LLongMA-2-13b-storysummarizer
[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
This is a fine-tuned version of [conceptofmind/LLongMA-2-13b](https://huggingface.co/conceptofmind/LLongMA-2-13b) intended for summarization and literary analysis of fiction stories.
It contains custom modeling code to use Flash Attention 2 during inference, which provides a significant speedup, especially at longer context lengths.
To enable it, pass `trust_remote_code=True, use_flash_attention=True` to `AutoModelForCausalLM.from_pretrained`.
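A minimal loading sketch, assuming the flags described above. The repo id below is a placeholder (the full hub path is not stated here), and `use_flash_attention` is a custom kwarg consumed by this repo's remote modeling code, not a standard `transformers` argument:

```python
MODEL_ID = "LLongMA-2-13b-storysummarizer"  # placeholder; use the full hub repo id

# Kwargs that activate the custom Flash Attention 2 code path.
LOAD_KWARGS = {
    "trust_remote_code": True,    # run the repo's custom modeling code
    "use_flash_attention": True,  # custom flag read by that modeling code
}

def load_model():
    # Deferred import so the constants above are usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, **LOAD_KWARGS)
    return tokenizer, model
```

With `trust_remote_code=False` (the default), the custom attention code is never loaded and the `use_flash_attention` flag has no effect.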