Update README.md
README.md (CHANGED)
@@ -7,7 +7,7 @@ datasets:
 
 # mpt-7b-storysummarizer
 
-This is a fine-tuned version of [mosaicml/mpt-7b-storywriter](https://huggingface.co/mosaicml/mpt-7b-storywriter) on [emozilla/booksum-summary-analysis_gptneox-8192](emozilla/booksum-summary-analysis_gptneox-8192), which is adapted from [kmfoda/booksum](https://huggingface.co/datasets/kmfoda/booksum).
+This is a fine-tuned version of [mosaicml/mpt-7b-storywriter](https://huggingface.co/mosaicml/mpt-7b-storywriter) on [emozilla/booksum-summary-analysis_gptneox-8192](https://huggingface.co/datasets/emozilla/booksum-summary-analysis_gptneox-8192), which is adapted from [kmfoda/booksum](https://huggingface.co/datasets/kmfoda/booksum).
 
 The training run was performed using [llm-foundry](https://github.com/mosaicml/llm-foundry) on an 8xA100 80 GB node at 8192 context length using [this configuration](https://gist.github.com/jquesnelle/f9fb28b8102cba8e79a6c08f132fbf49). The run can be viewed on [wandb](https://wandb.ai/emozilla/booksum/runs/457ym4r9).
 
 ## How to Use
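The hunk ends at the "How to Use" heading, so the card's actual usage snippet is not part of this change. As a minimal sketch of what loading the model typically looks like with `transformers` (the checkpoint id `emozilla/mpt-7b-storysummarizer`, the gpt-neox-20b tokenizer, and the generation settings are assumptions following base MPT-7B conventions, not taken from this diff):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumptions: the fine-tuned checkpoint lives at emozilla/mpt-7b-storysummarizer
# (matching the wandb and dataset namespace above) and, like the base MPT-7B
# models, ships custom modeling code (hence trust_remote_code=True) and uses
# the EleutherAI/gpt-neox-20b tokenizer.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = AutoModelForCausalLM.from_pretrained(
    "emozilla/mpt-7b-storysummarizer",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

story = "Once upon a time..."  # placeholder: the text to summarize
inputs = tokenizer(story, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```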