Text loading

#8 opened by antonmks

Could you give us the Python code for loading 'Great Gatsby' and getting the output?
It would also be nice if you specified what kind of resources (memory/CPU/GPU) one needs to run the models.

I would love a little demo as well!

I want to be able to load in TVTropes somehow so that the model knows how to apply a trope.

Mosaic ML, Inc. org

Should have something for you sometime soon (days, not weeks)

Mosaic ML, Inc. org

Still planning on updating our library's generation tooling to support the Great Gatsby demo (assuming your hardware can handle it). Will keep this issue open until that's out.

In the meantime, you can try out StoryWriter with prompts of up to 10k tokens at https://huggingface.co/spaces/mosaicml/mpt-7b-storywriter

Hi, may I ask for the same, that is, the 'Great Gatsby' input sample and the Python code for loading it? Thank you!

Mosaic ML, Inc. org
•
edited Jun 3, 2023

If you'd like to try using mosaicml/mpt-7b-storywriter to ingest large documents, I would recommend using the hf_generate.py script in our LLM Foundry: https://github.com/mosaicml/llm-foundry/tree/main/scripts/inference#interactive-generation-with-hf-models

You can pass a prompt file that will be ingested as a single document, e.g. --prompt file::path/to/gatsby.txt. See the repository linked above for the code.
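For anyone who just wants a rough picture of what this looks like in plain Python (rather than the hf_generate.py script), here is a minimal sketch. The tokenizer name, the trust_remote_code flag, and the max_seq_len override follow the StoryWriter model card, but the file path, generation settings, and hardware assumptions (a single CUDA GPU with enough memory for bfloat16 weights plus a very long context) are illustrative assumptions, not an official recipe.

```python
# Minimal sketch: load mpt-7b-storywriter with transformers and feed it a book-length
# text file as a single prompt. Assumes a local gatsby.txt and a CUDA GPU; this is
# not the official hf_generate.py script.
import torch
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_name = "mosaicml/mpt-7b-storywriter"

# StoryWriter uses the EleutherAI/gpt-neox-20b tokenizer per the model card.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

# MPT ships custom modeling code, so trust_remote_code=True is required.
# The model card shows raising max_seq_len for long contexts (ALiBi lets the model
# extrapolate); lower this value if your GPU memory is limited.
config = AutoConfig.from_pretrained(model_name, trust_remote_code=True)
config.max_seq_len = 83968  # value taken from the model card example

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    config=config,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
).to("cuda")
model.eval()

# Read the whole book as one prompt, mirroring --prompt file::path/to/gatsby.txt.
with open("path/to/gatsby.txt", "r", encoding="utf-8") as f:
    prompt = f.read()

inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
print(f"Prompt length: {inputs['input_ids'].shape[1]} tokens")

with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=256,
        do_sample=True,
        temperature=0.8,
        pad_token_id=tokenizer.eos_token_id,
    )

# Print only the newly generated continuation, not the echoed prompt.
generated = outputs[0, inputs["input_ids"].shape[1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))
```

Note that memory grows with context length, so ingesting an entire novel may not fit on smaller GPUs; if you hit out-of-memory errors, lower config.max_seq_len and truncate the prompt accordingly.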

abhi-mosaic changed discussion status to closed
