
fastapi example

#3
by matthoffner - opened

Thanks for your work keeping these up to date and adding the new models! It's very easy to spin up a Space with these now thanks to ctransformers; here is an example with the StoryWriter: https://huggingface.co/spaces/matthoffner/storywriter
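For anyone looking for the fastapi example the title mentions, here is a minimal sketch of serving a GGML checkpoint with ctransformers behind FastAPI. The repo id, endpoint name, and generation parameters are placeholders, not taken from the Space linked above.

```python
# Minimal sketch (not the code behind the Space above): serve a GGML
# checkpoint with ctransformers behind FastAPI. Repo id, model_type and
# generation defaults below are placeholders -- adjust for your model.
from ctransformers import AutoModelForCausalLM
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Load the GGML weights once at startup. model_type selects the architecture
# ("mpt" for the StoryWriter family); pass model_file= to pick a specific
# .bin if the repo ships several quantisations.
llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/MPT-7B-StoryWriter-65K-GGML",  # placeholder repo id
    model_type="mpt",
)

class GenerateRequest(BaseModel):
    text: str
    max_new_tokens: int = 256

@app.post("/generate")
def generate(req: GenerateRequest):
    # ctransformers models are callable and return the generated text directly.
    output = llm(req.text, max_new_tokens=req.max_new_tokens)
    return {"generated_text": output}
```

Saved as `app.py`, this runs with `uvicorn app:app` and answers `POST /generate` with a JSON body like `{"text": "Once upon a time", "max_new_tokens": 128}`.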

Awesome! It's great to hear you can get this working easily. Thanks for posting!

I built a package for llama.cpp that wraps it in a node module. It gives you a socket client and a hostable server that bridges duplex communication between the terminal's STDOUT and markdown in the web app client. It's fully working and has examples of how to add more models/prompts.

PS: Any idea if I can use LangChain without the 65K storywriting mode? I'm very new to the scene and haven't gotten into actual training yet; I only do conversions, just Frankenstein things together really. But I've never known myself to produce more if I don't know there's more.
[attachment: jarvis-demo.png]
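On the LangChain question, here is a rough sketch of what driving a GGML checkpoint through LangChain's CTransformers wrapper could look like with the context window capped well below 65K; the repo id and config values are assumptions, not tested settings.

```python
# Rough illustration only: point LangChain at a GGML model through its
# CTransformers wrapper while capping the context far below 65K.
# Repo id and config values are assumptions.
from langchain.llms import CTransformers
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

llm = CTransformers(
    model="TheBloke/MPT-7B-StoryWriter-65K-GGML",  # placeholder repo id
    model_type="mpt",
    config={"max_new_tokens": 256, "context_length": 2048},
)

prompt = PromptTemplate(
    input_variables=["topic"],
    template="Write the opening paragraph of a story about {topic}.",
)

chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run("a lighthouse keeper"))
```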
