csabakecskemeti posted an update 8 days ago
Check out my idea:
LLmaaS - Local LLM as a Service

With LLmaaS, I propose leveraging locally running LLMs as a service, providing a standardized way for websites to access and utilize them for LLM-powered operations directly on the user’s device.

Demo, code, and a more detailed description:
https://devquasar.com/llmaas/
https://github.com/csabakecskemeti/LLmaaS
https://youtu.be/OOWGr8jcP5Q
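From a web page's point of view, the idea above could look something like this. This is a minimal sketch: the port, path, and JSON field names (`prompt`, `text`) are my assumptions for illustration, not the actual LLmaaS API.

```javascript
// Hypothetical sketch of a page calling a locally running LLmaaS proxy.
// Endpoint, port, and JSON shape are illustrative assumptions.
async function localGenerate(prompt) {
  const res = await fetch("http://localhost:8080/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (!res.ok) throw new Error(`proxy error: ${res.status}`);
  const data = await res.json();
  return data.text; // generated completion from the local model
}
```

The page never talks to a remote API; the only network hop is to the user's own machine.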

Call for contributors
Join me in developing the LLmaaS proxy into a general-purpose tool for leveraging local LLMs on the web, with built-in security measures.
I'm looking for help making the proxy more generic: supporting multiple local LLM services without any change on the HTML side.
I'm also looking for ideas on how to make the HTML part more modular and easier to use.
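One way to support multiple local LLM services without touching the HTML side is a small adapter table in the proxy that translates a single page-facing request shape into each server's native API. The Ollama and llama.cpp endpoints and field names below are to the best of my knowledge, but treat them as illustrative assumptions rather than a spec:

```javascript
// Sketch of a backend-adapter layer: one page-facing request shape,
// translated per local LLM server. Endpoints/fields are assumptions.
const BACKENDS = {
  ollama: {
    url: "http://localhost:11434/api/generate",
    toRequest: (prompt) => ({ model: "llama3", prompt, stream: false }),
    fromResponse: (json) => json.response,
  },
  llamacpp: {
    url: "http://localhost:8080/completion",
    toRequest: (prompt) => ({ prompt, n_predict: 128 }),
    fromResponse: (json) => json.content,
  },
};

// Build the outgoing request for the configured backend; the HTML side
// never needs to know which local server is actually running.
function buildBackendRequest(backend, prompt) {
  const b = BACKENDS[backend];
  if (!b) throw new Error(`unknown backend: ${backend}`);
  return { url: b.url, body: b.toRequest(prompt) };
}
```

Adding a new local server then means adding one entry to the table, with no change to any website using the proxy.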

This is obviously a prototype.
Security is a big concern here, but I believe it's possible to put together a proxy that is safe and does nothing other than forward generate requests between the browser and the local LLM.
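The "forward only generate requests" rule could take the shape of an allowlist check the proxy runs before anything touches the local model. The paths, field names, and limits here are illustrative assumptions, not the actual LLmaaS rules:

```javascript
// Hypothetical allowlist filter: the proxy forwards a request only if
// it matches the narrow "generate" shape. All values are assumptions.
const ALLOWED_PATHS = new Set(["/generate"]);
const ALLOWED_FIELDS = new Set(["prompt", "max_tokens", "temperature"]);
const MAX_PROMPT_CHARS = 8000;

function isForwardable(req) {
  if (req.method !== "POST") return false;        // only generation calls
  if (!ALLOWED_PATHS.has(req.path)) return false; // no other endpoints
  const body = req.body;
  if (typeof body !== "object" || body === null) return false;
  if (typeof body.prompt !== "string") return false;
  if (body.prompt.length > MAX_PROMPT_CHARS) return false;
  // Reject unexpected fields so arbitrary options can't reach the model server.
  return Object.keys(body).every((k) => ALLOWED_FIELDS.has(k));
}
```

Denying by default and allowing a single, strictly validated request shape keeps the attack surface small even when any website can reach the proxy.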


Yes, I have tried it with Chrome Canary.
(I even have a demo page that uses it, but I can't recall the name right now; I'll share it later.)

It's working fine, but:

  • it's still only available in experimental Chrome
  • what about other browsers?
  • you're locked in to one model
  • what if you host your local AI on another local machine?

All in all, Chrome's built-in AI provides less flexibility, in my view.

Appreciate the comment, though.