langchain

#1
by wilmerhenao - opened

Does it integrate well with langchain? Are there examples?

Amazon Web Services org

Since we host this model using the Hugging Face text generation inference (TGI) server, you can refer to https://python.langchain.com/docs/integrations/llms/huggingface_textgen_inference for an example of how to use it with LangChain. Cheers!
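
A minimal sketch of that integration, assuming a TGI server serving FalconLite is already running and reachable; the URL and sampling parameters below are placeholders, not tested defaults:

```python
# Sketch: point LangChain's TGI wrapper at a running FalconLite TGI server.
# The inference_server_url and sampling parameters are placeholders --
# adjust them to match your own deployment.
from langchain.llms import HuggingFaceTextGenInference

llm = HuggingFaceTextGenInference(
    inference_server_url="http://localhost:8080/",  # hypothetical server URL
    max_new_tokens=512,
    temperature=0.01,
    repetition_penalty=1.03,
)

print(llm("What are the main challenges to support a long context for LLM?"))
```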

Would love a tutorial on how to set this up for hosting and creating an API endpoint for querying via HTTP on AWS, or make it an inference endpoint deployable on Hugging Face (please also in EU AWS data centers, for GDPR (DSGVO) and DPA approval). Cheers!

Amazon Web Services org

Hi @dm-mschubert , we have a notebook to deploy FalconLite onto a SageMaker endpoint running on AWS - https://github.com/awslabs/extending-the-context-length-of-open-source-llms/blob/main/custom-tgi-ecr/deploy.ipynb
Feel free to give it a try and let us know if you run into any issues. Thanks!
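
Once the endpoint from that notebook is up, a hedged sketch of querying it with boto3's SageMaker runtime client; the endpoint name and region are placeholders, the payload follows the TGI JSON format, and the prompt should be wrapped in FalconLite's prompt template from the model card:

```python
# Sketch of invoking a deployed FalconLite SageMaker endpoint via boto3.
# EndpointName and region_name are hypothetical -- replace with your own,
# and format the prompt per the model card's prompt template.
import json

import boto3

runtime = boto3.client("sagemaker-runtime", region_name="eu-central-1")

payload = {
    "inputs": "What are the main challenges to support a long context for LLM?",
    "parameters": {"max_new_tokens": 256, "temperature": 0.01},
}

response = runtime.invoke_endpoint(
    EndpointName="falconlite-endpoint",  # hypothetical endpoint name
    ContentType="application/json",
    Body=json.dumps(payload),
)
print(json.loads(response["Body"].read()))
```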

Which SageMaker Image and Kernel should we use for https://github.com/awslabs/extending-the-context-length-of-open-source-llms/blob/main/custom-tgi-ecr/deploy.ipynb?
