| id | text | source |
|---|---|---|
302ea95dbf72-10 | "answer the question (in Italian)"
"If you do update it, please update the sources as well. "
"If the context isn't useful, return the original answer."
)
refine_prompt = PromptTemplate(
input_variables=["question", "existing_answer", "context_str"],
template=refine_template,
)
question_template = (
... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/qa_with_sources.html |
302ea95dbf72-11 | "\n\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha onorato la sua carriera e ha contribuito a costruire un consenso. Ha ricevuto un ampio sostegno, dall'Ordine Fraterno della Polizia a ex giudici nominati da democratici e repubblicani. Inoltre, ha sottolineato l'impor... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/qa_with_sources.html |
302ea95dbf72-12 | "\n\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha onorato la sua carriera e ha contribuito a costruire un consenso. Ha ricevuto un ampio sostegno, dall'Ordine Fraterno della Polizia a ex giudici nominati da democratici e repubblicani. Inoltre, ha sottolineato l'impor... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/qa_with_sources.html |
302ea95dbf72-13 | "\n\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha onorato la sua carriera e ha contribuito a costruire un consenso. Ha ricevuto un ampio sostegno, dall'Ordine Fraterno della Polizia a ex giudici nominati da democratici e repubblicani. Inoltre, ha sottolineato l'impor... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/qa_with_sources.html |
302ea95dbf72-14 | 'output_text': "\n\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha onorato la sua carriera e ha contribuito a costruire un consenso. Ha ricevuto un ampio sostegno, dall'Ordine Fraterno della Polizia a ex giudici nominati da democratici e repubblicani. Inoltre, ha sotto... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/qa_with_sources.html |
302ea95dbf72-15 | 'score': '100'},
{'answer': ' This document does not answer the question', 'score': '0'},
{'answer': ' This document does not answer the question', 'score': '0'},
{'answer': ' This document does not answer the question', 'score': '0'}]
Custom Prompts
You can also use your own prompts with this chain. In this example... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/qa_with_sources.html |
302ea95dbf72-16 | result
{'source': 30,
'intermediate_steps': [{'answer': ' Il presidente ha detto che Justice Breyer ha dedicato la sua vita a servire questo paese e ha onorato la sua carriera.',
'score': '100'},
{'answer': ' Il presidente non ha detto nulla sulla Giustizia Breyer.',
'score': '100'},
{'answer': ' Non so.', '... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/qa_with_sources.html |
da3c9bdc175e-0 | .ipynb
.pdf
Hypothetical Document Embeddings
Contents
Multiple generations
Using our own prompts
Using HyDE
Hypothetical Document Embeddings#
This notebook goes over how to use Hypothetical Document Embeddings (HyDE), as described in this paper.
At a high level, HyDE is an embedding technique that takes queries, gene... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/hyde.html |
da3c9bdc175e-1 | Using our own prompts#
Besides using preconfigured prompts, we can also easily construct our own prompts and use those in the LLMChain that is generating the documents. This can be useful if we know the domain our queries will be in, as we can condition the prompt to generate text more similar to that.
In the example b... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/hyde.html |
da3c9bdc175e-2 | print(docs[0].page_content)
In state after state, new laws have been passed, not only to suppress the vote, but to subvert entire elections.
We cannot let this happen.
Tonight. I call on the Senate to: Pass the Freedom to Vote Act. Pass the John Lewis Voting Rights Act. And while you’re at it, pass the Disclose Act s... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/hyde.html |
31a5a21f2f33-0 | .ipynb
.pdf
Summarization
Contents
Prepare Data
Quickstart
The stuff Chain
The map_reduce Chain
The refine Chain
Summarization#
This notebook walks through how to use LangChain for summarization over a list of documents. It covers three different chain types: stuff, map_reduce, and refine. For a more in depth explana... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-1 | chain.run(docs)
' In response to Russian aggression in Ukraine, the United States and its allies are taking action to hold Putin accountable, including economic sanctions, asset seizures, and military assistance. The US is also providing economic and humanitarian aid to Ukraine, and has passed the American Rescue Plan ... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-2 | chain.run(docs)
"\n\nIn questa serata, il Presidente degli Stati Uniti ha annunciato una serie di misure per affrontare la crisi in Ucraina, causata dall'aggressione di Putin. Ha anche annunciato l'invio di aiuti economici, militari e umanitari all'Ucraina. Ha anche annunciato che gli Stati Uniti e i loro alleati stann... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-3 | chain = load_summarize_chain(OpenAI(temperature=0), chain_type="map_reduce", return_intermediate_steps=True)
chain({"input_documents": docs}, return_only_outputs=True)
{'map_steps': [" In response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sancti... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-4 | prompt_template = """Write a concise summary of the following:
{text}
CONCISE SUMMARY IN ITALIAN:"""
PROMPT = PromptTemplate(template=prompt_template, input_variables=["text"])
chain = load_summarize_chain(OpenAI(temperature=0), chain_type="map_reduce", return_intermediate_steps=True, map_prompt=PROMPT, combine_prompt=... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-5 | "\n\nStiamo unendo le nostre forze con quelle dei nostri alleati europei per sequestrare yacht, appartamenti di lusso e jet privati di Putin. Abbiamo chiuso lo spazio aereo americano ai voli russi e stiamo fornendo più di un miliardo di dollari in assistenza all'Ucraina. Abbiamo anche mobilitato le nostre forze terrest... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-6 | "\n\nIl Presidente Biden ha lottato per passare l'American Rescue Plan per aiutare le persone che soffrivano a causa della pandemia. Il piano ha fornito sollievo economico immediato a milioni di americani, ha aiutato a mettere cibo sulla loro tavola, a mantenere un tetto sopra le loro teste e a ridurre il costo dell'as... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-7 | The refine Chain#
This section shows the results of using the refine Chain to do summarization.
chain = load_summarize_chain(llm, chain_type="refine")
chain.run(docs)
"\n\nIn response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sanctions and hold Put... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-8 | chain({"input_documents": docs}, return_only_outputs=True)
{'refine_steps': [" In response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sanctions and hold Putin accountable. The U.S. Department of Justice is also assembling a task force to go after... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-9 | "\n\nIn response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sanctions and hold Putin accountable. The U.S. Department of Justice is also assembling a task force to go after the crimes of Russian oligarchs and seize their ill-gotten gains. We are ... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-10 | 'output_text': "\n\nIn response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sanctions and hold Putin accountable. The U.S. Department of Justice is also assembling a task force to go after the crimes of Russian oligarchs and seize their ill-gotten... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-11 | "(only if needed) with some more context below.\n"
"------------\n"
"{text}\n"
"------------\n"
"Given the new context, refine the original summary in Italian"
"If the context isn't useful, return the original summary."
)
refine_prompt = PromptTemplate(
input_variables=["existing_answer", "text"... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-12 | "\n\nQuesta sera, ci incontriamo come democratici, repubblicani e indipendenti, ma soprattutto come americani. La Russia di Putin ha cercato di scuotere le fondamenta del mondo libero, ma ha sottovalutato la forza della gente ucraina. Insieme ai nostri alleati, stiamo imponendo sanzioni economiche, tagliando l'accesso ... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-13 | "\n\nQuesta sera, ci incontriamo come democratici, repubblicani e indipendenti, ma soprattutto come americani. La Russia di Putin ha cercato di scuotere le fondamenta del mondo libero, ma ha sottovalutato la forza della gente ucraina. Insieme ai nostri alleati, stiamo imponendo sanzioni economiche, tagliando l'accesso ... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
31a5a21f2f33-14 | 'output_text': "\n\nQuesta sera, ci incontriamo come democratici, repubblicani e indipendenti, ma soprattutto come americani. La Russia di Putin ha cercato di scuotere le fondamenta del mondo libero, ma ha sottovalutato la forza della gente ucraina. Insieme ai nostri alleati, stiamo imponendo sanzioni economiche, tagli... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/summarize.html |
e454e9323c55-0 | .ipynb
.pdf
Graph QA
Contents
Create the graph
Querying the graph
Save the graph
Graph QA#
This notebook goes over how to do question answering over a graph data structure.
Create the graph#
In this section, we construct an example graph. At the moment, this works best for small pieces of text.
from langchain.indexes... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/graph_qa.html |
e454e9323c55-1 | 'is the ground on which')]
Querying the graph#
We can now use the graph QA chain to ask questions of the graph
from langchain.chains import GraphQAChain
chain = GraphQAChain.from_llm(OpenAI(temperature=0), graph=graph, verbose=True)
chain.run("what is Intel going to build?")
> Entering new GraphQAChain chain...
Entities... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/graph_qa.html |
e454e9323c55-2 | By Harrison Chase
© Copyright 2023, Harrison Chase.
Last updated on Apr 18, 2023. | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/graph_qa.html |
3c853f83f78c-0 | .ipynb
.pdf
Retrieval Question Answering with Sources
Contents
Chain Type
Retrieval Question Answering with Sources#
This notebook goes over how to do question-answering with sources over an Index. It does this by using the RetrievalQAWithSourcesChain, which does the lookup of the documents from an Index.
from langch... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/vector_db_qa_with_sources.html |
3c853f83f78c-1 | 'sources': '31-pl'}
Chain Type#
You can easily specify different chain types to load and use in the RetrievalQAWithSourcesChain chain. For a more detailed walkthrough of these types, please see this notebook.
There are two ways to load different chain types. First, you can specify the chain type argument in the from_ch... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/vector_db_qa_with_sources.html |
3c853f83f78c-2 | {'answer': ' The president honored Justice Breyer for his service and mentioned his legacy of excellence.\n',
'sources': '31-pl'}
Last updated on Apr 18, ... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/vector_db_qa_with_sources.html |
4023424651da-0 | .ipynb
.pdf
Question Answering
Contents
Prepare Data
Quickstart
The stuff Chain
The map_reduce Chain
The refine Chain
The map-rerank Chain
Question Answering#
This notebook walks through how to use LangChain for question answering over a list of documents. It covers four different types of chains: stuff, map_reduce, ... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/question_answering.html |
4023424651da-1 | from langchain.llms import OpenAI
Quickstart#
If you just want to get started as quickly as possible, this is the recommended way to do it:
chain = load_qa_chain(OpenAI(temperature=0), chain_type="stuff")
query = "What did the president say about Justice Breyer"
chain.run(input_documents=docs, question=query)
' The pre... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/question_answering.html |
4023424651da-2 | chain({"input_documents": docs, "question": query}, return_only_outputs=True)
{'output_text': ' Il presidente ha detto che Justice Breyer ha dedicato la sua vita a servire questo paese e ha ricevuto una vasta gamma di supporto.'}
The map_reduce Chain#
This section shows the results of using the map_reduce Chain to do ques... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/question_answering.html |
4023424651da-3 | ' None',
' None'],
'output_text': ' The president said that Justice Breyer is an Army veteran, Constitutional scholar, and retiring Justice of the United States Supreme Court, and thanked him for his service.'}
Custom Prompts
You can also use your own prompts with this chain. In this example, we will respond in Ital... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/question_answering.html |
4023424651da-4 | chain({"input_documents": docs, "question": query}, return_only_outputs=True)
{'intermediate_steps': ["\nStasera vorrei onorare qualcuno che ha dedicato la sua vita a servire questo paese: il giustizia Stephen Breyer - un veterano dell'esercito, uno studioso costituzionale e un giustizia in uscita della Corte Suprema d... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/question_answering.html |
4023424651da-5 | chain({"input_documents": docs, "question": query}, return_only_outputs=True)
{'output_text': '\n\nThe president said that he wanted to honor Justice Breyer for his dedication to serving the country, his legacy of excellence, and his commitment to advancing liberty and justice, as well as for his support of the Equalit... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/question_answering.html |
4023424651da-6 | '\n\nThe president said that he wanted to honor Justice Breyer for his dedication to serving the country, his legacy of excellence, and his commitment to advancing liberty and justice, as well as for his support of the Equality Act and his commitment to protecting the rights of LGBTQ+ Americans. He also praised Justice... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/question_answering.html |
4023424651da-7 | template=refine_prompt_template,
)
initial_qa_template = (
"Context information is below. \n"
"---------------------\n"
"{context_str}"
"\n---------------------\n"
"Given the context information and not prior knowledge, "
"answer the question: {question}\nYour answer should be in Italian.\n"
)
i... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/question_answering.html |
4023424651da-8 | "\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha reso omaggio al suo servizio e ha sostenuto la nomina di una top litigatrice in pratica privata, un ex difensore pubblico federale e una famiglia di insegnanti e agenti di polizia delle scuole pubbliche. Ha anche sottol... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/question_answering.html |
4023424651da-9 | 'output_text': "\n\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha reso omaggio al suo servizio e ha sostenuto la nomina di una top litigatrice in pratica privata, un ex difensore pubblico federale e una famiglia di insegnanti e agenti di polizia delle scuole pubbliche... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/question_answering.html |
4023424651da-10 | {'answer': ' This document does not answer the question', 'score': '0'},
{'answer': ' This document does not answer the question', 'score': '0'},
{'answer': ' This document does not answer the question', 'score': '0'}]
Custom Prompts
You can also use your own prompts with this chain. In this example, we will respond ... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/question_answering.html |
4023424651da-11 | 'score': '100'},
{'answer': ' Il presidente non ha detto nulla sulla Giustizia Breyer.',
'score': '100'},
{'answer': ' Non so.', 'score': '0'},
{'answer': ' Non so.', 'score': '0'}],
'output_text': ' Il presidente ha detto che Justice Breyer ha dedicato la sua vita a servire questo paese.'}
Question ... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/question_answering.html |
1d966834b269-0 | .ipynb
.pdf
Chat Over Documents with Chat History
Contents
Return Source Documents
ConversationalRetrievalChain with search_distance
ConversationalRetrievalChain with map_reduce
ConversationalRetrievalChain with Question Answering with sources
ConversationalRetrievalChain with streaming to stdout
get_chat_history Fun... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/chat_vector_db.html |
1d966834b269-1 | Using DuckDB in-memory for database. Data will be transient.
We now initialize the ConversationalRetrievalChain
qa = ConversationalRetrievalChain.from_llm(OpenAI(temperature=0), vectorstore.as_retriever())
Here’s an example of asking a question with no chat history
chat_history = []
query = "What did the president say ... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/chat_vector_db.html |
1d966834b269-2 | result['source_documents'][0]
Document(page_content='Tonight. I call on the Senate to: Pass the Freedom to Vote Act. Pass the John Lewis Voting Rights Act. And while you’re at it, pass the Disclose Act so Americans can know who is funding our elections. \n\nTonight, I’d like to honor someone who has dedicated his life ... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/chat_vector_db.html |
1d966834b269-3 | from langchain.chains.question_answering import load_qa_chain
from langchain.chains.conversational_retrieval.prompts import CONDENSE_QUESTION_PROMPT
llm = OpenAI(temperature=0)
question_generator = LLMChain(llm=llm, prompt=CONDENSE_QUESTION_PROMPT)
doc_chain = load_qa_chain(llm, chain_type="map_reduce")
chain = Convers... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/chat_vector_db.html |
1d966834b269-4 | combine_docs_chain=doc_chain,
)
chat_history = []
query = "What did the president say about Ketanji Brown Jackson"
result = chain({"question": query, "chat_history": chat_history})
result['answer']
" The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private ... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/chat_vector_db.html |
1d966834b269-5 | chat_history = []
query = "What did the president say about Ketanji Brown Jackson"
result = qa({"question": query, "chat_history": chat_history})
The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, and from ... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/chat_vector_db.html |
1d966834b269-6 | result['answer']
" The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, and from a family of public school educators and police officers. He also said that she is a consensus builder and has received a broad r... | https://langchain-cn.readthedocs.io/en/latest/modules/chains/index_examples/chat_vector_db.html |
ba5d737408be-0 | .rst
.pdf
Chat Models
Chat Models#
Note
Conceptual Guide
Chat models are a variation on language models.
While chat models use language models under the hood, the interface they expose is a bit different.
Rather than expose a “text in, text out” API, they expose an interface where “chat messages” are the inputs and out... | https://langchain-cn.readthedocs.io/en/latest/modules/models/chat.html |
8c140444a1b8-0 | .rst
.pdf
Text Embedding Models
Text Embedding Models#
Note
Conceptual Guide
This documentation goes over how to use the Embedding class in LangChain.
The Embedding class is a class designed for interfacing with embeddings. There are lots of Embedding providers (OpenAI, Cohere, Hugging Face, etc) - this class is design... | https://langchain-cn.readthedocs.io/en/latest/modules/models/text_embedding.html |
421f5e70f519-0 | .rst
.pdf
LLMs (Large Language Models)
LLMs (Large Language Models)#
Note
Conceptual Guide
Large language models (LLMs) are a core component of LangChain.
LangChain does not provide LLMs itself; instead, it provides a standard interface through which you can interact with a wide variety of LLMs.
The documentation covers the following sections:
Getting Started: An overview of the features of LangChain's LLM class.
How-To Guides: A collection of guides focused on how to use our LLM class (streaming, async, etc.) to achieve various goals.
Integrations: A collection of examples showing how to integrate different LLM providers (OpenAI, Hugging Face, etc.) with LangChain.
Referen... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms.html |
07e8b513ff17-0 | .ipynb
.pdf
Getting Started
Getting Started#
This notebook goes over how to use the LLM class in LangChain.
The LLM class is designed for interfacing with LLMs (large language models). There are many LLM providers (OpenAI, Cohere, Hugging Face, etc.), and this class is designed to provide a common standard interface for all of them. In this part of the documentation, we focus on generic LLM functionality. For details and examples of working with a specific LLM wrapper, see the [How-To section](how_to_guides.rst).
In this notebook, we will use an OpenAI LLM wrapper, although all of the functionality highlighted is generic to all... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/getting_started.html |
07e8b513ff17-1 | len(llm_result.generations)
30
llm_result.generations[0]
[Generation(text='\n\nWhy did the chicken cross the road?\n\nTo get to the other side!'),
Generation(text='\n\nWhy did the chicken cross the road?\n\nTo get to the other side.')]
llm_result.generations[-1]
[Generation(text="\n\nWhat if love neverspeech\n\nWhat i... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/getting_started.html |
07e8b513ff17-2 | 'total_tokens': 4023,
'prompt_tokens': 120}}
Number of Tokens: You can also estimate how many tokens a piece of text will be for a given model. This is useful because models have a context length (and cost more for more tokens), which means you need to know how long the text you are passing in is.
Note that by default, tokens are estimated using tiktoken (except for Python versions below 3.8, where a Hugging Face tokenizer is used instead).
llm.get_num_tokens("what a joke")
3
© Copyright 2023, Harrison... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/getting_started.html |
5d7f0621cfd3-0 | .rst
.pdf
Generic Functionality
Generic Functionality#
A collection of examples on how to use LLMs.
How to use the async API for LLMs
How to write a custom LLM wrapper
How (and why) to use the fake LLM
How to cache LLM calls
How to serialize LLM classes
How to stream responses from LLMs and Chat Models
How to track token usage
© Copyright 2023, Harrison Chase... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/how_to_guides.html |
4efca2c03f4c-0 | .rst
.pdf
Integrations
Integrations#
The examples here are all “how-to” guides for how to integrate with various LLM providers.
AI21
Aleph Alpha
Anthropic
Azure OpenAI LLM Example
Banana
CerebriumAI LLM Example
Cohere
DeepInfra LLM Example
ForefrontAI LLM Example
GooseAI LLM Example
GPT4All
Hugging Face Hub
Llama-cpp
M... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations.html |
4d89ee47a374-0 | .ipynb
.pdf
Banana
Banana#
This example goes over how to use LangChain to interact with Banana models
import os
from langchain.llms import Banana
from langchain import PromptTemplate, LLMChain
os.environ["BANANA_API_KEY"] = "YOUR_API_KEY"
template = """Question: {question}
Answer: Let's think step by step."""
prompt = ... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/banana.html |
faa8ca0ec386-0 | .ipynb
.pdf
Hugging Face Hub
Hugging Face Hub#
This example showcases how to connect to the Hugging Face Hub.
from langchain import PromptTemplate, HuggingFaceHub, LLMChain
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate(template=template, input_variables=["question"])
ll... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/huggingface_hub.html |
7be991750fd2-0 | .ipynb
.pdf
Replicate
Contents
Setup
Calling a model
Chaining Calls
Replicate#
This example goes over how to use LangChain to interact with Replicate models
import os
from langchain.llms import Replicate
from langchain import PromptTemplate, LLMChain
os.environ["REPLICATE_API_TOKEN"] = "YOUR REPLICATE API TOKEN"
Setu... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/replicate.html |
7be991750fd2-1 | prompt = """
Answer the following yes/no question by reasoning step by step.
Can a dog drive a car?
"""
llm(prompt)
'The legal driving age of dogs is 2. Cars are designed for humans to drive. Therefore, the final answer is yes.'
We can call any replicate model using this syntax. For example, we can call stable diffusi... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/replicate.html |
7be991750fd2-2 | text2image = Replicate(model="stability-ai/stable-diffusion:db21e45d3f7023abc2a46ee38a23973f6dce16bb082a930b0c49861f96d1e5bf")
First prompt in the chain
prompt = PromptTemplate(
input_variables=["product"],
template="What is a good name for a company that makes {product}?",
)
chain = LLMChain(llm=llm, prompt=pr... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/replicate.html |
7be991750fd2-3 | > Finished chain.
https://replicate.delivery/pbxt/BedAP1PPBwXFfkmeD7xDygXO4BcvApp1uvWOwUdHM4tcQfvCB/out-0.png
response = requests.get("https://replicate.delivery/pbxt/eq6foRJngThCAEBqse3nL3Km2MBfLnWQNd0Hy2SQRo2LuprCB/out-0.png")
img = Image.open(BytesIO(response.content))
img
SageMakerE... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/replicate.html |
c6ca641a2123-0 | .ipynb
.pdf
StochasticAI
StochasticAI#
This example goes over how to use LangChain to interact with StochasticAI models
from langchain.llms import StochasticAI
from langchain import PromptTemplate, LLMChain
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate(template=template... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/stochasticai.html |
d91097bc895c-0 | .ipynb
.pdf
DeepInfra LLM Example
Contents
Imports
Set the Environment API Key
Create the DeepInfra instance
Create a Prompt Template
Initiate the LLMChain
Run the LLMChain
DeepInfra LLM Example#
This notebook goes over how to use Langchain with DeepInfra.
Imports#
import os
from langchain.llms import DeepInfra
from ... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/deepinfra_example.html |
489393408d76-0 | .ipynb
.pdf
ForefrontAI LLM Example
Contents
Imports
Set the Environment API Key
Create the ForefrontAI instance
Create a Prompt Template
Initiate the LLMChain
Run the LLMChain
ForefrontAI LLM Example#
This notebook goes over how to use Langchain with ForefrontAI.
Imports#
import os
from langchain.llms import Forefro... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/forefrontai_example.html |
6059dd0eb6ab-0 | .ipynb
.pdf
Anthropic
Anthropic#
This example goes over how to use LangChain to interact with Anthropic models
from langchain.llms import Anthropic
from langchain import PromptTemplate, LLMChain
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate(template=template, input_vari... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/anthropic_example.html |
f8601ef3942e-0 | .ipynb
.pdf
PromptLayer OpenAI
Contents
Install PromptLayer
Imports
Set the Environment API Key
Use the PromptLayerOpenAI LLM like normal
Using PromptLayer Track
PromptLayer OpenAI#
This example showcases how to connect to PromptLayer to start recording your OpenAI requests.
Install PromptLayer#
The promptlayer packa... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/promptlayer_openai.html |
f8601ef3942e-1 | for res in llm_results.generations:
pl_request_id = res[0].generation_info["pl_request_id"]
promptlayer.track.score(request_id=pl_request_id, score=100)
Using this allows you to track the performance of your model in the PromptLayer dashboard. If you are using a prompt template, you can attach a template to a r... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/promptlayer_openai.html |
4b970ef57330-0 | .ipynb
.pdf
Petals LLM Example
Contents
Install petals
Imports
Set the Environment API Key
Create the Petals instance
Create a Prompt Template
Initiate the LLMChain
Run the LLMChain
Petals LLM Example#
This notebook goes over how to use Langchain with Petals.
Install petals#
The petals package is required to use the ... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/petals_example.html |
415d1f183b08-0 | .ipynb
.pdf
Modal
Modal#
This example goes over how to use LangChain to interact with Modal models
from langchain.llms import Modal
from langchain import PromptTemplate, LLMChain
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate(template=template, input_variables=["question... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/modal.html |
d65befe845ec-0 | .ipynb
.pdf
Manifest
Contents
Compare HF Models
Manifest#
This notebook goes over how to use Manifest and LangChain.
For more detailed information on Manifest, and how to use it with local Hugging Face models as in this example, see https://github.com/HazyResearch/manifest
from manifest import Manifest
from langchain... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/manifest.html |
state_of_the_union = f.read()
mp_chain.run(state_of_the_union)
'President Obama delivered his annual State of the Union address on Tuesday night, laying out his priorities for the coming year. Obama said the government will provide free flu vaccines to all Americans, ending the government shutdown and allowing business... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/manifest.html |
)
manifest3 = ManifestWrapper(
client=Manifest(
client_name="huggingface",
client_connection="http://127.0.0.1:5002"
),
llm_kwargs={"temperature": 0.01}
)
llms = [manifest1, manifest2, manifest3]
model_lab = ModelLaboratory(llms)
model_lab.compare("What color is a flamingo?")
Input:
What col... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/manifest.html |
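`ModelLaboratory.compare` essentially runs a single prompt through every model and collects the outputs side by side. A minimal sketch of that idea, using plain callables in place of real LLMs (the `compare` helper below is illustrative, not the LangChain API):

```python
def compare(llms: dict, prompt: str) -> dict:
    """Toy version of ModelLaboratory.compare: one prompt through several models."""
    return {name: llm(prompt) for name, llm in llms.items()}

# Plain callables stand in for the manifest1/manifest2/manifest3 wrappers above.
llms = {
    "upper": lambda p: p.upper(),
    "lower": lambda p: p.lower(),
}
out = compare(llms, "What color is a flamingo?")
assert out["upper"] == "WHAT COLOR IS A FLAMINGO?"
assert out["lower"] == "what color is a flamingo?"
```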
.ipynb
.pdf
GooseAI LLM Example
Contents
Install openai
Imports
Set the Environment API Key
Create the GooseAI instance
Create a Prompt Template
Initiate the LLMChain
Run the LLMChain
GooseAI LLM Example#
This notebook goes over how to use LangChain with GooseAI.
Install openai#
The openai package is required to use ... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/gooseai_example.html |
.ipynb
.pdf
Writer
Writer#
This example goes over how to use LangChain to interact with Writer models
from langchain.llms import Writer
from langchain import PromptTemplate, LLMChain
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate(template=template, input_variables=["ques... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/writer.html |
.ipynb
.pdf
Cohere
Cohere#
This example goes over how to use LangChain to interact with Cohere models
from langchain.llms import Cohere
from langchain import PromptTemplate, LLMChain
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate(template=template, input_variables=["ques... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/cohere.html |
llm_chain.run(question)
" Let's start with the year that Justin Beiber was born. You know that he was born in 1994. We have to go back one year. 1993.\n\n1993 was the year that the Dallas Cowboys won the Super Bowl. They won over the Buffalo Bills in Super Bowl 26.\n\nNow, let's do it backwards. According to our inform... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/cohere.html |
.ipynb
.pdf
Llama-cpp
Llama-cpp#
This notebook goes over how to run llama-cpp within LangChain
!pip install llama-cpp-python
from langchain.llms import LlamaCpp
from langchain import PromptTemplate, LLMChain
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate(template=templat... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/llamacpp.html |
.ipynb
.pdf
GPT4All
Contents
Specify Model
GPT4All#
This example goes over how to use LangChain to interact with GPT4All models
%pip install pyllamacpp > /dev/null
from langchain import PromptTemplate, LLMChain
from langchain.llms import GPT4All
from langchain.callbacks.base import CallbackManager
from langchain.call... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/gpt4all.html |
# # This is a large file, so be prepared to wait.
# with open(local_path, 'wb') as f:
# for chunk in tqdm(response.iter_content(chunk_size=8192)):
# if chunk:
# f.write(chunk)
# Callbacks support token-wise streaming
callback_manager = CallbackManager([StreamingStdOutCallbackHandler()])
# Verbos... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/gpt4all.html |
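Token-wise streaming works by having the LLM invoke a callback for each newly generated token. A minimal sketch of that handler interface, assuming only an `on_llm_new_token` hook (the class below is a toy stand-in, not LangChain's `StreamingStdOutCallbackHandler`):

```python
class ToyStreamingHandler:
    """Toy streaming callback handler: collects and echoes tokens as they arrive."""
    def __init__(self):
        self.tokens = []

    def on_llm_new_token(self, token: str) -> None:
        # Called once per generated token, enabling incremental display.
        self.tokens.append(token)
        print(token, end="", flush=True)

handler = ToyStreamingHandler()
for token in ["Paris", " is", " the", " capital", "."]:
    handler.on_llm_new_token(token)
print()
assert "".join(handler.tokens) == "Paris is the capital."
```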
.ipynb
.pdf
Azure OpenAI LLM Example
Contents
API configuration
Deployments
Azure OpenAI LLM Example#
This notebook goes over how to use LangChain with Azure OpenAI.
The Azure OpenAI API is compatible with OpenAI’s API. The openai Python package makes it easy to use both OpenAI and Azure OpenAI. You can call Azure ... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/azure_openai_example.html |
import openai
response = openai.Completion.create(
engine="text-davinci-002-prod",
prompt="This is a test",
max_tokens=5
)
# Import Azure OpenAI
from langchain.llms import AzureOpenAI
# Create an instance of Azure OpenAI
# Replace the deployment name with your own
llm = AzureOpenAI(deployment_name="text-dav... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/azure_openai_example.html |
.ipynb
.pdf
SageMakerEndpoint
SageMakerEndpoint#
This notebook goes over how to use an LLM hosted on a SageMaker endpoint.
!pip3 install langchain boto3
from langchain.docstore.document import Document
example_doc_1 = """
Peter and Elizabeth took a taxi to attend the night party in the city. While in the party, Elizab... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/sagemaker.html |
return response_json[0]["generated_text"]
content_handler = ContentHandler()
chain = load_qa_chain(
llm=SagemakerEndpoint(
endpoint_name="endpoint-name",
credentials_profile_name="credentials-profile-name",
region_name="us-west-2",
model_kwargs={"temperature":1e-10},
conte... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/sagemaker.html |
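The content handler's job is to serialize the prompt into the request body the endpoint expects and to parse the endpoint's JSON reply back into text. A minimal sketch, under the assumption that the endpoint speaks a `{"inputs": ...}` request and `[{"generated_text": ...}]` response shape (`ContentHandlerSketch` is illustrative, not the LangChain base class):

```python
import json

class ContentHandlerSketch:
    """Illustrative content handler: prompt -> request bytes, response bytes -> text."""
    content_type = "application/json"
    accepts = "application/json"

    def transform_input(self, prompt: str, model_kwargs: dict) -> bytes:
        # Serialize the prompt and model kwargs into the endpoint's request body.
        return json.dumps({"inputs": prompt, **model_kwargs}).encode("utf-8")

    def transform_output(self, output: bytes) -> str:
        # Parse the endpoint's JSON reply back into plain text.
        response_json = json.loads(output.decode("utf-8"))
        return response_json[0]["generated_text"]

handler = ContentHandlerSketch()
body = handler.transform_input("Hello", {"temperature": 1e-10})
assert json.loads(body)["inputs"] == "Hello"
fake_endpoint_reply = json.dumps([{"generated_text": "Hi there"}]).encode("utf-8")
assert handler.transform_output(fake_endpoint_reply) == "Hi there"
```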
.ipynb
.pdf
Aleph Alpha
Aleph Alpha#
This example goes over how to use LangChain to interact with Aleph Alpha models
from langchain.llms import AlephAlpha
from langchain import PromptTemplate, LLMChain
template = """Q: {question}
A:"""
prompt = PromptTemplate(template=template, input_variables=["question"])
llm = Aleph... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/aleph_alpha.html |
.ipynb
.pdf
Self-Hosted Models via Runhouse
Self-Hosted Models via Runhouse#
This example goes over how to use LangChain and Runhouse to interact with models hosted on your own GPU, or on-demand GPUs on AWS, GCP, Azure, or Lambda.
For more information, see Runhouse or the Runhouse docs.
from langchain.llms import SelfHos... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/self_hosted_examples.html |
INFO | 2023-02-17 05:42:24,016 | Time to send message: 0.48 seconds
"\n\nLet's say we're talking sports teams who won the Super Bowl in the year Justin Beiber"
You can also load more custom models through the SelfHostedHuggingFaceLLM interface:
llm = SelfHostedHuggingFaceLLM(
model_id="google/flan-t5-small",
ta... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/self_hosted_examples.html |
llm("Who is the current US president?")
INFO | 2023-02-17 05:42:59,219 | Running _generate_text via gRPC
INFO | 2023-02-17 05:42:59,522 | Time to send message: 0.3 seconds
'john w. bush'
You can send your pipeline directly over the wire to your model, but this will only work for small models (<2 Gb), and will be pretty... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/self_hosted_examples.html |
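Because the pipeline is serialized before being sent over the wire, its byte size is what determines whether shipping it is practical. A quick way to check, sketched with the standard pickle module and a small stand-in payload (real model pipelines are far larger):

```python
import pickle

# Hypothetical stand-in for a pipeline object; any picklable object works here.
payload = {"weights": list(range(10_000))}
size_bytes = len(pickle.dumps(payload))
print(f"serialized size: {size_bytes / 1024:.1f} KiB")
# Only ship objects well under the ~2 GB practical limit mentioned above.
assert size_bytes < 2 * 1024**3
```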
.ipynb
.pdf
OpenAI
OpenAI#
This example goes over how to use LangChain to interact with OpenAI models
from langchain.llms import OpenAI
from langchain import PromptTemplate, LLMChain
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate(template=template, input_variables=["ques... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/openai.html |
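The PromptTemplate pattern used throughout these examples is, at its core, keyword substitution into a template string. A pure-Python sketch of the idea (`MiniPromptTemplate` is illustrative, not the LangChain class):

```python
class MiniPromptTemplate:
    """Illustrative template: fills named slots in a template string."""
    def __init__(self, template: str, input_variables: list):
        self.template = template
        self.input_variables = input_variables

    def format(self, **kwargs) -> str:
        return self.template.format(**kwargs)

template = """Question: {question}

Answer: Let's think step by step."""
prompt = MiniPromptTemplate(template=template, input_variables=["question"])
text = prompt.format(
    question="What NFL team won the Super Bowl in the year Justin Bieber was born?"
)
assert text.startswith("Question: What NFL team")
assert text.endswith("Let's think step by step.")
```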
.ipynb
.pdf
AI21
AI21#
This example goes over how to use LangChain to interact with AI21 models
from langchain.llms import AI21
from langchain import PromptTemplate, LLMChain
template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate(template=template, input_variables=["question"])
... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/ai21.html |
.ipynb
.pdf
CerebriumAI LLM Example
Contents
Install cerebrium
Imports
Set the Environment API Key
Create the CerebriumAI instance
Create a Prompt Template
Initiate the LLMChain
Run the LLMChain
CerebriumAI LLM Example#
This notebook goes over how to use LangChain with CerebriumAI.
Install cerebrium#
The cerebrium pa... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/integrations/cerebriumai_example.html |
llm_chain.run(question)
.ipynb
.pdf
How to cache LLM calls
Contents
In Memory Cache
SQLite Cache
Redis Cache
GPTCache
SQLAlchemy Cache
Custom SQLAlchemy Schemas
Optional Caching
Optional Caching in Chains
How to cache LLM calls#
This notebook covers how to cache results of individual LLM calls.
from langchain.llms import OpenAI
In Memory Ca... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/examples/llm_caching.html |
llm("Tell me a joke")
CPU times: user 17 ms, sys: 9.76 ms, total: 26.7 ms
Wall time: 825 ms
'\n\nWhy did the chicken cross the road?\n\nTo get to the other side.'
%%time
# The second time it is, so it goes faster
llm("Tell me a joke")
CPU times: user 2.46 ms, sys: 1.23 ms, total: 3.7 ms
Wall time: 2.67 ms
'\n\nWhy did ... | https://langchain-cn.readthedocs.io/en/latest/modules/models/llms/examples/llm_caching.html |
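The speedup shown in the timings above comes from memoization: the first call pays the model latency, and repeat calls with the same prompt return the stored result. A self-contained sketch of the same idea using `functools.lru_cache` and a fake LLM (no network or API key involved):

```python
import functools
import time

@functools.lru_cache(maxsize=None)
def fake_llm(prompt: str) -> str:
    time.sleep(0.05)  # simulate network latency on a cache miss
    return f"response to: {prompt}"

start = time.perf_counter()
first = fake_llm("Tell me a joke")   # cold: pays the simulated latency
cold = time.perf_counter() - start

start = time.perf_counter()
second = fake_llm("Tell me a joke")  # warm: served from the cache
warm = time.perf_counter() - start

assert first == second
assert warm < cold
```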