| id | source | text |
|---|---|---|
d59c2a0ac5ba-0 | https://python.langchain.com/en/latest/modules/prompts/prompt_templates/examples/custom_prompt_template.html |
How to create a custom prompt template
Contents
Why are custom prompt templates needed?
Creating a Custom Prompt Template
Use the custom prompt template
How to create a custom prompt template#
Let’s suppose we want the LLM to generate English language explanations of a function given its name. To achieve ... |
d59c2a0ac5ba-1 | https://python.langchain.com/en/latest/modules/prompts/prompt_templates/examples/custom_prompt_template.html | return inspect.getsource(function_name)
Next, we’ll create a custom prompt template that takes in the function name as input, and formats the prompt template to provide the source code of the function.
from langchain.prompts import StringPromptTemplate
from pydantic import BaseModel, validator
class FunctionExplainerPr... |
d59c2a0ac5ba-2 | https://python.langchain.com/en/latest/modules/prompts/prompt_templates/examples/custom_prompt_template.html | Function Name: get_source_code
Source Code:
def get_source_code(function_name):
# Get the source code of the function
return inspect.getsource(function_name)
Explanation:
Con... |
cbe51a8d26c8-0 | https://python.langchain.com/en/latest/modules/chains/how_to_guides.html |
How-To Guides
How-To Guides#
A chain is made up of links, which can be either primitives or other chains.
Primitives can be prompts, models, arbitrary functions, or other chains.
The examples here are broken up into three sections:
Generic Functionality
Covers both generic chains (that are useful in a ... |
2a283c13ea6b-0 | https://python.langchain.com/en/latest/modules/chains/getting_started.html |
Getting Started
Contents
Why do we need chains?
Quick start: Using LLMChain
Different ways of calling chains
Add memory to chains
Debug Chain
Combine chains with the SequentialChain
Create a custom chain with the Chain class
Getting Started#
In this tutorial, we will learn about creating simple chains in ... |
2a283c13ea6b-1 | https://python.langchain.com/en/latest/modules/chains/getting_started.html | If there are multiple variables, you can input them all at once using a dictionary.
prompt = PromptTemplate(
input_variables=["company", "product"],
template="What is a good name for {company} that makes {product}?",
)
chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run({
'company': "ABC Startup",
... |
2a283c13ea6b-2 | https://python.langchain.com/en/latest/modules/chains/getting_started.html | By default, __call__ returns both the input and output key values. You can configure it to only return output key values by setting return_only_outputs to True.
llm_chain("corny", return_only_outputs=True)
{'text': 'Why did the tomato turn red? Because it saw the salad dressing!'}
If the Chain only outputs one output k... |
2a283c13ea6b-3 | https://python.langchain.com/en/latest/modules/chains/getting_started.html | # -> The first three colors of a rainbow are red, orange, and yellow.
conversation.run("And the next 4?")
# -> The next four colors of a rainbow are green, blue, indigo, and violet.
'The next four colors of a rainbow are green, blue, indigo, and violet.'
Essentially, BaseMemory defines an interface of how langchain sto... |
2a283c13ea6b-4 | https://python.langchain.com/en/latest/modules/chains/getting_started.html | The next step after calling a language model is to make a series of calls to a language model. We can do this using sequential chains, which are chains that execute their links in a predefined order. Specifically, we will use the SimpleSequentialChain. This is the simplest type of a sequential chain, where each step ha... |
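The defining property of a SimpleSequentialChain — each step has a single input and a single output, and the output of one link becomes the input of the next — can be sketched in a few lines (the class and the toy link functions here are illustrative, not langchain's implementation):

```python
class MiniSimpleSequentialChain:
    """Sketch of SimpleSequentialChain: single-string-in, single-string-out links."""
    def __init__(self, chains):
        self.chains = chains

    def run(self, text: str) -> str:
        for chain in self.chains:
            text = chain(text)  # output of one link is the input of the next
        return text

# Toy links standing in for LLMChains.
name_chain = lambda product: f"Funky {product.title()} Co."
slogan_chain = lambda name: f"{name}: brighten up your day!"

chain = MiniSimpleSequentialChain([name_chain, slogan_chain])
print(chain.run("colorful socks"))
# -> 'Funky Colorful Socks Co.: brighten up your day!'
```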
2a283c13ea6b-5 | https://python.langchain.com/en/latest/modules/chains/getting_started.html | In order to create a custom chain:
Start by subclassing the Chain class,
Fill out the input_keys and output_keys properties,
Add the _call method that shows how to execute the chain.
These steps are demonstrated in the example below:
from langchain.chains import LLMChain
from langchain.chains.base import Chain
from typ... |
2a283c13ea6b-6 | https://python.langchain.com/en/latest/modules/chains/getting_started.html | concat_output = concat_chain.run("colorful socks")
print(f"Concatenated output:\n{concat_output}")
Concatenated output:
Funky Footwear Company
"Brighten Up Your Day with Our Colorful Socks!"
That’s it! For more details about how to do cool things with Chains, check out the how-to guide for chains.
... |
75c99154e29d-0 | https://python.langchain.com/en/latest/modules/chains/index_examples/graph_qa.html |
Graph QA
Contents
Create the graph
Querying the graph
Save the graph
Graph QA#
This notebook goes over how to do question answering over a graph data structure.
Create the graph#
In this section, we construct an example graph. At the moment, this works best for small pieces of text.
from langchain.indexes... |
75c99154e29d-1 | https://python.langchain.com/en/latest/modules/chains/index_examples/graph_qa.html | We can now use the graph QA chain to ask questions of the graph.
from langchain.chains import GraphQAChain
chain = GraphQAChain.from_llm(OpenAI(temperature=0), graph=graph, verbose=True)
chain.run("what is Intel going to build?")
> Entering new GraphQAChain chain...
Entities Extracted:
Intel
Full Context:
Intel is going... |
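The trace above shows the two retrieval steps GraphQAChain performs before answering: extract entities from the question, then collect the knowledge triples touching those entities as context. A toy sketch of that flow (the triples and the naive entity extractor here are invented for illustration):

```python
def extract_entities(question: str, known_entities) -> set:
    """Toy entity extractor: pick known graph entities mentioned in the question."""
    return {e for e in known_entities if e.lower() in question.lower()}

def graph_qa_context(question: str, triples) -> list:
    """Sketch of GraphQAChain's retrieval step: gather triples for the entities."""
    subjects = {t[0] for t in triples}
    entities = extract_entities(question, subjects)
    return [t for t in triples if t[0] in entities]

triples = [
    ("Intel", "is going to build", "a $20 billion semiconductor mega site"),
    ("Intel", "is building", "state-of-the-art factories"),
]
context = graph_qa_context("what is Intel going to build?", triples)
print(context)
```

The collected triples would then be passed to the LLM as the "Full Context" shown in the trace.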
8dac55b8912b-0 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_qa_with_sources.html |
Retrieval Question Answering with Sources
Contents
Chain Type
Retrieval Question Answering with Sources#
This notebook goes over how to do question-answering with sources over an Index. It does this by using the RetrievalQAWithSourcesChain, which does the lookup of the documents from an Index.
from langch... |
8dac55b8912b-1 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_qa_with_sources.html | You can easily specify different chain types to load and use in the RetrievalQAWithSourcesChain chain. For a more detailed walkthrough of these types, please see this notebook.
There are two ways to load different chain types. First, you can specify the chain type argument in the from_chain_type method. This allows you... |
8dac55b8912b-2 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_qa_with_sources.html | 'sources': '31-pl'}
By Harrison Chase
© Copyright 2023, Harrison Chase.
Last updated on Jun 04, 2023. |
f4f4bd6e9e73-0 | https://python.langchain.com/en/latest/modules/chains/index_examples/chat_vector_db.html |
Chat Over Documents with Chat History
Contents
Pass in chat history
Using a different model for condensing the question
Return Source Documents
ConversationalRetrievalChain with search_distance
ConversationalRetrievalChain with map_reduce
ConversationalRetrievalChain with Question Answering with sources
C... |
f4f4bd6e9e73-1 | https://python.langchain.com/en/latest/modules/chains/index_examples/chat_vector_db.html | We can now create a memory object, which is necessary to track the inputs/outputs and hold a conversation.
from langchain.memory import ConversationBufferMemory
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
We now initialize the ConversationalRetrievalChain
qa = ConversationalRetri... |
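What the memory object contributes can be sketched in plain Python. `MiniBufferMemory` below is a made-up stand-in whose method names only loosely mirror `ConversationBufferMemory` (the real class takes input/output dicts); it shows the core mechanic — turns are accumulated and exposed under the configured `memory_key`:

```python
class MiniBufferMemory:
    """Sketch of a conversation buffer: stores turns under a memory_key."""
    def __init__(self, memory_key: str = "chat_history"):
        self.memory_key = memory_key
        self._turns = []

    def load_memory_variables(self) -> dict:
        # What the chain would inject into the prompt as {chat_history}.
        return {self.memory_key: list(self._turns)}

    def save_context(self, question: str, answer: str) -> None:
        self._turns.append((question, answer))

memory = MiniBufferMemory(memory_key="chat_history")
memory.save_context(
    "What did the president say about Ketanji Brown Jackson",
    "He called her a consensus builder.",
)
print(memory.load_memory_variables())
```

On the next call, the chain reads this history back to condense the follow-up question into a standalone one.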
f4f4bd6e9e73-2 | https://python.langchain.com/en/latest/modules/chains/index_examples/chat_vector_db.html | " The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, and from a family of public school educators and police officers. He also said that she is a consensus builder and has received a broad range of support f... |
f4f4bd6e9e73-3 | https://python.langchain.com/en/latest/modules/chains/index_examples/chat_vector_db.html | result = qa({"question": query, "chat_history": chat_history})
chat_history = [(query, result["answer"])]
query = "Did he mention who she succeeded"
result = qa({"question": query, "chat_history": chat_history})
Return Source Documents#
You can also easily return source documents from the ConversationalRetrievalChain. T... |
f4f4bd6e9e73-4 | https://python.langchain.com/en/latest/modules/chains/index_examples/chat_vector_db.html | vectordbkwargs = {"search_distance": 0.9}
qa = ConversationalRetrievalChain.from_llm(OpenAI(temperature=0), vectorstore.as_retriever(), return_source_documents=True)
chat_history = []
query = "What did the president say about Ketanji Brown Jackson"
result = qa({"question": query, "chat_history": chat_history, "vectordb... |
f4f4bd6e9e73-5 | https://python.langchain.com/en/latest/modules/chains/index_examples/chat_vector_db.html | You can also use this chain with the question answering with sources chain.
from langchain.chains.qa_with_sources import load_qa_with_sources_chain
llm = OpenAI(temperature=0)
question_generator = LLMChain(llm=llm, prompt=CONDENSE_QUESTION_PROMPT)
doc_chain = load_qa_with_sources_chain(llm, chain_type="map_reduce")
cha... |
f4f4bd6e9e73-6 | https://python.langchain.com/en/latest/modules/chains/index_examples/chat_vector_db.html | streaming_llm = OpenAI(streaming=True, callbacks=[StreamingStdOutCallbackHandler()], temperature=0)
question_generator = LLMChain(llm=llm, prompt=CONDENSE_QUESTION_PROMPT)
doc_chain = load_qa_chain(streaming_llm, chain_type="stuff", prompt=QA_PROMPT)
qa = ConversationalRetrievalChain(
retriever=vectorstore.as_retri... |
f4f4bd6e9e73-7 | https://python.langchain.com/en/latest/modules/chains/index_examples/chat_vector_db.html | query = "What did the president say about Ketanji Brown Jackson"
result = qa({"question": query, "chat_history": chat_history})
result['answer']
" The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, and from ... |
95359b756a1e-0 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html |
Question Answering with Sources
Contents
Prepare Data
Quickstart
The stuff Chain
The map_reduce Chain
The refine Chain
The map-rerank Chain
Question Answering with Sources#
This notebook walks through how to use LangChain for question answering with sources over a list of documents. It covers four differe... |
95359b756a1e-1 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | from langchain.llms import OpenAI
Quickstart#
If you just want to get started as quickly as possible, this is the recommended way to do it:
chain = load_qa_with_sources_chain(OpenAI(temperature=0), chain_type="stuff")
query = "What did the president say about Justice Breyer"
chain({"input_documents": docs, "question": ... |
95359b756a1e-2 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | chain = load_qa_with_sources_chain(OpenAI(temperature=0), chain_type="stuff", prompt=PROMPT)
query = "What did the president say about Justice Breyer"
chain({"input_documents": docs, "question": query}, return_only_outputs=True)
{'output_text': '\nNon so cosa abbia detto il presidente riguardo a Justice Breyer.\nSOURCE... |
95359b756a1e-3 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | You can also use your own prompts with this chain. In this example, we will respond in Italian.
question_prompt_template = """Use the following portion of a long document to see if any of the text is relevant to answer the question.
Return any relevant text in Italian.
{context}
Question: {question}
Relevant text, if ... |
95359b756a1e-4 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | ' Non rilevante.',
" Non c'è testo pertinente."],
'output_text': ' Non conosco la risposta. SOURCES: 30, 31, 33, 20.'}
Batch Size
When using the map_reduce chain, one thing to keep in mind is the batch size you are using during the map step. If this is too high, it could cause rate limiting errors. You can control t... |
95359b756a1e-5 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | {'output_text': "\n\nThe president said that he was honoring Justice Breyer for his dedication to serving the country and that he was a retiring Justice of the United States Supreme Court. He also thanked him for his service and praised his career as a top litigator in private practice, a former federal public defender... |
95359b756a1e-6 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | '\n\nThe president said that he was honoring Justice Breyer for his dedication to serving the country and that he was a retiring Justice of the United States Supreme Court. He also thanked Justice Breyer for his service, noting his background as a top litigator in private practice, a former federal public defender, and... |
95359b756a1e-7 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | '\n\nThe president said that he was honoring Justice Breyer for his dedication to serving the country and that he was a retiring Justice of the United States Supreme Court. He also thanked Justice Breyer for his service, noting his background as a top litigator in private practice, a former federal public defender, and... |
95359b756a1e-8 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | 'output_text': '\n\nThe president said that he was honoring Justice Breyer for his dedication to serving the country and that he was a retiring Justice of the United States Supreme Court. He also thanked Justice Breyer for his service, noting his background as a top litigator in private practice, a former federal publi... |
95359b756a1e-9 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | "If you do update it, please update the sources as well. "
"If the context isn't useful, return the original answer."
)
refine_prompt = PromptTemplate(
input_variables=["question", "existing_answer", "context_str"],
template=refine_template,
)
question_template = (
"Context information is below. \n"
... |
95359b756a1e-10 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | "\n\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha onorato la sua carriera e ha contribuito a costruire un consenso. Ha ricevuto un ampio sostegno, dall'Ordine Fraterno della Polizia a ex giudici nominati da democratici e repubblicani. Inoltre, ha sottolineato l'impor... |
95359b756a1e-11 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | "\n\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha onorato la sua carriera e ha contribuito a costruire un consenso. Ha ricevuto un ampio sostegno, dall'Ordine Fraterno della Polizia a ex giudici nominati da democratici e repubblicani. Inoltre, ha sottolineato l'impor... |
95359b756a1e-12 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | "\n\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha onorato la sua carriera e ha contribuito a costruire un consenso. Ha ricevuto un ampio sostegno, dall'Ordine Fraterno della Polizia a ex giudici nominati da democratici e repubblicani. Inoltre, ha sottolineato l'impor... |
95359b756a1e-13 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | 'output_text': "\n\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha onorato la sua carriera e ha contribuito a costruire un consenso. Ha ricevuto un ampio sostegno, dall'Ordine Fraterno della Polizia a ex giudici nominati da democratici e repubblicani. Inoltre, ha sotto... |
95359b756a1e-14 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | {'answer': ' This document does not answer the question', 'score': '0'},
{'answer': ' This document does not answer the question', 'score': '0'},
{'answer': ' This document does not answer the question', 'score': '0'}]
Custom Prompts
You can also use your own prompts with this chain. In this example, we will respond ... |
95359b756a1e-15 | https://python.langchain.com/en/latest/modules/chains/index_examples/qa_with_sources.html | 'intermediate_steps': [{'answer': ' Il presidente ha detto che Justice Breyer ha dedicato la sua vita a servire questo paese e ha onorato la sua carriera.',
'score': '100'},
{'answer': ' Il presidente non ha detto nulla sulla Giustizia Breyer.',
'score': '100'},
{'answer': ' Non so.', 'score': '0'},
{'answe... |
234efdcac7cd-0 | https://python.langchain.com/en/latest/modules/chains/index_examples/analyze_document.html |
Analyze Document
Contents
Summarize
Question Answering
Analyze Document#
The AnalyzeDocumentChain can be used as an end-to-end chain: it takes in a single document, splits it up, and then runs the pieces through a CombineDocumentsChain.
with open("../../state_of_th... |
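The split-then-combine flow can be sketched without langchain. `MiniAnalyzeDocumentChain` below is a made-up stand-in: the splitter and the combine step are toy functions standing in for a text splitter and a CombineDocumentsChain:

```python
class MiniAnalyzeDocumentChain:
    """Sketch of AnalyzeDocumentChain: split one long document, then combine."""
    def __init__(self, split_fn, combine_fn):
        self.split_fn = split_fn
        self.combine_fn = combine_fn

    def run(self, input_document: str, **kwargs) -> str:
        chunks = self.split_fn(input_document)
        return self.combine_fn(chunks, **kwargs)

# Toy splitter and toy "QA over chunks" combine step.
split_on_paragraphs = lambda text: [p for p in text.split("\n\n") if p]
first_match = lambda chunks, question: next(
    (c for c in chunks if "Breyer" in c), "No answer found."
)

chain = MiniAnalyzeDocumentChain(split_on_paragraphs, first_match)
doc = "Opening remarks.\n\nThe president thanked Justice Breyer for his service."
answer = chain.run(input_document=doc, question="what about justice breyer?")
print(answer)
```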
234efdcac7cd-1 | https://python.langchain.com/en/latest/modules/chains/index_examples/analyze_document.html | qa_document_chain = AnalyzeDocumentChain(combine_docs_chain=qa_chain)
qa_document_chain.run(input_document=state_of_the_union, question="what did the president say about justice breyer?")
' The president thanked Justice Breyer for his service.'
C... |
05626d4a23b6-0 | https://python.langchain.com/en/latest/modules/chains/index_examples/question_answering.html |
Question Answering
Contents
Prepare Data
Quickstart
The stuff Chain
The map_reduce Chain
The refine Chain
The map-rerank Chain
Question Answering#
This notebook walks through how to use LangChain for question answering over a list of documents. It covers four different types of chains: stuff, map_reduce, ... |
05626d4a23b6-1 | https://python.langchain.com/en/latest/modules/chains/index_examples/question_answering.html | If you just want to get started as quickly as possible, this is the recommended way to do it:
chain = load_qa_chain(OpenAI(temperature=0), chain_type="stuff")
query = "What did the president say about Justice Breyer"
chain.run(input_documents=docs, question=query)
' The president said that Justice Breyer has dedicated ... |
05626d4a23b6-2 | https://python.langchain.com/en/latest/modules/chains/index_examples/question_answering.html | {'output_text': ' Il presidente ha detto che Justice Breyer ha dedicato la sua vita a servire questo paese e ha ricevuto una vasta gamma di supporto.'}
The map_reduce Chain#
This section shows results of using the map_reduce Chain to do question answering.
chain = load_qa_chain(OpenAI(temperature=0), chain_type="map_r... |
05626d4a23b6-3 | https://python.langchain.com/en/latest/modules/chains/index_examples/question_answering.html | 'output_text': ' The president said that Justice Breyer is an Army veteran, Constitutional scholar, and retiring Justice of the United States Supreme Court, and thanked him for his service.'}
Custom Prompts
You can also use your own prompts with this chain. In this example, we will respond in Italian.
question_prompt_t... |
05626d4a23b6-4 | https://python.langchain.com/en/latest/modules/chains/index_examples/question_answering.html | {'intermediate_steps': ["\nStasera vorrei onorare qualcuno che ha dedicato la sua vita a servire questo paese: il giustizia Stephen Breyer - un veterano dell'esercito, uno studioso costituzionale e un giustizia in uscita della Corte Suprema degli Stati Uniti. Giustizia Breyer, grazie per il tuo servizio.",
'\nNessun ... |
05626d4a23b6-5 | https://python.langchain.com/en/latest/modules/chains/index_examples/question_answering.html | {'output_text': '\n\nThe president said that he wanted to honor Justice Breyer for his dedication to serving the country, his legacy of excellence, and his commitment to advancing liberty and justice, as well as for his support of the Equality Act and his commitment to protecting the rights of LGBTQ+ Americans. He also... |
05626d4a23b6-6 | https://python.langchain.com/en/latest/modules/chains/index_examples/question_answering.html | '\n\nThe president said that he wanted to honor Justice Breyer for his dedication to serving the country, his legacy of excellence, and his commitment to advancing liberty and justice, as well as for his support of the Equality Act and his commitment to protecting the rights of LGBTQ+ Americans. He also praised Justice... |
05626d4a23b6-7 | https://python.langchain.com/en/latest/modules/chains/index_examples/question_answering.html | "Context information is below. \n"
"---------------------\n"
"{context_str}"
"\n---------------------\n"
"Given the context information and not prior knowledge, "
"answer the question: {question}\nYour answer should be in Italian.\n"
)
initial_qa_prompt = PromptTemplate(
input_variables=["contex... |
05626d4a23b6-8 | https://python.langchain.com/en/latest/modules/chains/index_examples/question_answering.html | "\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha reso omaggio al suo servizio e ha sostenuto la nomina di una top litigatrice in pratica privata, un ex difensore pubblico federale e una famiglia di insegnanti e agenti di polizia delle scuole pubbliche. Ha anche sottol... |
05626d4a23b6-9 | https://python.langchain.com/en/latest/modules/chains/index_examples/question_answering.html | 'output_text': "\n\nIl presidente ha detto che Justice Breyer ha dedicato la sua vita al servizio di questo paese, ha reso omaggio al suo servizio e ha sostenuto la nomina di una top litigatrice in pratica privata, un ex difensore pubblico federale e una famiglia di insegnanti e agenti di polizia delle scuole pubbliche... |
05626d4a23b6-10 | https://python.langchain.com/en/latest/modules/chains/index_examples/question_answering.html | {'answer': ' This document does not answer the question', 'score': '0'},
{'answer': ' This document does not answer the question', 'score': '0'}]
Custom Prompts
You can also use your own prompts with this chain. In this example, we will respond in Italian.
from langchain.output_parsers import RegexParser
output_parser... |
05626d4a23b6-11 | https://python.langchain.com/en/latest/modules/chains/index_examples/question_answering.html | {'answer': ' Il presidente non ha detto nulla sulla Giustizia Breyer.',
'score': '100'},
{'answer': ' Non so.', 'score': '0'},
{'answer': ' Non so.', 'score': '0'}],
'output_text': ' Il presidente ha detto che Justice Breyer ha dedicato la sua vita a servire questo paese.'}
... |
5d862a065ad4-0 | https://python.langchain.com/en/latest/modules/chains/index_examples/hyde.html |
Hypothetical Document Embeddings
Contents
Multiple generations
Using our own prompts
Using HyDE
Hypothetical Document Embeddings#
This notebook goes over how to use Hypothetical Document Embeddings (HyDE), as described in this paper.
At a high level, HyDE is an embedding technique that takes queries, gene... |
5d862a065ad4-1 | https://python.langchain.com/en/latest/modules/chains/index_examples/hyde.html | Besides using preconfigured prompts, we can also easily construct our own prompts and use those in the LLMChain that is generating the documents. This can be useful if we know the domain our queries will be in, as we can condition the prompt to generate text more similar to that.
In the example below, let’s condition i... |
5d862a065ad4-2 | https://python.langchain.com/en/latest/modules/chains/index_examples/hyde.html | In state after state, new laws have been passed, not only to suppress the vote, but to subvert entire elections.
We cannot let this happen.
Tonight. I call on the Senate to: Pass the Freedom to Vote Act. Pass the John Lewis Voting Rights Act. And while you’re at it, pass the Disclose Act so Americans can know who is ... |
b02e0946a3ee-0 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html |
Summarization
Contents
Prepare Data
Quickstart
The stuff Chain
The map_reduce Chain
The custom MapReduceChain
The refine Chain
Summarization#
This notebook walks through how to use LangChain for summarization over a list of documents. It covers three different chain types: stuff, map_reduce, and refine. F... |
b02e0946a3ee-1 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | ' In response to Russian aggression in Ukraine, the United States and its allies are taking action to hold Putin accountable, including economic sanctions, asset seizures, and military assistance. The US is also providing economic and humanitarian aid to Ukraine, and has passed the American Rescue Plan and the Bipartis... |
b02e0946a3ee-2 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | "\n\nIn questa serata, il Presidente degli Stati Uniti ha annunciato una serie di misure per affrontare la crisi in Ucraina, causata dall'aggressione di Putin. Ha anche annunciato l'invio di aiuti economici, militari e umanitari all'Ucraina. Ha anche annunciato che gli Stati Uniti e i loro alleati stanno imponendo sanz... |
b02e0946a3ee-3 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | chain({"input_documents": docs}, return_only_outputs=True)
{'map_steps': [" In response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sanctions and hold Putin accountable. The U.S. Department of Justice is also assembling a task force to go after th... |
b02e0946a3ee-4 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | PROMPT = PromptTemplate(template=prompt_template, input_variables=["text"])
chain = load_summarize_chain(OpenAI(temperature=0), chain_type="map_reduce", return_intermediate_steps=True, map_prompt=PROMPT, combine_prompt=PROMPT)
chain({"input_documents": docs}, return_only_outputs=True)
{'intermediate_steps': ["\n\nQuest... |
b02e0946a3ee-5 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | "\n\nStiamo unendo le nostre forze con quelle dei nostri alleati europei per sequestrare yacht, appartamenti di lusso e jet privati di Putin. Abbiamo chiuso lo spazio aereo americano ai voli russi e stiamo fornendo più di un miliardo di dollari in assistenza all'Ucraina. Abbiamo anche mobilitato le nostre forze terrest... |
b02e0946a3ee-6 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | "\n\nIl Presidente Biden ha lottato per passare l'American Rescue Plan per aiutare le persone che soffrivano a causa della pandemia. Il piano ha fornito sollievo economico immediato a milioni di americani, ha aiutato a mettere cibo sulla loro tavola, a mantenere un tetto sopra le loro teste e a ridurre il costo dell'as... |
b02e0946a3ee-7 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | You can also use prompts with multiple inputs. In this example, we will use a MapReduce chain to answer a specific question about our code.
from langchain.chains.combine_documents.map_reduce import MapReduceDocumentsChain
from langchain.chains.combine_documents.stuff import StuffDocumentsChain
map_template_string = """Give the... |
b02e0946a3ee-8 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | for idx in range(iter_num):
    if list[idx] > list[idx+1]:
        list[idx], list[idx+1] = list[idx+1], list[idx]
return list
##
def insertion_sort(InputList):
for i in range(1, len(InputList)):
j = i-1
nxt_element = InputList[i]
while (InputList[j] > n... |
b02e0946a3ee-9 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | "\n\nIn response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sanctions and hold Putin accountable. The U.S. Department of Justice is also assembling a task force to go after the crimes of Russian oligarchs and seize their ill-gotten gains. We are ... |
b02e0946a3ee-10 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | "\n\nIn response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sanctions and hold Putin accountable. The U.S. Department of Justice is also assembling a task force to go after the crimes of Russian oligarchs and seize their ill-gotten gains. We are ... |
b02e0946a3ee-11 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | "\n\nIn response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sanctions and hold Putin accountable. The U.S. Department of Justice is also assembling a task force to go after the crimes of Russian oligarchs and seize their ill-gotten gains. We are ... |
b02e0946a3ee-12 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | 'output_text': "\n\nIn response to Russia's aggression in Ukraine, the United States has united with other freedom-loving nations to impose economic sanctions and hold Putin accountable. The U.S. Department of Justice is also assembling a task force to go after the crimes of Russian oligarchs and seize their ill-gotten... |
b02e0946a3ee-13 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | "------------\n"
"{text}\n"
"------------\n"
"Given the new context, refine the original summary in Italian"
"If the context isn't useful, return the original summary."
)
refine_prompt = PromptTemplate(
input_variables=["existing_answer", "text"],
template=refine_template,
)
chain = load_summari... |
b02e0946a3ee-14 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | "\n\nQuesta sera, ci incontriamo come democratici, repubblicani e indipendenti, ma soprattutto come americani. La Russia di Putin ha cercato di scuotere le fondamenta del mondo libero, ma ha sottovalutato la forza della gente ucraina. Insieme ai nostri alleati, stiamo imponendo sanzioni economiche, tagliando l'accesso ... |
b02e0946a3ee-15 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | "\n\nQuesta sera, ci incontriamo come democratici, repubblicani e indipendenti, ma soprattutto come americani. La Russia di Putin ha cercato di scuotere le fondamenta del mondo libero, ma ha sottovalutato la forza della gente ucraina. Insieme ai nostri alleati, stiamo imponendo sanzioni economiche, tagliando l'accesso ... |
b02e0946a3ee-16 | https://python.langchain.com/en/latest/modules/chains/index_examples/summarize.html | 'output_text': "\n\nQuesta sera, ci incontriamo come democratici, repubblicani e indipendenti, ma soprattutto come americani. La Russia di Putin ha cercato di scuotere le fondamenta del mondo libero, ma ha sottovalutato la forza della gente ucraina. Insieme ai nostri alleati, stiamo imponendo sanzioni economiche, tagli... |
28b23f534c36-0 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_qa.html |
Retrieval Question/Answering
Contents
Chain Type
Custom Prompts
Return Source Documents
Retrieval Question/Answering#
This example showcases question answering over an index.
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.text_splitter imp... |
28b23f534c36-1 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_qa.html | There are two ways to load different chain types. First, you can specify the chain type argument in the from_chain_type method. This allows you to pass in the name of the chain type you want to use. For example, in the example below we change the chain type to map_reduce.
qa = RetrievalQA.from_chain_type(llm=OpenAI(), chain_ty... |
qa = RetrievalQA.from_chain_type(llm=OpenAI(), chain_ty... |
28b23f534c36-2 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_qa.html | " The president said that Ketanji Brown Jackson is one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, and from a family of public school educators and police officers. He also said that she is a consensus builder and has received a broad range of support f... |
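The "stuff" chain type that RetrievalQA uses by default can be sketched without LangChain: retrieve the top-k documents for a query and stuff them all into one prompt. The keyword retriever below is a toy stand-in for a real vector store, and the function names are illustrative assumptions, not library APIs.

```python
def keyword_retriever(corpus):
    """Toy retriever: rank documents by word overlap with the query.
    A real RetrievalQA setup would use embedding similarity instead."""
    def retrieve(query, k):
        words = set(query.lower().split())
        scored = sorted(corpus, key=lambda d: -len(words & set(d.lower().split())))
        return scored[:k]
    return retrieve

def retrieval_qa(llm, retrieve, query, k=2):
    """Sketch of the "stuff" chain type: concatenate the retrieved
    documents into a single prompt and ask the model once."""
    context = "\n\n".join(retrieve(query, k))
    prompt = f"Use the context to answer.\n\nContext:\n{context}\n\nQuestion: {query}\nAnswer:"
    return llm(prompt)

corpus = ["the cat sat on the mat", "dogs bark loudly", "a cat chased a dog"]
retrieve = keyword_retriever(corpus)
top = retrieve("where did the cat sit", 1)
```

Here `top` is `["the cat sat on the mat"]`: it shares two words with the query, more than any other document. The `map_reduce` chain type differs only in that it summarizes each retrieved document separately before combining the results.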
28b23f534c36-3 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_qa.html | qa = RetrievalQA.from_chain_type(llm=OpenAI(), chain_type="stuff", retriever=docsearch.as_retriever(), return_source_documents=True)
query = "What did the president say about Ketanji Brown Jackson"
result = qa({"query": query})
result["result"]
" The president said that Ketanji Brown Jackson is one of the nation's top ... |
28b23f534c36-4 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_qa.html | Document(page_content='A former top litigator in private practice. A former federal public defender. And from a family of public school educators and police officers. A consensus builder. Since she’s been nominated, she’s received a broad range of support—from the Fraternal Order of Police to former judges appointed by... |
28b23f534c36-5 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_qa.html | Document(page_content='And for our LGBTQ+ Americans, let’s finally get the bipartisan Equality Act to my desk. The onslaught of state laws targeting transgender Americans and their families is wrong. \n\nAs I said last year, especially to our younger transgender Americans, I will always have your back as your President... |
28b23f534c36-6 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_qa.html | Document(page_content='Tonight, I’m announcing a crackdown on these companies overcharging American businesses and consumers. \n\nAnd as Wall Street firms take over more nursing homes, quality in those homes has gone down and costs have gone up. \n\nThat ends on my watch. \n\nMedicare is going to set higher standards ... |
aa9deb6dcd10-0 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_text_generation.html | .ipynb
.pdf
Vector DB Text Generation
Contents
Prepare Data
Set Up Vector DB
Set Up LLM Chain with Custom Prompt
Generate Text
Vector DB Text Generation#
This notebook walks through how to use LangChain for text generation over a vector index. This is useful if we want to generate text that is able to draw from a lar... |
aa9deb6dcd10-1 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_text_generation.html | github_url = f"https://github.com/{repo_owner}/{repo_name}/blob/{git_sha}/{relative_path}"
yield Document(page_content=f.read(), metadata={"source": github_url})
sources = get_github_docs("yirenlu92", "deno-manual-forked")
source_chunks = []
splitter = CharacterTextSplitter(separator=" ", chunk_size=102... |
aa9deb6dcd10-2 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_text_generation.html | Finally, we write a function to apply our inputs to the chain. The function takes an input parameter topic. We find the documents in the vector index that correspond to that topic, and use them as additional context in our simple LLM chain.
def generate_blog_post(topic):
docs = search_index.similarity_search(topic,... |
aa9deb6dcd10-3 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_text_generation.html | [{'text': '\n\nEnvironment variables are a great way to store and access sensitive information in your Deno applications. Deno offers built-in support for environment variables with `Deno.env`, and you can also use a `.env` file to store and access environment variables.\n\nUsing `Deno.env` is simple. It has getter and... |
aa9deb6dcd10-4 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_text_generation.html | the command. We can then access this variable in our code using the `Deno.env.get()` function. For example, if we ran the following command:\n\n```\nVAR=hello && deno eval "console.log(\'Deno: \' + Deno.env.get(\'VAR'}, {'text': '\n\nEnvironment variables are a powerful tool for developers, allowing them to store and a... |
aa9deb6dcd10-5 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_text_generation.html | can be used by programs. They are typically used to store configuration information, such as the location of a database or the name of a user. In Deno, environment variables are stored in the `Deno.env` object. This object is similar to the `process.env` object in Node.js, and it allows you to access and set environmen... |
aa9deb6dcd10-6 | https://python.langchain.com/en/latest/modules/chains/index_examples/vector_db_text_generation.html | previous
Retrieval Question Answering with Sources
next
API Chains
Contents
Prepare Data
Set Up Vector DB
Set Up LLM Chain with Custom Prompt
Generate Text
By Harrison Chase
© Copyright 2023, Harrison Chase.
Last updated on Jun 04, 2023. |
265f7c50c38e-0 | https://python.langchain.com/en/latest/modules/chains/examples/flare.html | .ipynb
.pdf
FLARE
Contents
Imports
Retriever
FLARE Chain
FLARE#
This notebook is an implementation of Forward-Looking Active REtrieval augmented generation (FLARE).
Please see the original repo here.
The basic idea is:
Start answering a question
If you start generating tokens the model is uncertain about, look up rel... |
265f7c50c38e-1 | https://python.langchain.com/en/latest/modules/chains/examples/flare.html | Imports#
import os
os.environ["SERPER_API_KEY"] = ""
import re
import numpy as np
from langchain.schema import BaseRetriever
from langchain.utilities import GoogleSerperAPIWrapper
from langchain.embeddings import OpenAIEmbeddings
from langchain.chat_models import ChatOpenAI
from langchain.llms import OpenAI
from langch... |
265f7c50c38e-2 | https://python.langchain.com/en/latest/modules/chains/examples/flare.html | Given a user input and an existing partial response as context, ask a question to which the answer is the given term/entity/phrase:
>>> USER INPUT: explain in great detail the difference between the langchain framework and baby agi
>>> EXISTING PARTIAL RESPONSE:
The Langchain Framework is a decentralized platform for... |
265f7c50c38e-3 | https://python.langchain.com/en/latest/modules/chains/examples/flare.html | Baby AGI, on the other hand, is an artificial general intelligence (AGI) platform. It uses a combination of deep learning and reinforcement learning to create an AI system that can learn and adapt to new tasks. Baby AGI is designed to be a general-purpose AI system that can be used for a variety of applications, includ... |
265f7c50c38e-4 | https://python.langchain.com/en/latest/modules/chains/examples/flare.html | >>> USER INPUT: explain in great detail the difference between the langchain framework and baby agi
>>> EXISTING PARTIAL RESPONSE:
The Langchain Framework is a decentralized platform for natural language processing (NLP) applications. It uses a blockchain-based distributed ledger to store and process data, allowing f... |
265f7c50c38e-5 | https://python.langchain.com/en/latest/modules/chains/examples/flare.html | Baby AGI, on the other hand, is an artificial general intelligence (AGI) platform. It uses a combination of deep learning and reinforcement learning to create an AI system that can learn and adapt to new tasks. Baby AGI is designed to be a general-purpose AI system that can be used for a variety of applications, includ... |
265f7c50c38e-6 | https://python.langchain.com/en/latest/modules/chains/examples/flare.html | >>> USER INPUT: explain in great detail the difference between the langchain framework and baby agi
>>> EXISTING PARTIAL RESPONSE:
The Langchain Framework is a decentralized platform for natural language processing (NLP) applications. It uses a blockchain-based distributed ledger to store and process data, allowing f... |
265f7c50c38e-7 | https://python.langchain.com/en/latest/modules/chains/examples/flare.html | Baby AGI, on the other hand, is an artificial general intelligence (AGI) platform. It uses a combination of deep learning and reinforcement learning to create an AI system that can learn and adapt to new tasks. Baby AGI is designed to be a general-purpose AI system that can be used for a variety of applications, includ... |
265f7c50c38e-8 | https://python.langchain.com/en/latest/modules/chains/examples/flare.html | >>> CONTEXT: LangChain: Software. LangChain is a software development framework designed to simplify the creation of applications using large language models. LangChain Initial release date: October 2022. LangChain Programming languages: Python and JavaScript. LangChain Developer(s): Harrison Chase. LangChain License: ... |
265f7c50c38e-9 | https://python.langchain.com/en/latest/modules/chains/examples/flare.html | LangChain is a framework for including AI from large language models inside data pipelines and applications. This tutorial provides an overview of what you ... Missing: secure | Must include:secure. Blockchain is the best way to secure the data of the shared community. Utilizing the capabilities of the blockchain nobod... |
265f7c50c38e-10 | https://python.langchain.com/en/latest/modules/chains/examples/flare.html | LangChain is a framework for including AI from large language models inside data pipelines and applications. This tutorial provides an overview of what you ... LangChain is an intuitive framework created to assist in developing applications driven by a language model, such as OpenAI or Hugging Face. This documentation ... |
265f7c50c38e-11 | https://python.langchain.com/en/latest/modules/chains/examples/flare.html | Blockchain is one type of a distributed ledger. Distributed ledgers use independent computers (referred to as nodes) to record, share and ... Missing: Langchain | Must include:Langchain. Blockchain is used in distributed storage software where huge data is broken down into chunks. This is available in encrypted data ac... |