# Interface | 🦜️🔗 Langchain
In an effort to make it as easy as possible to create custom chains, we've implemented a "Runnable" protocol that most components implement. This is a standard interface with a few different methods, which makes it easy to define custom chains as well as making it possible to invoke them in a standard way. The standard interface exposed includes:
- `stream`: stream back chunks of the response
- `invoke`: call the chain on an input
- `batch`: call the chain on a list of inputs

These also have corresponding async methods:

- `astream`: stream back chunks of the response async
- `ainvoke`: call the chain on an input async
- `abatch`: call the chain on a list of inputs async
- `astream_log`: stream back intermediate steps as they happen, in addition to the final response

The type of the input varies by component:

| Component | Input Type |
| --- | --- |
| Prompt | Dictionary |
| Retriever | Single string |
| LLM, ChatModel | Single string, list of chat messages or a PromptValue |
| Tool | Single string or dictionary, depending on the tool |
| OutputParser | The output of an LLM or ChatModel |

The output type also varies by component:

| Component | Output Type |
| --- | --- |
| LLM | String |
| ChatModel | ChatMessage |
| Prompt | PromptValue |
| Retriever | List of documents |
| Tool | Depends on the tool |
| OutputParser | Depends on the parser |

All runnables expose properties to inspect the input and output types:

- `input_schema`: an input Pydantic model auto-generated from the structure of the Runnable
- `output_schema`: an output Pydantic model auto-generated from the structure of the Runnable
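To make the shape of the protocol concrete before diving into the real chain below, here is a toy pure-Python sketch — a hypothetical `MiniRunnable`, not the actual LangChain class — in which `batch` and `stream` get default implementations in terms of `invoke`, and `|` composes two runnables into a sequence:

```python
from typing import Any, Iterator, List


class MiniRunnable:
    """Toy stand-in for the Runnable protocol (not the real LangChain class)."""

    def invoke(self, input: Any) -> Any:
        raise NotImplementedError

    def batch(self, inputs: List[Any]) -> List[Any]:
        # Default: invoke each input in turn.
        return [self.invoke(i) for i in inputs]

    def stream(self, input: Any) -> Iterator[Any]:
        # Default: a single chunk containing the full result.
        yield self.invoke(input)

    def __or__(self, other: "MiniRunnable") -> "MiniRunnable":
        # `a | b` pipes a's output into b, like LCEL's pipe operator.
        first, second = self, other

        class _Seq(MiniRunnable):
            def invoke(self, input: Any) -> Any:
                return second.invoke(first.invoke(input))

        return _Seq()


class Upper(MiniRunnable):
    def invoke(self, input: str) -> str:
        return input.upper()


class Exclaim(MiniRunnable):
    def invoke(self, input: str) -> str:
        return input + "!"


chain = Upper() | Exclaim()
print(chain.invoke("hi"))       # -> HI!
print(chain.batch(["a", "b"]))  # -> ['A!', 'B!']
```

Because every component speaks the same interface, the composed chain automatically supports `invoke`, `batch`, and `stream` — which is the point of the protocol.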
Let's take a look at these methods! To do so, we'll create a super simple PromptTemplate + ChatModel chain.

```python
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatOpenAI

model = ChatOpenAI()
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
chain = prompt | model
```

## Input Schema

A description of the inputs accepted by a Runnable.
This is a Pydantic model dynamically generated from the structure of any Runnable. You can call `.schema()` on it to obtain a JSONSchema representation.

```python
# The input schema of the chain is the input schema of its first part, the prompt.
chain.input_schema.schema()
```

```
{'title': 'PromptInput',
 'type': 'object',
 'properties': {'topic': {'title': 'Topic', 'type': 'string'}}}
```
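As an illustration of where that schema comes from, the shape above can be derived from the placeholders in the prompt template. A simplified pure-Python sketch — a hypothetical helper, not LangChain's actual Pydantic machinery:

```python
import re
from typing import Dict


def input_json_schema(template: str) -> Dict:
    """Derive a JSONSchema-style dict from the {placeholders} in a template
    (a simplified stand-in for what chain.input_schema.schema() returns)."""
    variables = re.findall(r"{(\w+)}", template)
    return {
        "title": "PromptInput",
        "type": "object",
        "properties": {
            v: {"title": v.capitalize(), "type": "string"} for v in variables
        },
    }


schema = input_json_schema("tell me a joke about {topic}")
# -> {'title': 'PromptInput', 'type': 'object',
#     'properties': {'topic': {'title': 'Topic', 'type': 'string'}}}
```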
## Output Schema

A description of the outputs produced by a Runnable. This is a Pydantic model dynamically generated from the structure of any Runnable. You can call `.schema()` on it to obtain a JSONSchema representation.

```python
# The output schema of the chain is the output schema of its last part,
# in this case a ChatModel, which outputs a ChatMessage.
chain.output_schema.schema()
```

```
{'title': 'ChatOpenAIOutput',
 'anyOf': [{'$ref': '#/definitions/HumanMessageChunk'},
  {'$ref': '#/definitions/AIMessageChunk'},
  {'$ref': '#/definitions/ChatMessageChunk'},
  {'$ref': '#/definitions/FunctionMessageChunk'},
  {'$ref': '#/definitions/SystemMessageChunk'}],
 'definitions': {'HumanMessageChunk': {'title': 'HumanMessageChunk',
   'description': 'A Human Message chunk.',
   'type': 'object',
   'properties': {'content': {'title': 'Content', 'type': 'string'},
    'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
    'type': {'title': 'Type', 'default': 'human', 'enum': ['human'], 'type': 'string'},
    'example': {'title': 'Example', 'default': False, 'type': 'boolean'},
    'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
   'required': ['content']},
  'AIMessageChunk': {'title': 'AIMessageChunk',
   'description': 'A Message chunk from an AI.',
   'type': 'object',
   'properties': {'content': {'title': 'Content', 'type': 'string'},
    'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
    'type': {'title': 'Type', 'default': 'ai', 'enum': ['ai'], 'type': 'string'},
    'example': {'title': 'Example', 'default': False, 'type': 'boolean'},
    'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
   'required': ['content']},
  'ChatMessageChunk': {'title': 'ChatMessageChunk',
   'description': 'A Chat Message chunk.',
   'type': 'object',
   'properties': {'content': {'title': 'Content', 'type': 'string'},
    'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
    'type': {'title': 'Type', 'default': 'chat', 'enum': ['chat'], 'type': 'string'},
    'role': {'title': 'Role', 'type': 'string'},
    'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
   'required': ['content', 'role']},
  'FunctionMessageChunk': {'title': 'FunctionMessageChunk',
   'description': 'A Function Message chunk.',
   'type': 'object',
   'properties': {'content': {'title': 'Content', 'type': 'string'},
    'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
    'type': {'title': 'Type', 'default': 'function', 'enum': ['function'], 'type': 'string'},
    'name': {'title': 'Name', 'type': 'string'},
    'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
   'required': ['content', 'name']},
  'SystemMessageChunk': {'title': 'SystemMessageChunk',
   'description': 'A System Message chunk.',
   'type': 'object',
   'properties': {'content': {'title': 'Content', 'type': 'string'},
    'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
    'type': {'title': 'Type', 'default': 'system', 'enum': ['system'], 'type': 'string'},
    'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
   'required': ['content']}}}
```

## Stream

```python
for s in chain.stream({"topic": "bears"}):
    print(s.content, end="", flush=True)
```

```
Why don't bears wear shoes?

Because they have bear feet!
```

## Invoke

```python
chain.invoke({"topic": "bears"})
```

```
AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!")
```

## Batch

```python
chain.batch([{"topic": "bears"}, {"topic": "cats"}])
```

```
[AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!"),
 AIMessage(content="Why don't cats play poker in the wild?\n\nToo many cheetahs!")]
```
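`batch` runs its inputs in parallel, up to a concurrency limit. As a rough pure-Python analogy — not LangChain's internals — bounded batching behaves like a fixed-size worker pool that preserves input order:

```python
from concurrent.futures import ThreadPoolExecutor
from typing import Any, Callable, List


def batch_with_concurrency(
    fn: Callable[[Any], Any], inputs: List[Any], max_concurrency: int = 5
) -> List[Any]:
    # At most max_concurrency calls are in flight at once; results come back
    # in input order, mirroring batch(..., config={"max_concurrency": n}).
    with ThreadPoolExecutor(max_workers=max_concurrency) as pool:
        return list(pool.map(fn, inputs))


results = batch_with_concurrency(lambda t: f"joke about {t}", ["bears", "cats"])
# -> ['joke about bears', 'joke about cats']
```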
You can set the number of concurrent requests by using the `max_concurrency` parameter:

```python
chain.batch([{"topic": "bears"}, {"topic": "cats"}], config={"max_concurrency": 5})
```

```
[AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!"),
 AIMessage(content="Sure, here's a cat joke for you:\n\nWhy don't cats play poker in the wild?\n\nToo many cheetahs!")]
```

## Async Stream

```python
async for s in chain.astream({"topic": "bears"}):
    print(s.content, end="", flush=True)
```

```
Sure, here's a bear joke for you:

Why don't bears wear shoes?

Because they have bear feet!
```

## Async Invoke

```python
await chain.ainvoke({"topic": "bears"})
```

```
AIMessage(content="Why don't bears wear shoes? \n\nBecause they have bear feet!")
```

## Async Batch

```python
await chain.abatch([{"topic": "bears"}])
```

```
[AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!")]
```

## Async Stream Intermediate Steps

All runnables also have a method `.astream_log()` which can be used to stream (as they happen) all or part of the intermediate steps of your chain/sequence. This is useful e.g. to show progress to the user, to use intermediate results, or even just to debug your chain. You can choose to stream all steps (default), or include/exclude steps by name, tags or metadata. This method yields JSONPatch ops that, when applied in the same order as received, build up the RunState.

```python
class LogEntry(TypedDict):
    id: str
    """ID of the sub-run."""
    name: str
    """Name of the object being run."""
    type: str
    """Type of the object being run, e.g. prompt, chain, llm, etc."""
    tags: List[str]
    """List of tags for the run."""
    metadata: Dict[str, Any]
    """Key-value pairs of metadata for the run."""
    start_time: str
    """ISO-8601 timestamp of when the run started."""
    streamed_output_str: List[str]
    """List of LLM tokens streamed by this run, if applicable."""
    final_output: Optional[Any]
    """Final output of this run.
    Only available after the run has finished successfully."""
    end_time: Optional[str]
    """ISO-8601 timestamp of when the run ended.
    Only available after the run has finished."""
```
```python
class RunState(TypedDict):
    id: str
    """ID of the run."""
    streamed_output: List[Any]
    """List of output chunks streamed by Runnable.stream()"""
    final_output: Optional[Any]
    """Final output of the run, usually the result of aggregating (`+`) streamed_output.
    Only available after the run has finished successfully."""
    logs: Dict[str, LogEntry]
    """Map of run names to sub-runs. If filters were supplied, this list will
    contain only the runs that matched the filters."""
```

### Streaming JSONPatch chunks

This is useful e.g. to stream the JSONPatch in an HTTP server, and then apply the ops on the client to rebuild the run state there. See LangServe for tooling to make it easier to build a webserver from any Runnable.

```python
from langchain.embeddings import OpenAIEmbeddings
from langchain.schema.output_parser import StrOutputParser
from langchain.schema.runnable import RunnablePassthrough
from langchain.vectorstores import FAISS

template = """Answer the question based only on the following context:
{context}

Question: {question}
"""
prompt = ChatPromptTemplate.from_template(template)
vectorstore = FAISS.from_texts(
    ["harrison worked at kensho"], embedding=OpenAIEmbeddings()
)
retriever = vectorstore.as_retriever()

retrieval_chain = (
    {
        "context": retriever.with_config(run_name="Docs"),
        "question": RunnablePassthrough(),
    }
    | prompt
    | model
    | StrOutputParser()
)

async for chunk in retrieval_chain.astream_log(
    "where did harrison work?", include_names=["Docs"]
):
    print(chunk)
```
'logs': {}, 'streamed_output': []}}) RunLogPatch({'op': 'add', 'path': '/logs/Docs', 'value': {'end_time': None, 'final_output': None, 'id': '8c998257-1ec8-4546-b744-c3fdb9728c41', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:35.668', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}) RunLogPatch({'op': 'add', 'path': '/logs/Docs/final_output', 'value': {'documents': [Document(page_content='harrison worked at kensho')]}}, {'op': 'add', 'path': '/logs/Docs/end_time', 'value': '2023-10-05T12:52:36.033'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ''}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'H'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'arrison'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' worked'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' at'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' Kens'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'ho'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': '.'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ''}) RunLogPatch({'op': 'replace', 'path': '/final_output', 'value': {'output': 'Harrison worked at Kensho.'}})Streaming the incremental RunState‚ÄãYou can simply pass diff=False to get incremental values of RunState.async for chunk in retrieval_chain.astream_log("where did harrison work?", include_names=['Docs'], diff=False): print(chunk) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {}, 'streamed_output': []}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
In an effort to make it as easy as possible to create custom chains, we've implemented a "Runnable" protocol that most components implement. This is a standard interface with a few different methods, which makes it easy to define custom chains as well as making it possible to invoke them in a standard way. The standard interface exposed includes:
In an effort to make it as easy as possible to create custom chains, we've implemented a "Runnable" protocol that most components implement. This is a standard interface with a few different methods, which makes it easy to define custom chains as well as making it possible to invoke them in a standard way. The standard interface exposed includes: ->: 'logs': {}, 'streamed_output': []}}) RunLogPatch({'op': 'add', 'path': '/logs/Docs', 'value': {'end_time': None, 'final_output': None, 'id': '8c998257-1ec8-4546-b744-c3fdb9728c41', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:35.668', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}) RunLogPatch({'op': 'add', 'path': '/logs/Docs/final_output', 'value': {'documents': [Document(page_content='harrison worked at kensho')]}}, {'op': 'add', 'path': '/logs/Docs/end_time', 'value': '2023-10-05T12:52:36.033'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ''}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'H'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'arrison'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' worked'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' at'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' Kens'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'ho'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': '.'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ''}) RunLogPatch({'op': 'replace', 'path': '/final_output', 'value': {'output': 'Harrison worked at Kensho.'}})Streaming the incremental RunState‚ÄãYou can simply pass diff=False to get incremental values of RunState.async for chunk in retrieval_chain.astream_log("where did harrison work?", include_names=['Docs'], diff=False): print(chunk) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {}, 
'streamed_output': []}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
4,609
'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': None, 'final_output': None, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': []}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': []}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output':
In an effort to make it as easy as possible to create custom chains, we've implemented a "Runnable" protocol that most components implement. This is a standard interface with a few different methods, which makes it easy to define custom chains as well as making it possible to invoke them in a standard way. The standard interface exposed includes:
In an effort to make it as easy as possible to create custom chains, we've implemented a "Runnable" protocol that most components implement. This is a standard interface with a few different methods, which makes it easy to define custom chains as well as making it possible to invoke them in a standard way. The standard interface exposed includes: ->: 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': None, 'final_output': None, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': []}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': []}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output':
'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217',
'2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho']}) RunLog({'final_output': None, 'id':
RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho', '.']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho', '.', '']}) RunLog({'final_output': {'output': 'Harrison worked at Kensho.'}, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {},
'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho', '.', '']})

Parallelism

Let's take a look at how LangChain Expression Language supports parallel requests as much as possible. For example, when using a RunnableParallel (often written as a dictionary), it executes each element in parallel.

```python
from langchain.schema.runnable import RunnableParallel

chain1 = ChatPromptTemplate.from_template("tell me a joke about {topic}") | model
chain2 = ChatPromptTemplate.from_template("write a short (2 line) poem about {topic}") | model
combined = RunnableParallel(joke=chain1, poem=chain2)
```

```python
chain1.invoke({"topic": "bears"})
```

    CPU times: user 31.7 ms, sys: 8.59 ms, total: 40.3 ms
    Wall time: 1.05 s
    AIMessage(content="Why don't bears like fast food?\n\nBecause they can't catch it!", additional_kwargs={}, example=False)

```python
chain2.invoke({"topic": "bears"})
```

    CPU times: user 42.9 ms, sys: 10.2 ms, total: 53 ms
    Wall time: 1.93 s
    AIMessage(content="In forest's embrace, bears roam free,\nSilent strength, nature's majesty.", additional_kwargs={}, example=False)

```python
combined.invoke({"topic": "bears"})
```

    CPU times: user 96.3 ms, sys: 20.4 ms, total: 117 ms
    Wall time: 1.1 s
    {'joke': AIMessage(content="Why don't bears wear socks?\n\nBecause they have bear feet!", additional_kwargs={}, example=False),
     'poem': AIMessage(content="In forest's embrace,\nMajestic bears leave their trace.", additional_kwargs={}, example=False)}
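The timings above show why parallelism matters: the combined call takes roughly as long as its slowest branch, not the sum of both. The pattern can be sketched in plain Python without any API calls; `parallel_invoke` and the stand-in branches below are hypothetical illustrations of what RunnableParallel does, not LangChain's implementation.

```python
# Conceptual sketch: run independent branches concurrently so total wall time
# tracks the slowest branch rather than the sum of all branches.
from concurrent.futures import ThreadPoolExecutor
import time

def parallel_invoke(branches, inp):
    """Invoke each named branch on the same input concurrently; return a dict."""
    with ThreadPoolExecutor(max_workers=len(branches)) as pool:
        futures = {name: pool.submit(fn, inp) for name, fn in branches.items()}
        return {name: fut.result() for name, fut in futures.items()}

# Two stand-in "chains" that each take ~0.2 s (simulating model latency).
def joke(topic):
    time.sleep(0.2)
    return f"joke about {topic}"

def poem(topic):
    time.sleep(0.2)
    return f"poem about {topic}"

start = time.perf_counter()
combined = parallel_invoke({"joke": joke, "poem": poem}, "bears")
elapsed = time.perf_counter() - start

print(combined)  # {'joke': 'joke about bears', 'poem': 'poem about bears'}
print(elapsed)   # close to one branch's 0.2 s latency, not 0.4 s
```

The same idea explains why batch also accepts a max_concurrency setting: independent work is fanned out to workers instead of run sequentially.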
Copyright © 2023 LangChain, Inc.
Interface | 🦜️🔗 Langchain
Interface

In an effort to make it as easy as possible to create custom chains, we've implemented a "Runnable" protocol that most components implement. This is a standard interface with a few different methods, which makes it easy to define custom chains as well as making it possible to invoke them in a standard way. The standard interface exposed includes:

- stream: stream back chunks of the response
- invoke: call the chain on an input
- batch: call the chain on a list of inputs

These also have corresponding async methods:

- astream: stream back chunks of the response async
- ainvoke: call the chain on an input async
- abatch: call the chain on a list of inputs async
- astream_log: stream back intermediate steps as they happen, in addition to the final response

The type of the input varies by component:

| Component | Input Type |
| --- | --- |
| Prompt | Dictionary |
| Retriever | Single string |
| LLM, ChatModel | Single string, list of chat messages or a PromptValue |
| Tool | Single string or dictionary, depending on the tool |
| OutputParser | The output of an LLM or ChatModel |

The output type also varies by component:

| Component | Output Type |
| --- | --- |
| LLM | String |
| ChatModel | ChatMessage |
| Prompt | PromptValue |
| Retriever | List of documents |
| Tool | Depends on the tool |
| OutputParser | Depends on the parser |

All runnables expose properties to inspect the input and output types:

- input_schema: an input Pydantic model auto-generated from the structure of the Runnable
- output_schema: an output Pydantic model auto-generated from the structure of the Runnable

Let's take a look at these methods! To do so, we'll create a super simple PromptTemplate + ChatModel chain.
```python
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatOpenAI

model = ChatOpenAI()
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
chain = prompt | model
```

Input Schema

A description of the inputs accepted by a Runnable.
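To make the pipe composition concrete without calling an API, here is a toy sketch of the Runnable protocol's surface in plain Python. MiniRunnable and the lambdas below are hypothetical illustrations, not LangChain classes: one object exposes invoke/batch/stream, and `|` chains two runnables so the output of the first feeds the second.

```python
# A minimal, purely illustrative "Runnable": invoke one input, batch a list,
# stream chunks, and compose with `|`.
class MiniRunnable:
    def __init__(self, fn):
        self.fn = fn

    def invoke(self, inp):
        return self.fn(inp)

    def batch(self, inputs):
        # A real implementation would parallelize; this sketch is sequential.
        return [self.invoke(i) for i in inputs]

    def stream(self, inp):
        # Naive streaming: yield the final result in word-sized chunks.
        for chunk in str(self.invoke(inp)).split():
            yield chunk

    def __or__(self, other):
        # prompt | model: feed this runnable's output into the next one.
        return MiniRunnable(lambda inp: other.invoke(self.invoke(inp)))

prompt = MiniRunnable(lambda d: f"tell me a joke about {d['topic']}")
model = MiniRunnable(lambda s: s.upper())  # stand-in "model"
chain = prompt | model

print(chain.invoke({"topic": "bears"}))  # TELL ME A JOKE ABOUT BEARS
print(chain.batch([{"topic": "bears"}, {"topic": "cats"}]))
print(list(chain.stream({"topic": "bears"})))
```

Note how the chain's input type is the first component's input (a dictionary) and its output type is the last component's output, exactly as in the tables above.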
This is a Pydantic model dynamically generated from the structure of any Runnable. You can call .schema() on it to obtain a JSONSchema representation.

```python
# The input schema of the chain is the input schema of its first part, the prompt.
chain.input_schema.schema()
```

    {'title': 'PromptInput',
     'type': 'object',
     'properties': {'topic': {'title': 'Topic', 'type': 'string'}}}

Output Schema

A description of the outputs produced by a Runnable. This is a Pydantic model dynamically generated from the structure of any Runnable.
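As an illustration of what the PromptInput schema asserts about chain inputs, here is a toy conformance check against that exact JSONSchema. In practice you would use Pydantic or a JSONSchema validator; `conforms` is a hypothetical helper covering only the object/string cases that appear in the schema above.

```python
# The JSONSchema shown above for the chain's input.
schema = {
    "title": "PromptInput",
    "type": "object",
    "properties": {"topic": {"title": "Topic", "type": "string"}},
}

def conforms(value, schema):
    """Toy check handling only the 'object' and 'string' cases used above."""
    if schema["type"] == "object":
        return isinstance(value, dict) and all(
            conforms(value[k], sub)
            for k, sub in schema.get("properties", {}).items()
            if k in value
        )
    if schema["type"] == "string":
        return isinstance(value, str)
    return True  # other types not needed for this sketch

print(conforms({"topic": "bears"}, schema))  # True
print(conforms({"topic": 42}, schema))       # False: topic must be a string
```

This is the kind of validation the auto-generated Pydantic model performs for you when a chain is invoked.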
You can call .schema() on it to obtain a JSONSchema representation.

```python
# The output schema of the chain is the output schema of its last part,
# in this case a ChatModel, which outputs a ChatMessage.
chain.output_schema.schema()
```

    {'title': 'ChatOpenAIOutput', 'anyOf': [{'$ref': '#/definitions/HumanMessageChunk'}, {'$ref': '#/definitions/AIMessageChunk'}, {'$ref': '#/definitions/ChatMessageChunk'}, {'$ref': '#/definitions/FunctionMessageChunk'}, {'$ref': '#/definitions/SystemMessageChunk'}], 'definitions': {'HumanMessageChunk': {'title': 'HumanMessageChunk', 'description': 'A Human Message chunk.', 'type': 'object', 'properties': {'content': {'title': 'Content', 'type': 'string'}, 'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'}, 'type': {'title': 'Type', 'default': 'human', 'enum': ['human'], 'type': 'string'}, 'example': {'title': 'Example', 'default': False, 'type': 'boolean'}, 'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}}, 'required': ['content']}, 'AIMessageChunk': {'title': 'AIMessageChunk', 'description': 'A Message chunk from an AI.', 'type': 'object', 'properties': {'content': {'title': 'Content', 'type': 'string'}, 'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'}, 'type': {'title': 'Type', 'default': 'ai', 'enum': ['ai'], 'type': 'string'}, 'example': {'title': 'Example', 'default': False, 'type': 'boolean'}, 'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}}, 'required': ['content']}, 'ChatMessageChunk': {'title': 'ChatMessageChunk', 'description': 'A Chat Message chunk.', 'type': 'object', 'properties': {'content': {'title': 'Content', 'type': 'string'}, 'additional_kwargs': {'title':
'string'}, 'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'}, 'type': {'title': 'Type', 'default': 'chat', 'enum': ['chat'], 'type': 'string'}, 'role': {'title': 'Role', 'type': 'string'}, 'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}}, 'required': ['content', 'role']}, 'FunctionMessageChunk': {'title': 'FunctionMessageChunk', 'description': 'A Function Message chunk.', 'type': 'object', 'properties': {'content': {'title': 'Content', 'type': 'string'}, 'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'}, 'type': {'title': 'Type', 'default': 'function', 'enum': ['function'], 'type': 'string'}, 'name': {'title': 'Name', 'type': 'string'}, 'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}}, 'required': ['content', 'name']}, 'SystemMessageChunk': {'title': 'SystemMessageChunk', 'description': 'A System Message chunk.', 'type': 'object', 'properties': {'content': {'title': 'Content', 'type': 'string'}, 'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'}, 'type': {'title': 'Type', 'default': 'system', 'enum': ['system'], 'type': 'string'}, 'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}}, 'required': ['content']}}}

Stream

```python
for s in chain.stream({"topic": "bears"}):
    print(s.content, end="", flush=True)
```

    Why don't bears wear shoes?
    Because they have bear feet!

Invoke

```python
chain.invoke({"topic": "bears"})
```

    AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!")

Batch

```python
chain.batch([{"topic": "bears"}, {"topic": "cats"}])
```

    [AIMessage(content="Why don't bears wear shoes?\n\nBecause they have
In an effort to make it as easy as possible to create custom chains, we've implemented a "Runnable" protocol that most components implement. This is a standard interface with a few different methods, which makes it easy to define custom chains as well as making it possible to invoke them in a standard way. The standard interface exposed includes:
In an effort to make it as easy as possible to create custom chains, we've implemented a "Runnable" protocol that most components implement. This is a standard interface with a few different methods, which makes it easy to define custom chains as well as making it possible to invoke them in a standard way. The standard interface exposed includes: ->: 'string'}, 'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'}, 'type': {'title': 'Type', 'default': 'chat', 'enum': ['chat'], 'type': 'string'}, 'role': {'title': 'Role', 'type': 'string'}, 'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}}, 'required': ['content', 'role']}, 'FunctionMessageChunk': {'title': 'FunctionMessageChunk', 'description': 'A Function Message chunk.', 'type': 'object', 'properties': {'content': {'title': 'Content', 'type': 'string'}, 'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'}, 'type': {'title': 'Type', 'default': 'function', 'enum': ['function'], 'type': 'string'}, 'name': {'title': 'Name', 'type': 'string'}, 'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}}, 'required': ['content', 'name']}, 'SystemMessageChunk': {'title': 'SystemMessageChunk', 'description': 'A System Message chunk.', 'type': 'object', 'properties': {'content': {'title': 'Content', 'type': 'string'}, 'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'}, 'type': {'title': 'Type', 'default': 'system', 'enum': ['system'], 'type': 'string'}, 'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}}, 'required': ['content']}}}Stream‚Äãfor s in chain.stream({"topic": "bears"}): print(s.content, end="", flush=True) Why don't bears wear shoes? 
Because they have bear feet!

Invoke

chain.invoke({"topic": "bears"})

AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!")

Batch

chain.batch([{"topic": "bears"}, {"topic": "cats"}])

[AIMessage(content="Why
don't bears wear shoes?\n\nBecause they have bear feet!"),
 AIMessage(content="Why don't cats play poker in the wild?\n\nToo many cheetahs!")]

You can set the number of concurrent requests by using the max_concurrency parameter:

chain.batch([{"topic": "bears"}, {"topic": "cats"}], config={"max_concurrency": 5})

[AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!"),
 AIMessage(content="Sure, here's a cat joke for you:\n\nWhy don't cats play poker in the wild?\n\nToo many cheetahs!")]

Async Stream

async for s in chain.astream({"topic": "bears"}):
    print(s.content, end="", flush=True)

Sure, here's a bear joke for you: Why don't bears wear shoes? Because they have bear feet!

Async Invoke

await chain.ainvoke({"topic": "bears"})

AIMessage(content="Why don't bears wear shoes? \n\nBecause they have bear feet!")

Async Batch

await chain.abatch([{"topic": "bears"}])

[AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!")]

Async Stream Intermediate Steps

All runnables also have a method .astream_log() which can be used to stream (as they happen) all or part of the intermediate steps of your chain/sequence. This is useful e.g. to show progress to the user, to use intermediate results, or even just to debug your chain. You can choose to stream all steps (default), or include/exclude steps by name, tags or metadata. This method yields JSONPatch ops that, when applied in the same order as received, build up the RunState.

class LogEntry(TypedDict):
    id: str
    """ID of the sub-run."""
    name: str
    """Name of the object being run."""
    type: str
    """Type of the object being run, e.g. prompt, chain, llm, etc."""
    tags: List[str]
    """List of tags for the run."""
    metadata: Dict[str, Any]
    """Key-value pairs of metadata for the run."""
    start_time: str
    """ISO-8601 timestamp of when the run started."""
    streamed_output_str: List[str]
    """List of LLM tokens streamed by this run, if applicable."""
    final_output: Optional[Any]
    """Final output of this run. Only available after the run has finished successfully."""
    end_time: Optional[str]
    """ISO-8601 timestamp of when the run ended. Only available after the run has finished."""

class RunState(TypedDict):
    id: str
    """ID of the run."""
    streamed_output: List[Any]
    """List of output chunks streamed by Runnable.stream()"""
    final_output: Optional[Any]
    """Final output of the run, usually the result of aggregating (`+`) streamed_output. Only available after the run has finished successfully."""
    logs: Dict[str, LogEntry]
    """Map of run names to sub-runs. If filters were supplied, this list will contain only the runs that matched the filters."""

Streaming JSONPatch chunks

This is useful e.g. to stream the JSONPatch in an HTTP server, and then apply the ops on the client to rebuild the run state there. See LangServe for tooling to make it easier to build a webserver from any Runnable.

from langchain.embeddings import OpenAIEmbeddings
from langchain.schema.output_parser import StrOutputParser
from langchain.schema.runnable import RunnablePassthrough
from langchain.vectorstores import FAISS

template = """Answer the question based only on the following context:
{context}

Question: {question}"""
prompt = ChatPromptTemplate.from_template(template)
vectorstore = FAISS.from_texts(["harrison worked at kensho"], embedding=OpenAIEmbeddings())
retriever = vectorstore.as_retriever()
retrieval_chain = (
    {"context": retriever.with_config(run_name='Docs'), "question": RunnablePassthrough()}
    | prompt
    | model
    | StrOutputParser()
)

async for chunk in retrieval_chain.astream_log("where did harrison work?", include_names=['Docs']):
    print(chunk)

RunLogPatch({'op': 'replace', 'path': '', 'value': {'final_output': None, 'id': 'fd6fcf62-c92c-4edf-8713-0fc5df000f62',
'logs': {}, 'streamed_output': []}})

RunLogPatch({'op': 'add', 'path': '/logs/Docs', 'value': {'end_time': None, 'final_output': None, 'id': '8c998257-1ec8-4546-b744-c3fdb9728c41', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:35.668', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}})

RunLogPatch({'op': 'add', 'path': '/logs/Docs/final_output', 'value': {'documents': [Document(page_content='harrison worked at kensho')]}},
 {'op': 'add', 'path': '/logs/Docs/end_time', 'value': '2023-10-05T12:52:36.033'})

RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ''})
RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'H'})
RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'arrison'})
RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' worked'})
RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' at'})
RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' Kens'})
RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'ho'})
RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': '.'})
RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ''})
RunLogPatch({'op': 'replace', 'path': '/final_output', 'value': {'output': 'Harrison worked at Kensho.'}})

Streaming the incremental RunState

You can simply pass diff=False to get incremental values of RunState.

async for chunk in retrieval_chain.astream_log("where did harrison work?", include_names=['Docs'], diff=False):
    print(chunk)

RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {}, 'streamed_output': []})

RunLog({'final_output': None,
'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': None, 'final_output': None, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': []})

RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': []})

RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['']})

RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217',
'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H']})

RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison']})

RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked']})

RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time':
'2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at']})

RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens']})

RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho']})
RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho', '.']})

RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho', '.', '']})

RunLog({'final_output': {'output': 'Harrison worked at Kensho.'}, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1',
'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho', '.', '']})

Parallelism

Let's take a look at how LangChain Expression Language supports parallel requests as much as possible. For example, when using a RunnableParallel (often written as a dictionary) it executes each element in parallel.

from langchain.schema.runnable import RunnableParallel
chain1 = ChatPromptTemplate.from_template("tell me a joke about {topic}") | model
chain2 = ChatPromptTemplate.from_template("write a short (2 line) poem about {topic}") | model
combined = RunnableParallel(joke=chain1, poem=chain2)

chain1.invoke({"topic": "bears"})

CPU times: user 31.7 ms, sys: 8.59 ms, total: 40.3 ms
Wall time: 1.05 s
AIMessage(content="Why don't bears like fast food?\n\nBecause they can't catch it!", additional_kwargs={}, example=False)

chain2.invoke({"topic": "bears"})

CPU times: user 42.9 ms, sys: 10.2 ms, total: 53 ms
Wall time: 1.93 s
AIMessage(content="In forest's embrace, bears roam free,\nSilent strength, nature's majesty.", additional_kwargs={}, example=False)

combined.invoke({"topic": "bears"})

CPU times: user 96.3 ms, sys: 20.4 ms, total: 117 ms
Wall time: 1.1 s
{'joke': AIMessage(content="Why don't bears wear socks?\n\nBecause they have bear feet!", additional_kwargs={}, example=False),
 'poem': AIMessage(content="In forest's embrace,\nMajestic bears leave their trace.", additional_kwargs={}, example=False)}
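Note how the combined chain's wall time (~1.1 s) is close to a single branch rather than the sum of both. As a rough illustration of why, here is a pure-asyncio sketch of the same fan-out pattern; `slow_chain` is a hypothetical stand-in for an async model call, not a LangChain API.

```python
import asyncio
import time


async def slow_chain(name: str, topic: str, delay: float) -> str:
    # Stand-in for an async chain call (ainvoke); the sleep mimics network latency.
    await asyncio.sleep(delay)
    return f"{name} about {topic}"


async def combined(topic: str) -> dict:
    # Like RunnableParallel: kick off both branches, then gather their results.
    joke, poem = await asyncio.gather(
        slow_chain("joke", topic, 0.1),
        slow_chain("poem", topic, 0.1),
    )
    return {"joke": joke, "poem": poem}


start = time.perf_counter()
result = asyncio.run(combined("bears"))
elapsed = time.perf_counter() - start
print(result)  # {'joke': 'joke about bears', 'poem': 'poem about bears'}
```

Because the two 0.1 s waits overlap instead of running back to back, total wall time stays close to the slowest branch, which is the same effect the RunnableParallel timings above show.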
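To make the JSONPatch mechanics from the astream_log section concrete, here is a toy client-side apply function. It is an assumption-laden sketch, not the jsonpatch library or LangServe's client: it supports only the 'add' and 'replace' ops (plus the '-' list-append path) that the example patches use.

```python
from typing import Any, Dict, List


def apply_patch(state: Any, ops: List[Dict[str, Any]]) -> Any:
    """Apply a minimal subset of JSONPatch (RFC 6902): 'add' and 'replace'."""
    for op in ops:
        parts = [p for p in op["path"].split("/") if p]
        if not parts:
            # Path '' targets the whole document, as in the first RunLogPatch.
            state = op["value"]
            continue
        target = state
        for key in parts[:-1]:
            target = target[key]
        last = parts[-1]
        if last == "-":
            # '-' means "append to the end of this list".
            target.append(op["value"])
        else:
            target[last] = op["value"]
    return state


# Replay patches shaped like the astream_log output to rebuild the run state.
state = None
state = apply_patch(state, [{"op": "replace", "path": "", "value": {"final_output": None, "logs": {}, "streamed_output": []}}])
for tok in ["H", "arrison"]:
    state = apply_patch(state, [{"op": "add", "path": "/streamed_output/-", "value": tok}])
state = apply_patch(state, [{"op": "replace", "path": "/final_output", "value": {"output": "Harrison"}}])
print(state["streamed_output"])  # ['H', 'arrison']
```

Applying the ops in the order received is what rebuilds the RunState, which is why order matters when streaming them over HTTP.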
Interface | 🦜️🔗 Langchain
In an effort to make it as easy as possible to create custom chains, we've implemented a "Runnable" protocol that most components implement. This is a standard interface with a few different methods, which makes it easy to define custom chains as well as making it possible to invoke them in a standard way. The standard interface exposed includes:

- `stream`: stream back chunks of the response
- `invoke`: call the chain on an input
- `batch`: call the chain on a list of inputs

These also have corresponding async methods:

- `astream`: stream back chunks of the response async
- `ainvoke`: call the chain on an input async
- `abatch`: call the chain on a list of inputs async
- `astream_log`: stream back intermediate steps as they happen, in addition to the final response

The type of the input varies by component:

| Component | Input Type |
| --- | --- |
| Prompt | Dictionary |
| Retriever | Single string |
| LLM, ChatModel | Single string, list of chat messages or a PromptValue |
| Tool | Single string or dictionary, depending on the tool |
| OutputParser | The output of an LLM or ChatModel |

The output type also varies by component:

| Component | Output Type |
| --- | --- |
| LLM | String |
| ChatModel | ChatMessage |
| Prompt | PromptValue |
| Retriever | List of documents |
| Tool | Depends on the tool |
| OutputParser | Depends on the parser |

All runnables expose properties to inspect the input and output types:

- `input_schema`: an input Pydantic model auto-generated from the structure of the Runnable
- `output_schema`: an output Pydantic model auto-generated from the structure of the Runnable

Let's take a look at these methods! To do so, we'll create a super simple PromptTemplate + ChatModel chain.
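To make the method list concrete before the LangChain example, here is a deliberately tiny, stdlib-only sketch of what an object with this kind of interface could look like. It is illustrative only; LangChain's real `Runnable` is far richer (configs, schemas, tracing), and `TinyRunnable` is a made-up name:

```python
import asyncio
from typing import Any, Iterator, List

class TinyRunnable:
    """Minimal sketch of a Runnable-style interface: one core function,
    with batch/stream/async entry points derived from it."""

    def __init__(self, func):
        self.func = func

    def invoke(self, input: Any) -> Any:
        # Call on a single input.
        return self.func(input)

    def batch(self, inputs: List[Any]) -> List[Any]:
        # Call on a list of inputs.
        return [self.invoke(i) for i in inputs]

    def stream(self, input: Any) -> Iterator[Any]:
        # Yield the result in chunks; here, one character at a time.
        for chunk in str(self.invoke(input)):
            yield chunk

    async def ainvoke(self, input: Any) -> Any:
        # Async variant: run the sync call in a worker thread.
        return await asyncio.to_thread(self.invoke, input)

shout = TinyRunnable(lambda s: s.upper())
print(shout.invoke("hi"))                 # HI
print(shout.batch(["a", "b"]))            # ['A', 'B']
print("".join(shout.stream("ok")))        # OK
print(asyncio.run(shout.ainvoke("hey")))  # HEY
```

The point of the sketch is the uniformity: because every component exposes the same set of entry points, any composition of components can be invoked, batched, or streamed the same way.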
```python
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatOpenAI

model = ChatOpenAI()
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
chain = prompt | model
```

## Input Schema

A description of the inputs accepted by a Runnable.
This is a Pydantic model dynamically generated from the structure of any Runnable. You can call `.schema()` on it to obtain a JSONSchema representation.

```python
# The input schema of the chain is the input schema of its first part, the prompt.
chain.input_schema.schema()
```

    {'title': 'PromptInput',
     'type': 'object',
     'properties': {'topic': {'title': 'Topic', 'type': 'string'}}}

## Output Schema

A description of the outputs produced by a Runnable. This is a Pydantic model dynamically generated from the structure of any Runnable.
You can call `.schema()` on it to obtain a JSONSchema representation.

```python
# The output schema of the chain is the output schema of its last part,
# in this case a ChatModel, which outputs a ChatMessage.
chain.output_schema.schema()
```

    {'title': 'ChatOpenAIOutput',
     'anyOf': [{'$ref': '#/definitions/HumanMessageChunk'},
      {'$ref': '#/definitions/AIMessageChunk'},
      {'$ref': '#/definitions/ChatMessageChunk'},
      {'$ref': '#/definitions/FunctionMessageChunk'},
      {'$ref': '#/definitions/SystemMessageChunk'}],
     'definitions': {'HumanMessageChunk': {'title': 'HumanMessageChunk',
       'description': 'A Human Message chunk.',
       'type': 'object',
       'properties': {'content': {'title': 'Content', 'type': 'string'},
        'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
        'type': {'title': 'Type', 'default': 'human', 'enum': ['human'], 'type': 'string'},
        'example': {'title': 'Example', 'default': False, 'type': 'boolean'},
        'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
       'required': ['content']},
      'AIMessageChunk': {'title': 'AIMessageChunk',
       'description': 'A Message chunk from an AI.',
       'type': 'object',
       'properties': {'content': {'title': 'Content', 'type': 'string'},
        'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
        'type': {'title': 'Type', 'default': 'ai', 'enum': ['ai'], 'type': 'string'},
        'example': {'title': 'Example', 'default': False, 'type': 'boolean'},
        'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
       'required': ['content']},
      'ChatMessageChunk': {'title': 'ChatMessageChunk',
       'description': 'A Chat Message chunk.',
       'type': 'object',
       'properties': {'content': {'title': 'Content', 'type': 'string'},
        'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
        'type': {'title': 'Type', 'default': 'chat', 'enum': ['chat'], 'type': 'string'},
        'role': {'title': 'Role', 'type': 'string'},
        'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
       'required': ['content', 'role']},
      'FunctionMessageChunk': {'title': 'FunctionMessageChunk',
       'description': 'A Function Message chunk.',
       'type': 'object',
       'properties': {'content': {'title': 'Content', 'type': 'string'},
        'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
        'type': {'title': 'Type', 'default': 'function', 'enum': ['function'], 'type': 'string'},
        'name': {'title': 'Name', 'type': 'string'},
        'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
       'required': ['content', 'name']},
      'SystemMessageChunk': {'title': 'SystemMessageChunk',
       'description': 'A System Message chunk.',
       'type': 'object',
       'properties': {'content': {'title': 'Content', 'type': 'string'},
        'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
        'type': {'title': 'Type', 'default': 'system', 'enum': ['system'], 'type': 'string'},
        'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
       'required': ['content']}}}

## Stream

```python
for s in chain.stream({"topic": "bears"}):
    print(s.content, end="", flush=True)
```

    Why don't bears wear shoes?

    Because they have bear feet!

## Invoke

```python
chain.invoke({"topic": "bears"})
```

    AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!")

## Batch

```python
chain.batch([{"topic": "bears"}, {"topic": "cats"}])
```

    [AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!"),
     AIMessage(content="Why don't cats play poker in the wild?\n\nToo many cheetahs!")]
You can set the number of concurrent requests by using the `max_concurrency` parameter:

```python
chain.batch([{"topic": "bears"}, {"topic": "cats"}], config={"max_concurrency": 5})
```

    [AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!"),
     AIMessage(content="Sure, here's a cat joke for you:\n\nWhy don't cats play poker in the wild?\n\nToo many cheetahs!")]

## Async Stream

```python
async for s in chain.astream({"topic": "bears"}):
    print(s.content, end="", flush=True)
```

    Sure, here's a bear joke for you:

    Why don't bears wear shoes?
    Because they have bear feet!

## Async Invoke

```python
await chain.ainvoke({"topic": "bears"})
```

    AIMessage(content="Why don't bears wear shoes? \n\nBecause they have bear feet!")

## Async Batch

```python
await chain.abatch([{"topic": "bears"}])
```

    [AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!")]

## Async Stream Intermediate Steps

All runnables also have a method `.astream_log()` which can be used to stream (as they happen) all or part of the intermediate steps of your chain/sequence. This is useful, e.g., to show progress to the user, to use intermediate results, or even just to debug your chain.

You can choose to stream all steps (default), or include/exclude steps by name, tags or metadata.

This method yields JSONPatch ops that, when applied in the same order as received, build up the RunState.

```python
class LogEntry(TypedDict):
    id: str
    """ID of the sub-run."""
    name: str
    """Name of the object being run."""
    type: str
    """Type of the object being run, e.g. prompt, chain, llm, etc."""
    tags: List[str]
    """List of tags for the run."""
    metadata: Dict[str, Any]
    """Key-value pairs of metadata for the run."""
    start_time: str
    """ISO-8601 timestamp of when the run started."""
    streamed_output_str: List[str]
    """List of LLM tokens streamed by this run, if applicable."""
    final_output: Optional[Any]
    """Final output of this run. Only available after the run has finished successfully."""
    end_time: Optional[str]
    """ISO-8601 timestamp of when the run ended. Only available after the run has finished."""


class RunState(TypedDict):
    id: str
    """ID of the run."""
    streamed_output: List[Any]
    """List of output chunks streamed by Runnable.stream()"""
    final_output: Optional[Any]
    """Final output of the run, usually the result of aggregating (`+`) streamed_output.
    Only available after the run has finished successfully."""
    logs: Dict[str, LogEntry]
    """Map of run names to sub-runs. If filters were supplied, this list will
    contain only the runs that matched the filters."""
```

### Streaming JSONPatch chunks

This is useful, e.g., to stream the JSONPatch in an HTTP server and then apply the ops on the client to rebuild the run state there. See LangServe for tooling to make it easier to build a webserver from any Runnable.

```python
from langchain.embeddings import OpenAIEmbeddings
from langchain.schema.output_parser import StrOutputParser
from langchain.schema.runnable import RunnablePassthrough
from langchain.vectorstores import FAISS

template = """Answer the question based only on the following context:
{context}

Question: {question}
"""
prompt = ChatPromptTemplate.from_template(template)
vectorstore = FAISS.from_texts(["harrison worked at kensho"], embedding=OpenAIEmbeddings())
retriever = vectorstore.as_retriever()
retrieval_chain = (
    {"context": retriever.with_config(run_name="Docs"), "question": RunnablePassthrough()}
    | prompt
    | model
    | StrOutputParser()
)

async for chunk in retrieval_chain.astream_log("where did harrison work?", include_names=["Docs"]):
    print(chunk)
```

    RunLogPatch({'op': 'replace',
     'path': '',
     'value': {'final_output': None,
      'id': 'fd6fcf62-c92c-4edf-8713-0fc5df000f62',
      'logs': {},
'logs': {}, 'streamed_output': []}}) RunLogPatch({'op': 'add', 'path': '/logs/Docs', 'value': {'end_time': None, 'final_output': None, 'id': '8c998257-1ec8-4546-b744-c3fdb9728c41', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:35.668', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}) RunLogPatch({'op': 'add', 'path': '/logs/Docs/final_output', 'value': {'documents': [Document(page_content='harrison worked at kensho')]}}, {'op': 'add', 'path': '/logs/Docs/end_time', 'value': '2023-10-05T12:52:36.033'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ''}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'H'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'arrison'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' worked'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' at'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' Kens'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'ho'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': '.'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ''}) RunLogPatch({'op': 'replace', 'path': '/final_output', 'value': {'output': 'Harrison worked at Kensho.'}})Streaming the incremental RunState‚ÄãYou can simply pass diff=False to get incremental values of RunState.async for chunk in retrieval_chain.astream_log("where did harrison work?", include_names=['Docs'], diff=False): print(chunk) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {}, 'streamed_output': []}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
In an effort to make it as easy as possible to create custom chains, we've implemented a "Runnable" protocol that most components implement. This is a standard interface with a few different methods, which makes it easy to define custom chains as well as making it possible to invoke them in a standard way. The standard interface exposed includes:
In an effort to make it as easy as possible to create custom chains, we've implemented a "Runnable" protocol that most components implement. This is a standard interface with a few different methods, which makes it easy to define custom chains as well as making it possible to invoke them in a standard way. The standard interface exposed includes: ->: 'logs': {}, 'streamed_output': []}}) RunLogPatch({'op': 'add', 'path': '/logs/Docs', 'value': {'end_time': None, 'final_output': None, 'id': '8c998257-1ec8-4546-b744-c3fdb9728c41', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:35.668', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}) RunLogPatch({'op': 'add', 'path': '/logs/Docs/final_output', 'value': {'documents': [Document(page_content='harrison worked at kensho')]}}, {'op': 'add', 'path': '/logs/Docs/end_time', 'value': '2023-10-05T12:52:36.033'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ''}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'H'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'arrison'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' worked'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' at'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' Kens'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'ho'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': '.'}) RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ''}) RunLogPatch({'op': 'replace', 'path': '/final_output', 'value': {'output': 'Harrison worked at Kensho.'}})Streaming the incremental RunState‚ÄãYou can simply pass diff=False to get incremental values of RunState.async for chunk in retrieval_chain.astream_log("where did harrison work?", include_names=['Docs'], diff=False): print(chunk) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {}, 
'streamed_output': []}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
4,639
'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': None, 'final_output': None, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': []}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': []}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output':
'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217',
'2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho']}) RunLog({'final_output': None, 'id':
RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho', '.']}) RunLog({'final_output': None, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {}, 'name': 'Docs', 'start_time': '2023-10-05T12:52:36.935', 'streamed_output_str': [], 'tags': ['map:key:context', 'FAISS'], 'type': 'retriever'}}, 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho', '.', '']}) RunLog({'final_output': {'output': 'Harrison worked at Kensho.'}, 'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185', 'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217', 'final_output': {'documents': [Document(page_content='harrison worked at kensho')]}, 'id': '621597dd-d716-4532-938d-debc21a453d1', 'metadata': {},
'metadata': {},
 'name': 'Docs',
 'start_time': '2023-10-05T12:52:36.935',
 'streamed_output_str': [],
 'tags': ['map:key:context', 'FAISS'],
 'type': 'retriever'}},
 'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho', '.', '']})

Parallelism

Let's take a look at how LangChain Expression Language supports parallel requests as much as possible. For example, when using a RunnableParallel (often written as a dictionary) it executes each element in parallel.

from langchain.schema.runnable import RunnableParallel

chain1 = ChatPromptTemplate.from_template("tell me a joke about {topic}") | model
chain2 = ChatPromptTemplate.from_template("write a short (2 line) poem about {topic}") | model
combined = RunnableParallel(joke=chain1, poem=chain2)

chain1.invoke({"topic": "bears"})
    CPU times: user 31.7 ms, sys: 8.59 ms, total: 40.3 ms
    Wall time: 1.05 s
    AIMessage(content="Why don't bears like fast food?\n\nBecause they can't catch it!", additional_kwargs={}, example=False)

chain2.invoke({"topic": "bears"})
    CPU times: user 42.9 ms, sys: 10.2 ms, total: 53 ms
    Wall time: 1.93 s
    AIMessage(content="In forest's embrace, bears roam free,\nSilent strength, nature's majesty.", additional_kwargs={}, example=False)

combined.invoke({"topic": "bears"})
    CPU times: user 96.3 ms, sys: 20.4 ms, total: 117 ms
    Wall time: 1.1 s
    {'joke': AIMessage(content="Why don't bears wear socks?\n\nBecause they have bear feet!", additional_kwargs={}, example=False),
     'poem': AIMessage(content="In forest's embrace,\nMajestic bears leave their trace.", additional_kwargs={}, example=False)}
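The speed-up from RunnableParallel comes from fanning the same input out to every branch concurrently rather than invoking them one after another. This can be sketched with a plain thread pool; note this is a simplified stand-in, not LangChain's implementation, and `fake_llm`, `run_parallel`, and `branches` are invented names for illustration.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_llm(prompt: str) -> str:
    """Stand-in for a chat-model call: sleeps to simulate network latency."""
    time.sleep(0.2)
    return f"response to: {prompt}"

def run_parallel(branches: dict, topic: str) -> dict:
    """Invoke every branch with the same input concurrently, keyed by name."""
    with ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(fn, topic) for name, fn in branches.items()}
        return {name: f.result() for name, f in futures.items()}

branches = {
    "joke": lambda t: fake_llm(f"tell me a joke about {t}"),
    "poem": lambda t: fake_llm(f"write a short poem about {t}"),
}

start = time.perf_counter()
result = run_parallel(branches, "bears")
elapsed = time.perf_counter() - start
# Each branch sleeps 0.2 s, but the combined call takes ~0.2 s, not ~0.4 s,
# mirroring how the combined chain above costs about as much as one branch.
print(sorted(result), round(elapsed, 1))
```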
Interface | 🦜️🔗 Langchain
Interface

In an effort to make it as easy as possible to create custom chains, we've implemented a "Runnable" protocol that most components implement. This is a standard interface with a few different methods, which makes it easy to define custom chains as well as making it possible to invoke them in a standard way. The standard interface exposed includes:

- stream: stream back chunks of the response
- invoke: call the chain on an input
- batch: call the chain on a list of inputs

These also have corresponding async methods:

- astream: stream back chunks of the response async
- ainvoke: call the chain on an input async
- abatch: call the chain on a list of inputs async
- astream_log: stream back intermediate steps as they happen, in addition to the final response

The type of the input varies by component:

Component     | Input Type
Prompt        | Dictionary
Retriever     | Single string
LLM, ChatModel| Single string, list of chat messages or a PromptValue
Tool          | Single string, or dictionary, depending on the tool
OutputParser  | The output of an LLM or ChatModel

The output type also varies by component:

Component    | Output Type
LLM          | String
ChatModel    | ChatMessage
Prompt       | PromptValue
Retriever    | List of documents
Tool         | Depends on the tool
OutputParser | Depends on the parser

All runnables expose properties to inspect the input and output types:

- input_schema: an input Pydantic model auto-generated from the structure of the Runnable
- output_schema: an output Pydantic model auto-generated from the structure of the Runnable

Let's take a look at these methods! To do so, we'll create a super simple PromptTemplate + ChatModel chain.
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatOpenAI

model = ChatOpenAI()
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
chain = prompt | model

Input Schema

A description of the inputs accepted by a Runnable.
This is a Pydantic model dynamically generated from the structure of any Runnable. You can call .schema() on it to obtain a JSONSchema representation.

# The input schema of the chain is the input schema of its first part, the prompt.
chain.input_schema.schema()
    {'title': 'PromptInput',
     'type': 'object',
     'properties': {'topic': {'title': 'Topic', 'type': 'string'}}}

Output Schema

A description of the outputs produced by a Runnable. This is a Pydantic model dynamically generated from the structure of any Runnable.
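To make concrete what the PromptInput schema above asserts — the chain's input must be an object whose "topic" value is a string — here is a small hand-rolled check against that exact schema dict. The `conforms` helper is invented for illustration; it is not part of LangChain or any JSONSchema library, and covers only the two type cases used here.

```python
# The JSONSchema dict returned by chain.input_schema.schema() above.
prompt_input_schema = {
    'title': 'PromptInput',
    'type': 'object',
    'properties': {'topic': {'title': 'Topic', 'type': 'string'}},
}

def conforms(value, schema) -> bool:
    """Tiny validator covering just the 'object' and 'string' cases used here."""
    if schema['type'] == 'object':
        return isinstance(value, dict) and all(
            conforms(value[key], sub)
            for key, sub in schema.get('properties', {}).items()
            if key in value
        )
    if schema['type'] == 'string':
        return isinstance(value, str)
    return False

print(conforms({'topic': 'bears'}, prompt_input_schema))  # True
print(conforms({'topic': 42}, prompt_input_schema))       # False
```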
You can call .schema() on it to obtain a JSONSchema representation.

# The output schema of the chain is the output schema of its last part, in this case a ChatModel, which outputs a ChatMessage
chain.output_schema.schema()
    {'title': 'ChatOpenAIOutput',
     'anyOf': [{'$ref': '#/definitions/HumanMessageChunk'},
      {'$ref': '#/definitions/AIMessageChunk'},
      {'$ref': '#/definitions/ChatMessageChunk'},
      {'$ref': '#/definitions/FunctionMessageChunk'},
      {'$ref': '#/definitions/SystemMessageChunk'}],
     'definitions': {'HumanMessageChunk': {'title': 'HumanMessageChunk',
       'description': 'A Human Message chunk.',
       'type': 'object',
       'properties': {'content': {'title': 'Content', 'type': 'string'},
        'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
        'type': {'title': 'Type', 'default': 'human', 'enum': ['human'], 'type': 'string'},
        'example': {'title': 'Example', 'default': False, 'type': 'boolean'},
        'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
       'required': ['content']},
      'AIMessageChunk': {'title': 'AIMessageChunk',
       'description': 'A Message chunk from an AI.',
       'type': 'object',
       'properties': {'content': {'title': 'Content', 'type': 'string'},
        'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
        'type': {'title': 'Type', 'default': 'ai', 'enum': ['ai'], 'type': 'string'},
        'example': {'title': 'Example', 'default': False, 'type': 'boolean'},
        'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
       'required': ['content']},
      'ChatMessageChunk': {'title': 'ChatMessageChunk',
       'description': 'A Chat Message chunk.',
       'type': 'object',
       'properties': {'content': {'title': 'Content', 'type':
'string'},
        'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
        'type': {'title': 'Type', 'default': 'chat', 'enum': ['chat'], 'type': 'string'},
        'role': {'title': 'Role', 'type': 'string'},
        'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
       'required': ['content', 'role']},
      'FunctionMessageChunk': {'title': 'FunctionMessageChunk',
       'description': 'A Function Message chunk.',
       'type': 'object',
       'properties': {'content': {'title': 'Content', 'type': 'string'},
        'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
        'type': {'title': 'Type', 'default': 'function', 'enum': ['function'], 'type': 'string'},
        'name': {'title': 'Name', 'type': 'string'},
        'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
       'required': ['content', 'name']},
      'SystemMessageChunk': {'title': 'SystemMessageChunk',
       'description': 'A System Message chunk.',
       'type': 'object',
       'properties': {'content': {'title': 'Content', 'type': 'string'},
        'additional_kwargs': {'title': 'Additional Kwargs', 'type': 'object'},
        'type': {'title': 'Type', 'default': 'system', 'enum': ['system'], 'type': 'string'},
        'is_chunk': {'title': 'Is Chunk', 'default': True, 'enum': [True], 'type': 'boolean'}},
       'required': ['content']}}}

Stream

for s in chain.stream({"topic": "bears"}):
    print(s.content, end="", flush=True)

    Why don't bears wear shoes?

    Because they have bear feet!

Invoke

chain.invoke({"topic": "bears"})
    AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!")

Batch

chain.batch([{"topic": "bears"}, {"topic": "cats"}])
    [AIMessage(content="Why don't bears wear shoes?\n\nBecause they have
don't bears wear shoes?\n\nBecause they have bear feet!"),
     AIMessage(content="Why don't cats play poker in the wild?\n\nToo many cheetahs!")]

You can set the number of concurrent requests by using the max_concurrency parameter:

    chain.batch([{"topic": "bears"}, {"topic": "cats"}], config={"max_concurrency": 5})

    [AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!"),
     AIMessage(content="Sure, here's a cat joke for you:\n\nWhy don't cats play poker in the wild?\n\nToo many cheetahs!")]

Async Stream

    async for s in chain.astream({"topic": "bears"}):
        print(s.content, end="", flush=True)

    Sure, here's a bear joke for you:
    Why don't bears wear shoes?
    Because they have bear feet!

Async Invoke

    await chain.ainvoke({"topic": "bears"})

    AIMessage(content="Why don't bears wear shoes? \n\nBecause they have bear feet!")

Async Batch

    await chain.abatch([{"topic": "bears"}])

    [AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!")]

Async Stream Intermediate Steps

All runnables also have a method .astream_log() which can be used to stream (as they happen) all or part of the intermediate steps of your chain/sequence. This is useful e.g. to show progress to the user, to use intermediate results, or even just to debug your chain. You can choose to stream all steps (default), or include/exclude steps by name, tags or metadata. This method yields JSONPatch ops that, when applied in the same order as received, build up the RunState.

    class LogEntry(TypedDict):
        id: str
        """ID of the sub-run."""
        name: str
        """Name of the object being run."""
        type: str
        """Type of the object being run, e.g. prompt, chain, llm, etc."""
        tags: List[str]
        """List of tags for the run."""
        metadata: Dict[str, Any]
        """Key-value pairs of metadata for the run."""
        start_time: str
        """ISO-8601 timestamp of when the run started."""
        streamed_output_str: List[str]
"""List of LLM tokens streamed by this run, if applicable.""" final_output: Optional[Any] """Final output of this run. Only available after the run has finished successfully.""" end_time: Optional[str] """ISO-8601 timestamp of when the run ended. Only available after the run has finished."""class RunState(TypedDict): id: str """ID of the run.""" streamed_output: List[Any] """List of output chunks streamed by Runnable.stream()""" final_output: Optional[Any] """Final output of the run, usually the result of aggregating (`+`) streamed_output. Only available after the run has finished successfully.""" logs: Dict[str, LogEntry] """Map of run names to sub-runs. If filters were supplied, this list will contain only the runs that matched the filters."""Streaming JSONPatch chunks‚ÄãThis is useful eg. to stream the JSONPatch in an HTTP server, and then apply the ops on the client to rebuild the run state there. See LangServe for tooling to make it easier to build a webserver from any Runnable.from langchain.embeddings import OpenAIEmbeddingsfrom langchain.schema.output_parser import StrOutputParserfrom langchain.schema.runnable import RunnablePassthroughfrom langchain.vectorstores import FAISStemplate = """Answer the question based only on the following context:{context}Question: {question}"""prompt = ChatPromptTemplate.from_template(template)vectorstore = FAISS.from_texts(["harrison worked at kensho"], embedding=OpenAIEmbeddings())retriever = vectorstore.as_retriever()retrieval_chain = ( {"context": retriever.with_config(run_name='Docs'), "question": RunnablePassthrough()} | prompt | model | StrOutputParser())async for chunk in retrieval_chain.astream_log("where did harrison work?", include_names=['Docs']): print(chunk) RunLogPatch({'op': 'replace', 'path': '', 'value': {'final_output': None, 'id': 'fd6fcf62-c92c-4edf-8713-0fc5df000f62', 'logs': {},
                           'logs': {},
                           'streamed_output': []}})
    RunLogPatch({'op': 'add',
                 'path': '/logs/Docs',
                 'value': {'end_time': None,
                           'final_output': None,
                           'id': '8c998257-1ec8-4546-b744-c3fdb9728c41',
                           'metadata': {},
                           'name': 'Docs',
                           'start_time': '2023-10-05T12:52:35.668',
                           'streamed_output_str': [],
                           'tags': ['map:key:context', 'FAISS'],
                           'type': 'retriever'}})
    RunLogPatch({'op': 'add',
                 'path': '/logs/Docs/final_output',
                 'value': {'documents': [Document(page_content='harrison worked at kensho')]}},
                {'op': 'add', 'path': '/logs/Docs/end_time', 'value': '2023-10-05T12:52:36.033'})
    RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ''})
    RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'H'})
    RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'arrison'})
    RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' worked'})
    RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' at'})
    RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ' Kens'})
    RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': 'ho'})
    RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': '.'})
    RunLogPatch({'op': 'add', 'path': '/streamed_output/-', 'value': ''})
    RunLogPatch({'op': 'replace', 'path': '/final_output', 'value': {'output': 'Harrison worked at Kensho.'}})

Streaming the incremental RunState

You can simply pass diff=False to get incremental values of RunState.

    async for chunk in retrieval_chain.astream_log("where did harrison work?", include_names=['Docs'], diff=False):
        print(chunk)

    RunLog({'final_output': None,
            'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
            'logs': {},
            'streamed_output': []})
    RunLog({'final_output': None,
            'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
            'logs': {'Docs': {'end_time': None,
                              'final_output': None,
                              'id': '621597dd-d716-4532-938d-debc21a453d1',
                              'metadata': {},
                              'name': 'Docs',
                              'start_time': '2023-10-05T12:52:36.935',
                              'streamed_output_str': [],
                              'tags': ['map:key:context', 'FAISS'],
                              'type': 'retriever'}},
            'streamed_output': []})
    RunLog({'final_output': None,
            'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
            'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217',
                              'final_output': {'documents': [Document(page_content='harrison worked at kensho')]},
                              'id': '621597dd-d716-4532-938d-debc21a453d1',
                              'metadata': {},
                              'name': 'Docs',
                              'start_time': '2023-10-05T12:52:36.935',
                              'streamed_output_str': [],
                              'tags': ['map:key:context', 'FAISS'],
                              'type': 'retriever'}},
            'streamed_output': []})
    RunLog({'final_output': None,
            'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
            'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217',
                              'final_output': {'documents': [Document(page_content='harrison worked at kensho')]},
                              'id': '621597dd-d716-4532-938d-debc21a453d1',
                              'metadata': {},
                              'name': 'Docs',
                              'start_time': '2023-10-05T12:52:36.935',
                              'streamed_output_str': [],
                              'tags': ['map:key:context', 'FAISS'],
                              'type': 'retriever'}},
            'streamed_output': ['']})
    RunLog({'final_output': None,
            'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
            'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217',
                              'final_output': {'documents': [Document(page_content='harrison worked at kensho')]},
                              'id': '621597dd-d716-4532-938d-debc21a453d1',
                              'metadata': {},
                              'name': 'Docs',
                              'start_time': '2023-10-05T12:52:36.935',
                              'streamed_output_str': [],
                              'tags': ['map:key:context', 'FAISS'],
                              'type': 'retriever'}},
            'streamed_output': ['', 'H']})
    RunLog({'final_output': None,
            'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
            'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217',
                              'final_output': {'documents': [Document(page_content='harrison worked at kensho')]},
                              'id': '621597dd-d716-4532-938d-debc21a453d1',
                              'metadata': {},
                              'name': 'Docs',
                              'start_time': '2023-10-05T12:52:36.935',
                              'streamed_output_str': [],
                              'tags': ['map:key:context', 'FAISS'],
                              'type': 'retriever'}},
            'streamed_output': ['', 'H', 'arrison']})
    RunLog({'final_output': None,
            'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
            'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217',
                              'final_output': {'documents': [Document(page_content='harrison worked at kensho')]},
                              'id': '621597dd-d716-4532-938d-debc21a453d1',
                              'metadata': {},
                              'name': 'Docs',
                              'start_time': '2023-10-05T12:52:36.935',
                              'streamed_output_str': [],
                              'tags': ['map:key:context', 'FAISS'],
                              'type': 'retriever'}},
            'streamed_output': ['', 'H', 'arrison', ' worked']})
    RunLog({'final_output': None,
            'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
            'logs': {'Docs': {'end_time':
                                          '2023-10-05T12:52:37.217',
                              'final_output': {'documents': [Document(page_content='harrison worked at kensho')]},
                              'id': '621597dd-d716-4532-938d-debc21a453d1',
                              'metadata': {},
                              'name': 'Docs',
                              'start_time': '2023-10-05T12:52:36.935',
                              'streamed_output_str': [],
                              'tags': ['map:key:context', 'FAISS'],
                              'type': 'retriever'}},
            'streamed_output': ['', 'H', 'arrison', ' worked', ' at']})
    RunLog({'final_output': None,
            'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
            'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217',
                              'final_output': {'documents': [Document(page_content='harrison worked at kensho')]},
                              'id': '621597dd-d716-4532-938d-debc21a453d1',
                              'metadata': {},
                              'name': 'Docs',
                              'start_time': '2023-10-05T12:52:36.935',
                              'streamed_output_str': [],
                              'tags': ['map:key:context', 'FAISS'],
                              'type': 'retriever'}},
            'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens']})
    RunLog({'final_output': None,
            'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
            'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217',
                              'final_output': {'documents': [Document(page_content='harrison worked at kensho')]},
                              'id': '621597dd-d716-4532-938d-debc21a453d1',
                              'metadata': {},
                              'name': 'Docs',
                              'start_time': '2023-10-05T12:52:36.935',
                              'streamed_output_str': [],
                              'tags': ['map:key:context', 'FAISS'],
                              'type': 'retriever'}},
            'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho']})
    RunLog({'final_output': None,
            'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
            'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217',
                              'final_output': {'documents': [Document(page_content='harrison worked at kensho')]},
                              'id': '621597dd-d716-4532-938d-debc21a453d1',
                              'metadata': {},
                              'name': 'Docs',
                              'start_time': '2023-10-05T12:52:36.935',
                              'streamed_output_str': [],
                              'tags': ['map:key:context', 'FAISS'],
                              'type': 'retriever'}},
            'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho', '.']})
    RunLog({'final_output': None,
            'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
            'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217',
                              'final_output': {'documents': [Document(page_content='harrison worked at kensho')]},
                              'id': '621597dd-d716-4532-938d-debc21a453d1',
                              'metadata': {},
                              'name': 'Docs',
                              'start_time': '2023-10-05T12:52:36.935',
                              'streamed_output_str': [],
                              'tags': ['map:key:context', 'FAISS'],
                              'type': 'retriever'}},
            'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho', '.', '']})
    RunLog({'final_output': {'output': 'Harrison worked at Kensho.'},
            'id': 'f95ccb87-31f1-48ea-a51c-d2dadde44185',
            'logs': {'Docs': {'end_time': '2023-10-05T12:52:37.217',
                              'final_output': {'documents': [Document(page_content='harrison worked at kensho')]},
                              'id': '621597dd-d716-4532-938d-debc21a453d1',
                              'metadata': {},
                              'name': 'Docs',
                              'start_time': '2023-10-05T12:52:36.935',
                              'streamed_output_str': [],
                              'tags': ['map:key:context', 'FAISS'],
                              'type': 'retriever'}},
            'streamed_output': ['', 'H', 'arrison', ' worked', ' at', ' Kens', 'ho', '.', '']})

Parallelism

Let's take a look at how LangChain Expression Language supports parallel requests as much as possible. For example, when using a RunnableParallel (often written as a dictionary), it executes each element in parallel.

    from langchain.schema.runnable import RunnableParallel

    chain1 = ChatPromptTemplate.from_template("tell me a joke about {topic}") | model
    chain2 = ChatPromptTemplate.from_template("write a short (2 line) poem about {topic}") | model
    combined = RunnableParallel(joke=chain1, poem=chain2)

    chain1.invoke({"topic": "bears"})

    CPU times: user 31.7 ms, sys: 8.59 ms, total: 40.3 ms
    Wall time: 1.05 s
    AIMessage(content="Why don't bears like fast food?\n\nBecause they can't catch it!", additional_kwargs={}, example=False)

    chain2.invoke({"topic": "bears"})

    CPU times: user 42.9 ms, sys: 10.2 ms, total: 53 ms
    Wall time: 1.93 s
    AIMessage(content="In forest's embrace, bears roam free,\nSilent strength, nature's majesty.", additional_kwargs={}, example=False)

    combined.invoke({"topic": "bears"})

    CPU times: user 96.3 ms, sys: 20.4 ms, total: 117 ms
    Wall time: 1.1 s
    {'joke': AIMessage(content="Why don't bears wear socks?\n\nBecause they have bear feet!", additional_kwargs={}, example=False),
     'poem': AIMessage(content="In forest's embrace,\nMajestic bears leave their trace.", additional_kwargs={}, example=False)}
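To build intuition for what RunnableParallel does, here is a minimal plain-Python sketch (a hypothetical illustration, not LangChain's internals): a dict of callables is fanned out over the same input on a thread pool, and the results are collected back into a dict keyed like the original.

```python
from concurrent.futures import ThreadPoolExecutor


def run_parallel(runnables, inp):
    """Run each callable on the same input concurrently and collect
    the results into a dict, RunnableParallel-style."""
    with ThreadPoolExecutor() as ex:
        # Submit every branch before collecting any result, so the
        # branches overlap instead of running one after another.
        futures = {key: ex.submit(fn, inp) for key, fn in runnables.items()}
        return {key: f.result() for key, f in futures.items()}


# Stand-ins for the joke/poem chains above (hypothetical, no LLM calls).
joke = lambda topic: f"a joke about {topic}"
poem = lambda topic: f"a poem about {topic}"

result = run_parallel({"joke": joke, "poem": poem}, "bears")
# result == {'joke': 'a joke about bears', 'poem': 'a poem about bears'}
```

This mirrors why the combined chain's wall time above is close to a single chain's rather than the sum of both.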
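The astream_log examples earlier yield JSONPatch ops that a client can replay to rebuild the RunState. As a rough illustration, here is a hand-rolled sketch of applying such ops; it is an assumption-laden toy (only the 'replace' and 'add' ops seen above, with '-' meaning list append), not LangChain's implementation and not a full RFC 6902 library.

```python
def apply_patch(state, ops):
    """Apply a sequence of JSONPatch-style 'add'/'replace' ops to state.

    Toy sketch: handles only the op shapes seen in the astream_log
    output above; a real client should use a proper JSONPatch library.
    """
    for op in ops:
        parts = [p for p in op["path"].split("/") if p]
        if op["op"] == "replace" and not parts:
            # 'replace' at the root swaps out the whole state.
            state = op["value"]
            continue
        target = state
        for key in parts[:-1]:
            target = target[key]
        last = parts[-1]
        if isinstance(target, list) and last == "-":
            target.append(op["value"])  # '/xs/-' appends to the list xs
        else:
            target[last] = op["value"]
    return state


# Replay a shortened version of the patch stream shown above.
state = apply_patch({}, [
    {"op": "replace", "path": "", "value": {"final_output": None, "logs": {}, "streamed_output": []}},
    {"op": "add", "path": "/streamed_output/-", "value": "H"},
    {"op": "add", "path": "/streamed_output/-", "value": "arrison"},
    {"op": "replace", "path": "/final_output", "value": {"output": "Harrison"}},
])
# state == {'final_output': {'output': 'Harrison'}, 'logs': {},
#           'streamed_output': ['H', 'arrison']}
```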
Copyright © 2023 LangChain, Inc.