langchain_google_genai 0.0.4¶
Source: https://api.python.langchain.com/en/latest/google_genai_api_reference.html
langchain_google_genai.chat_models¶
Classes¶
chat_models.ChatGoogleGenerativeAI
Google Generative AI Chat models API.
chat_models.ChatGoogleGenerativeAIError
Custom exception class for errors associated with the Google GenAI API.
Functions¶
langchain_google_genai.embeddings¶
Classes¶
embeddings.GoogleGenerativeAIEmbeddings
Google Generative AI Embeddings.
langchain_google_genai.llms¶
Classes¶
llms.GoogleGenerativeAI
Google GenerativeAI models.
Functions¶
langchain_experimental 0.0.47¶
Source: https://api.python.langchain.com/en/latest/experimental_api_reference.html
langchain_experimental.agents¶
Functions¶
agents.agent_toolkits.csv.base.create_csv_agent(...)
Create a CSV agent by loading the file into a dataframe and using the pandas agent.
agents.agent_toolkits.pandas.base.create_pandas_dataframe_agent(llm, df)
Construct a pandas agent from an LLM and dataframe.
agents.agent_toolkits.python.base.create_python_agent(...)
Construct a python agent from an LLM and tool.
agents.agent_toolkits.spark.base.create_spark_dataframe_agent(llm, df)
Construct a Spark agent from an LLM and dataframe.
agents.agent_toolkits.xorbits.base.create_xorbits_agent(...)
Construct a xorbits agent from an LLM and dataframe.
langchain_experimental.autonomous_agents¶
Classes¶
autonomous_agents.autogpt.agent.AutoGPT(...)
Agent class for interacting with Auto-GPT.
autonomous_agents.autogpt.memory.AutoGPTMemory
Memory for AutoGPT.
autonomous_agents.autogpt.output_parser.AutoGPTAction(...)
Action returned by AutoGPTOutputParser.
autonomous_agents.autogpt.output_parser.AutoGPTOutputParser
Output parser for AutoGPT.
autonomous_agents.autogpt.output_parser.BaseAutoGPTOutputParser
Base Output parser for AutoGPT.
autonomous_agents.autogpt.prompt.AutoGPTPrompt
Prompt for AutoGPT.
autonomous_agents.autogpt.prompt_generator.PromptGenerator()
A class for generating custom prompt strings.
autonomous_agents.baby_agi.baby_agi.BabyAGI
Controller model for the BabyAGI agent.
autonomous_agents.baby_agi.task_creation.TaskCreationChain
Chain generating tasks.
autonomous_agents.baby_agi.task_execution.TaskExecutionChain
Chain to execute tasks.
autonomous_agents.baby_agi.task_prioritization.TaskPrioritizationChain
Chain to prioritize tasks.
autonomous_agents.hugginggpt.hugginggpt.HuggingGPT(...)
autonomous_agents.hugginggpt.repsonse_generator.ResponseGenerationChain
Chain to execute tasks.
autonomous_agents.hugginggpt.repsonse_generator.ResponseGenerator(...)
autonomous_agents.hugginggpt.task_executor.Task(...)
autonomous_agents.hugginggpt.task_executor.TaskExecutor(plan)
Load tools to execute tasks.
autonomous_agents.hugginggpt.task_planner.BasePlanner
Create a new model by parsing and validating input data from keyword arguments.
autonomous_agents.hugginggpt.task_planner.Plan(steps)
autonomous_agents.hugginggpt.task_planner.PlanningOutputParser
Create a new model by parsing and validating input data from keyword arguments.
autonomous_agents.hugginggpt.task_planner.Step(...)
autonomous_agents.hugginggpt.task_planner.TaskPlaningChain
Chain to execute tasks.
autonomous_agents.hugginggpt.task_planner.TaskPlanner
Create a new model by parsing and validating input data from keyword arguments.
Functions¶
autonomous_agents.autogpt.output_parser.preprocess_json_input(...)
Preprocesses a string to be parsed as json.
autonomous_agents.autogpt.prompt_generator.get_prompt(tools)
Generates a prompt string.
autonomous_agents.hugginggpt.repsonse_generator.load_response_generator(llm)
autonomous_agents.hugginggpt.task_planner.load_chat_planner(llm)
langchain_experimental.chat_models¶
Chat Models are a variation on language models.
While Chat Models use language models under the hood, the interface they expose
is a bit different. Rather than expose a “text in, text out” API, they expose
an interface where “chat messages” are the inputs and outputs.
Class hierarchy:
BaseLanguageModel --> BaseChatModel --> <name> # Examples: ChatOpenAI, ChatGooglePalm
Main helpers:
AIMessage, BaseMessage, HumanMessage
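A minimal usage sketch for the llm_wrapper classes listed below, which wrap a plain completion LLM so it can be driven with chat messages. The choice of LlamaCpp (requires llama-cpp-python) and the model path are assumptions for illustration only.
from langchain.llms import LlamaCpp
from langchain.schema.messages import HumanMessage, SystemMessage
from langchain_experimental.chat_models import Llama2Chat

# Assumed: a local Llama-2 chat model file; any BaseLanguageModel could be used here.
llm = LlamaCpp(model_path="path/to/llama-2-7b-chat.gguf")
chat_model = Llama2Chat(llm=llm)  # applies the Llama-2 chat prompt format around the raw LLM

messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is the capital of France?"),
]
print(chat_model(messages).content)  # returns an AIMessage; .content is the reply text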
Classes¶
chat_models.llm_wrapper.ChatWrapper
Create a new model by parsing and validating input data from keyword arguments.
chat_models.llm_wrapper.Llama2Chat
Create a new model by parsing and validating input data from keyword arguments.
chat_models.llm_wrapper.Orca
Create a new model by parsing and validating input data from keyword arguments.
chat_models.llm_wrapper.Vicuna
Create a new model by parsing and validating input data from keyword arguments.
langchain_experimental.comprehend_moderation¶
Classes¶
comprehend_moderation.amazon_comprehend_moderation.AmazonComprehendModerationChain
A subclass of Chain, designed to apply moderation to LLMs.
comprehend_moderation.base_moderation.BaseModeration(client)
comprehend_moderation.base_moderation_callbacks.BaseModerationCallbackHandler()
comprehend_moderation.base_moderation_config.BaseModerationConfig
Create a new model by parsing and validating input data from keyword arguments.
comprehend_moderation.base_moderation_config.ModerationPiiConfig
Create a new model by parsing and validating input data from keyword arguments.
comprehend_moderation.base_moderation_config.ModerationPromptSafetyConfig
Create a new model by parsing and validating input data from keyword arguments.
comprehend_moderation.base_moderation_config.ModerationToxicityConfig
Create a new model by parsing and validating input data from keyword arguments.
comprehend_moderation.base_moderation_exceptions.ModerationPiiError([...])
Exception raised if PII entities are detected.
comprehend_moderation.base_moderation_exceptions.ModerationPromptSafetyError([...])
Exception raised if Intention entities are detected.
comprehend_moderation.base_moderation_exceptions.ModerationToxicityError([...])
Exception raised if Toxic entities are detected.
comprehend_moderation.pii.ComprehendPII(client)
comprehend_moderation.prompt_safety.ComprehendPromptSafety(client)
comprehend_moderation.toxicity.ComprehendToxicity(client)
langchain_experimental.cpal¶
Classes¶
cpal.base.CPALChain
Causal program-aided language (CPAL) chain implementation.
cpal.base.CausalChain
Translate the causal narrative into a stack of operations.
cpal.base.InterventionChain
Set the hypothetical conditions for the causal model.
cpal.base.NarrativeChain
Decompose the narrative into its story elements
cpal.base.QueryChain
Query the outcome table using SQL.
cpal.constants.Constant(value[, names, ...])
Enum for constants used in the CPAL.
cpal.models.CausalModel
Create a new model by parsing and validating input data from keyword arguments.
cpal.models.EntityModel
Create a new model by parsing and validating input data from keyword arguments.
cpal.models.EntitySettingModel
Initial conditions for an entity
cpal.models.InterventionModel
aka initial conditions
cpal.models.NarrativeModel
Represent the narrative input as three story elements.
cpal.models.QueryModel
Translate a question about the story outcome into a programmatic expression.
cpal.models.ResultModel
Create a new model by parsing and validating input data from keyword arguments.
cpal.models.StoryModel
Create a new model by parsing and validating input data from keyword arguments.
cpal.models.SystemSettingModel
Initial global conditions for the system.
langchain_experimental.data_anonymizer¶
Data anonymizer package
Classes¶
data_anonymizer.base.AnonymizerBase()
Base abstract class for anonymizers. It is public and non-virtual because it allows wrapping the behavior for all methods in a base class.
data_anonymizer.base.ReversibleAnonymizerBase()
Base abstract class for reversible anonymizers.
data_anonymizer.deanonymizer_mapping.DeanonymizerMapping(...)
data_anonymizer.presidio.PresidioAnonymizer([...])
param analyzed_fields: List of fields to detect and then anonymize.
data_anonymizer.presidio.PresidioAnonymizerBase([...])
param analyzed_fields: List of fields to detect and then anonymize.
data_anonymizer.presidio.PresidioReversibleAnonymizer([...])
param analyzed_fields: List of fields to detect and then anonymize.
Functions¶
data_anonymizer.deanonymizer_mapping.create_anonymizer_mapping(...)
Creates or updates the mapping used to anonymize and/or deanonymize text.
data_anonymizer.deanonymizer_mapping.format_duplicated_operator(...)
Format the operator name with the count
data_anonymizer.deanonymizer_matching_strategies.case_insensitive_matching_strategy(...)
Case insensitive matching strategy for deanonymization. It replaces all the anonymized entities with the original ones irrespective of their letter case.
data_anonymizer.deanonymizer_matching_strategies.combined_exact_fuzzy_matching_strategy(...)
RECOMMENDED STRATEGY.
data_anonymizer.deanonymizer_matching_strategies.exact_matching_strategy(...)
Exact matching strategy for deanonymization.
data_anonymizer.deanonymizer_matching_strategies.fuzzy_matching_strategy(...)
Fuzzy matching strategy for deanonymization.
data_anonymizer.deanonymizer_matching_strategies.ngram_fuzzy_matching_strategy(...)
N-gram fuzzy matching strategy for deanonymization.
data_anonymizer.faker_presidio_mapping.get_pseudoanonymizer_mapping([seed])
langchain_experimental.fallacy_removal¶
The chain runs a self-review of logical fallacies, as categorized and defined in https://arxiv.org/pdf/2212.07425.pdf. It is modeled after Constitutional AI and uses the same format, but applies logical fallacies as generalized rules to remove from the output.
Classes¶
fallacy_removal.base.FallacyChain
Chain for applying logical fallacy evaluations, modeled after Constitutional AI and in the same format, but applying logical fallacies as generalized rules to remove from the output.
fallacy_removal.models.LogicalFallacy
Class for a logical fallacy.
langchain_experimental.generative_agents¶
Generative Agents primitives.
Classes¶
generative_agents.generative_agent.GenerativeAgent
An Agent as a character with memory and innate characteristics.
generative_agents.memory.GenerativeAgentMemory
Memory for the generative agent.
langchain_experimental.graph_transformers¶
Classes¶
graph_transformers.diffbot.DiffbotGraphTransformer([...])
Transforms documents into graph documents using Diffbot's NLP API.
graph_transformers.diffbot.NodesList()
Manages a list of nodes with associated properties.
graph_transformers.diffbot.SimplifiedSchema()
Provides functionality for working with a simplified schema mapping.
Functions¶
graph_transformers.diffbot.format_property_key(s)
langchain_experimental.llm_bash¶
Chain that interprets a prompt and executes bash code to perform bash operations.
Classes¶
llm_bash.base.LLMBashChain
Chain that interprets a prompt and executes bash operations.
llm_bash.bash.BashProcess([strip_newlines, ...])
Wrapper class for starting subprocesses.
llm_bash.prompt.BashOutputParser
Parser for bash output.
langchain_experimental.llm_symbolic_math¶
Chain that interprets a prompt and executes python code to do math.
Heavily borrowed from llm_math, wrapper for SymPy
Classes¶
llm_symbolic_math.base.LLMSymbolicMathChain
Chain that interprets a prompt and executes python code to do symbolic math.
langchain_experimental.llms¶
Experimental LLM wrappers.
Classes¶
llms.anthropic_functions.AnthropicFunctions
Create a new model by parsing and validating input data from keyword arguments.
llms.anthropic_functions.TagParser()
A heavy-handed solution, but it's fast for prototyping.
llms.jsonformer_decoder.JsonFormer
Jsonformer wrapped LLM using HuggingFace Pipeline API.
llms.llamaapi.ChatLlamaAPI
Create a new model by parsing and validating input data from keyword arguments.
llms.lmformatenforcer_decoder.LMFormatEnforcer
LMFormatEnforcer wrapped LLM using HuggingFace Pipeline API.
llms.ollama_functions.OllamaFunctions
Create a new model by parsing and validating input data from keyword arguments.
llms.rellm_decoder.RELLM
RELLM wrapped LLM using HuggingFace Pipeline API.
Functions¶
llms.jsonformer_decoder.import_jsonformer()
Lazily import jsonformer.
llms.lmformatenforcer_decoder.import_lmformatenforcer()
Lazily import lmformatenforcer.
llms.rellm_decoder.import_rellm()
Lazily import rellm.
langchain_experimental.open_clip¶
Classes¶
open_clip.open_clip.OpenCLIPEmbeddings
Create a new model by parsing and validating input data from keyword arguments.
langchain_experimental.pal_chain¶
Implements Program-Aided Language Models.
As in https://arxiv.org/pdf/2211.10435.pdf.
This is vulnerable to arbitrary code execution:
https://github.com/langchain-ai/langchain/issues/5872
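A minimal sketch of how PALChain is typically constructed, assuming an OpenAI API key is available; the word problem is illustrative. As the note above warns, the generated Python is executed locally.
from langchain.llms import OpenAI
from langchain_experimental.pal_chain import PALChain

llm = OpenAI(temperature=0)  # assumes OPENAI_API_KEY is set in the environment
pal_chain = PALChain.from_math_prompt(llm=llm, verbose=True)  # LLM writes Python, chain runs it

question = (
    "Jan has three times the number of pets as Marcia. Marcia has two more pets "
    "than Cindy. If Cindy has four pets, how many total pets do the three have?"
)
print(pal_chain.run(question))  # note: the generated code is executed on this machine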
Classes¶
pal_chain.base.PALChain
Implements Program-Aided Language Models (PAL).
pal_chain.base.PALValidation([...])
Initialize a PALValidation instance.
langchain_experimental.plan_and_execute¶
Classes¶
plan_and_execute.agent_executor.PlanAndExecute
Plan and execute a chain of steps.
plan_and_execute.executors.base.BaseExecutor
Base executor.
plan_and_execute.executors.base.ChainExecutor
Chain executor.
plan_and_execute.planners.base.BasePlanner
Base planner.
plan_and_execute.planners.base.LLMPlanner
LLM planner.
plan_and_execute.planners.chat_planner.PlanningOutputParser
Planning output parser.
plan_and_execute.schema.BaseStepContainer
Base step container.
plan_and_execute.schema.ListStepContainer
List step container.
plan_and_execute.schema.Plan
Plan.
plan_and_execute.schema.PlanOutputParser
Plan output parser.
plan_and_execute.schema.Step
Step.
plan_and_execute.schema.StepResponse
Step response.
Functions¶
plan_and_execute.executors.agent_executor.load_agent_executor(...)
Load an agent executor.
plan_and_execute.planners.chat_planner.load_chat_planner(llm)
Load a chat planner.
langchain_experimental.prompt_injection_identifier¶
HuggingFace Security toolkit.
Classes¶
prompt_injection_identifier.hugging_face_identifier.HuggingFaceInjectionIdentifier
Tool that uses HF model to detect prompt injection attacks.
Functions¶
langchain_experimental.prompts¶
Functions¶
prompts.load.load_prompt(path)
Unified method for loading a prompt from LangChainHub or local fs.
langchain_experimental.retrievers¶
Classes¶
retrievers.vector_sql_database.VectorSQLDatabaseChainRetriever
Retriever that uses a SQLDatabase as the retriever.
langchain_experimental.rl_chain¶
Classes¶
rl_chain.base.AutoSelectionScorer
Create a new model by parsing and validating input data from keyword arguments.
rl_chain.base.Embedder(*args, **kwargs)
rl_chain.base.Event(inputs[, selected])
rl_chain.base.Policy(**kwargs)
rl_chain.base.RLChain
The RLChain class leverages the Vowpal Wabbit (VW) model as a learned policy for reinforcement learning.
rl_chain.base.Selected()
rl_chain.base.SelectionScorer
Abstract class for grading the chosen selection or the response of the LLM.
rl_chain.base.VwPolicy(model_repo, vw_cmd, ...)
rl_chain.metrics.MetricsTrackerAverage(step)
rl_chain.metrics.MetricsTrackerRollingWindow(...)
rl_chain.model_repository.ModelRepository(folder)
rl_chain.pick_best_chain.PickBest
PickBest is a class designed to leverage the Vowpal Wabbit (VW) model for reinforcement learning with a context, with the goal of modifying the prompt before the LLM call.
rl_chain.pick_best_chain.PickBestEvent(...)
rl_chain.pick_best_chain.PickBestFeatureEmbedder(...)
Text Embedder class that embeds the BasedOn and ToSelectFrom inputs into a format that can be used by the learning policy
rl_chain.pick_best_chain.PickBestRandomPolicy(...)
rl_chain.pick_best_chain.PickBestSelected([...])
rl_chain.vw_logger.VwLogger(path)
Functions¶
rl_chain.base.BasedOn(anything)
rl_chain.base.Embed(anything[, keep])
rl_chain.base.EmbedAndKeep(anything)
rl_chain.base.ToSelectFrom(anything)
rl_chain.base.embed(to_embed, model[, namespace])
Embeds the actions or context using the SentenceTransformer model (or a model that has an encode function)
rl_chain.base.embed_dict_type(item, model)
Helper function to embed a dictionary item.
rl_chain.base.embed_list_type(item, model[, ...])
rl_chain.base.embed_string_type(item, model)
Helper function to embed a string or an _Embed object.
rl_chain.base.get_based_on_and_to_select_from(inputs)
rl_chain.base.is_stringtype_instance(item)
Helper function to check if an item is a string.
rl_chain.base.parse_lines(parser, input_str)
rl_chain.base.prepare_inputs_for_autoembed(inputs)
Go over all the inputs; if a value is wrapped in _ToSelectFrom or _BasedOn and its inner value is not already _Embed, wrap it in EmbedAndKeep while retaining its _ToSelectFrom or _BasedOn status.
rl_chain.base.stringify_embedding(embedding)
langchain_experimental.smart_llm¶
Generalized implementation of SmartGPT (origin: https://youtu.be/wVzuvf9D9BU)
Classes¶
smart_llm.base.SmartLLMChain
Generalized implementation of SmartGPT (origin: https://youtu.be/wVzuvf9D9BU)
langchain_experimental.sql¶
Chain for interacting with SQL Database.
Classes¶
sql.base.SQLDatabaseChain
Chain for interacting with SQL Database.
sql.base.SQLDatabaseSequentialChain
Chain for querying SQL database that is a sequential chain.
sql.vector_sql.VectorSQLDatabaseChain
Chain for interacting with Vector SQL Database.
sql.vector_sql.VectorSQLOutputParser
Output Parser for Vector SQL 1.
sql.vector_sql.VectorSQLRetrieveAllOutputParser
Based on VectorSQLOutputParser; it also modifies the SQL to retrieve all columns.
Functions¶
sql.vector_sql.get_result_from_sqldb(db, cmd)
langchain_experimental.tabular_synthetic_data¶
Classes¶
tabular_synthetic_data.base.SyntheticDataGenerator
Generates synthetic data using the given LLM and few-shot template.
Functions¶
tabular_synthetic_data.openai.create_openai_data_generator(...)
Create an instance of SyntheticDataGenerator tailored for OpenAI models.
langchain_experimental.tools¶
Classes¶
tools.python.tool.PythonAstREPLTool
A tool for running python code in a REPL.
tools.python.tool.PythonInputs
Create a new model by parsing and validating input data from keyword arguments.
tools.python.tool.PythonREPLTool
A tool for running python code in a REPL.
Functions¶
tools.python.tool.sanitize_input(query)
Sanitize input to the python REPL.
langchain_experimental.tot¶
Classes¶
tot.base.ToTChain
A Chain implementing the Tree of Thought (ToT).
tot.checker.ToTChecker
Tree of Thought (ToT) checker.
tot.controller.ToTController([c])
Tree of Thought (ToT) controller.
tot.memory.ToTDFSMemory([stack])
Memory for the Tree of Thought (ToT) chain.
tot.prompts.CheckerOutputParser
tot.prompts.JSONListOutputParser
Class to parse the output of a PROPOSE_PROMPT response.
tot.thought.Thought
Create a new model by parsing and validating input data from keyword arguments.
tot.thought.ThoughtValidity(value[, names, ...])
tot.thought_generation.BaseThoughtGenerationStrategy
Base class for a thought generation strategy.
tot.thought_generation.ProposePromptStrategy
Propose thoughts sequentially using a "propose prompt".
tot.thought_generation.SampleCoTStrategy
Sample thoughts from a Chain-of-Thought (CoT) prompt.
Functions¶
tot.prompts.get_cot_prompt()
tot.prompts.get_propose_prompt()
langchain_experimental.utilities¶
Classes¶
utilities.python.PythonREPL
Simulates a standalone Python REPL.
langchain 0.0.350¶
Source: https://api.python.langchain.com/en/latest/langchain_api_reference.html
langchain.agents¶
Agent is a class that uses an LLM to choose a sequence of actions to take.
In Chains, a sequence of actions is hardcoded. In Agents,
a language model is used as a reasoning engine to determine which actions
to take and in which order.
Agents select and use Tools and Toolkits for actions.
Class hierarchy:
BaseSingleActionAgent --> LLMSingleActionAgent
OpenAIFunctionsAgent
XMLAgent
Agent --> <name>Agent # Examples: ZeroShotAgent, ChatAgent
BaseMultiActionAgent --> OpenAIMultiFunctionsAgent
Main helpers:
AgentType, AgentExecutor, AgentOutputParser, AgentExecutorIterator,
AgentAction, AgentFinish
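A minimal end-to-end sketch of the agent loop described above, assuming an OpenAI API key; the tool list and question are illustrative.
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)  # assumes OPENAI_API_KEY is set
tools = load_tools(["llm-math"], llm=llm)  # calculator tool backed by the same LLM

# The LLM chooses which tool to call and in what order (ReAct-style reasoning).
agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
)
agent.run("What is 7 raised to the 0.43 power?")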
Classes¶
agents.agent.Agent
Agent that calls the language model and decides which action to take.
agents.agent.AgentExecutor
Agent that uses tools.
agents.agent.AgentOutputParser
Base class for parsing agent output into agent action/finish.
agents.agent.BaseMultiActionAgent
Base Multi Action Agent class.
agents.agent.BaseSingleActionAgent
Base Single Action Agent class.
agents.agent.ExceptionTool
Tool that just returns the query.
agents.agent.LLMSingleActionAgent
Base class for single action agents.
agents.agent.MultiActionAgentOutputParser
Base class for parsing agent output into agent actions/finish.
agents.agent.RunnableAgent
Agent powered by runnables.
agents.agent.RunnableMultiActionAgent
Agent powered by runnables.
agents.agent_iterator.AgentExecutorIterator(...)
Iterator for AgentExecutor.
agents.agent_toolkits.vectorstore.toolkit.VectorStoreInfo
Information about a VectorStore.
agents.agent_toolkits.vectorstore.toolkit.VectorStoreRouterToolkit
Toolkit for routing between Vector Stores.
agents.agent_toolkits.vectorstore.toolkit.VectorStoreToolkit
Toolkit for interacting with a Vector Store.
agents.agent_types.AgentType(value[, names, ...])
An enum for agent types.
agents.chat.base.ChatAgent
Chat Agent.
agents.chat.output_parser.ChatOutputParser
Output parser for the chat agent.
agents.conversational.base.ConversationalAgent
An agent that holds a conversation in addition to using tools.
agents.conversational.output_parser.ConvoOutputParser
Output parser for the conversational agent.
agents.conversational_chat.base.ConversationalChatAgent
An agent designed to hold a conversation in addition to using tools.
agents.conversational_chat.output_parser.ConvoOutputParser
Output parser for the conversational agent.
agents.mrkl.base.ChainConfig(action_name, ...)
Configuration for chain to use in MRKL system.
agents.mrkl.base.MRKLChain
[Deprecated] Chain that implements the MRKL system.
agents.mrkl.base.ZeroShotAgent
Agent for the MRKL chain.
agents.mrkl.output_parser.MRKLOutputParser
MRKL Output parser for the chat agent.
agents.openai_assistant.base.OpenAIAssistantAction
AgentAction with info needed to submit custom tool output to existing run.
agents.openai_assistant.base.OpenAIAssistantFinish
AgentFinish with run and thread metadata.
agents.openai_assistant.base.OpenAIAssistantRunnable
Run an OpenAI Assistant.
agents.openai_functions_agent.agent_token_buffer_memory.AgentTokenBufferMemory
Memory used to save agent output AND intermediate steps.
agents.openai_functions_agent.base.OpenAIFunctionsAgent
An agent driven by OpenAI's function-powered API.
agents.openai_functions_multi_agent.base.OpenAIMultiFunctionsAgent
An agent driven by OpenAI's function-powered API.
agents.output_parsers.json.JSONAgentOutputParser
Parses tool invocations and final answers in JSON format.
agents.output_parsers.openai_functions.OpenAIFunctionsAgentOutputParser
Parses a message into agent action/finish.
agents.output_parsers.openai_tools.OpenAIToolAgentAction
Override init to support instantiation by position for backward compat.
agents.output_parsers.openai_tools.OpenAIToolsAgentOutputParser
Parses a message into agent actions/finish.
agents.output_parsers.react_json_single_input.ReActJsonSingleInputOutputParser
Parses ReAct-style LLM calls that have a single tool input in json format.
agents.output_parsers.react_single_input.ReActSingleInputOutputParser
Parses ReAct-style LLM calls that have a single tool input.
agents.output_parsers.self_ask.SelfAskOutputParser
Parses self-ask style LLM calls.
agents.output_parsers.xml.XMLAgentOutputParser
Parses tool invocations and final answers in XML format.
agents.react.base.DocstoreExplorer(docstore)
Class to assist with exploration of a document store.
agents.react.base.ReActChain
[Deprecated] Chain that implements the ReAct paper.
agents.react.base.ReActDocstoreAgent
Agent for the ReAct chain.
agents.react.base.ReActTextWorldAgent
Agent for the ReAct TextWorld chain.
agents.react.output_parser.ReActOutputParser
Output parser for the ReAct agent.
agents.schema.AgentScratchPadChatPromptTemplate
Chat prompt template for the agent scratchpad.
agents.self_ask_with_search.base.SelfAskWithSearchAgent
Agent for the self-ask-with-search paper.
agents.self_ask_with_search.base.SelfAskWithSearchChain
[Deprecated] Chain that does self-ask with search.
agents.structured_chat.base.StructuredChatAgent
Structured Chat Agent.
agents.structured_chat.output_parser.StructuredChatOutputParser
Output parser for the structured chat agent.
agents.structured_chat.output_parser.StructuredChatOutputParserWithRetries
Output parser with retries for the structured chat agent.
agents.tools.InvalidTool
Tool that is run when an invalid tool name is encountered by the agent.
agents.xml.base.XMLAgent
Agent that uses XML tags.
Functions¶
agents.agent_toolkits.conversational_retrieval.openai_functions.create_conversational_retrieval_agent(...)
A convenience method for creating a conversational retrieval agent.
agents.agent_toolkits.vectorstore.base.create_vectorstore_agent(...)
Construct a VectorStore agent from an LLM and tools.
agents.agent_toolkits.vectorstore.base.create_vectorstore_router_agent(...)
Construct a VectorStore router agent from an LLM and tools.
agents.format_scratchpad.log.format_log_to_str(...)
Construct the scratchpad that lets the agent continue its thought process.
agents.format_scratchpad.log_to_messages.format_log_to_messages(...)
Construct the scratchpad that lets the agent continue its thought process.
agents.format_scratchpad.openai_functions.format_to_openai_function_messages(...)
Convert (AgentAction, tool output) tuples into FunctionMessages.
agents.format_scratchpad.openai_functions.format_to_openai_functions(...)
Convert (AgentAction, tool output) tuples into FunctionMessages.
agents.format_scratchpad.openai_tools.format_to_openai_tool_messages(...)
Convert (AgentAction, tool output) tuples into FunctionMessages.
agents.format_scratchpad.xml.format_xml(...)
Format the intermediate steps as XML.
agents.initialize.initialize_agent(tools, llm)
Load an agent executor given tools and LLM.
agents.load_tools.get_all_tool_names()
Get a list of all possible tool names.
agents.load_tools.load_huggingface_tool(...)
Loads a tool from the HuggingFace Hub.
agents.load_tools.load_tools(tool_names[, ...])
Load tools based on their name.
agents.loading.load_agent(path, **kwargs)
Unified method for loading an agent from LangChainHub or local fs.
agents.loading.load_agent_from_config(config)
Load agent from Config Dict.
agents.output_parsers.openai_tools.parse_ai_message_to_openai_tool_action(message)
Parse an AI message potentially containing tool_calls.
agents.utils.validate_tools_single_input(...)
Validate tools for single input.
langchain.callbacks¶
Callback handlers allow listening to events in LangChain.
Class hierarchy:
BaseCallbackHandler --> <name>CallbackHandler # Example: AimCallbackHandler
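A minimal sketch of attaching one of the handlers listed below to a chain, assuming an OpenAI API key; the log filename is an arbitrary example.
from langchain.callbacks import FileCallbackHandler
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

handler = FileCallbackHandler("chain_run.log")  # chain events are appended to this file
llm = OpenAI(temperature=0)  # assumes OPENAI_API_KEY is set
prompt = PromptTemplate.from_template("1 + {number} = ")

chain = LLMChain(llm=llm, prompt=prompt, callbacks=[handler])
chain.run(number=2)  # the chain's start/end output is written to chain_run.log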
Classes¶
callbacks.file.FileCallbackHandler(filename)
Callback Handler that writes to a file.
callbacks.streaming_aiter.AsyncIteratorCallbackHandler()
Callback handler that returns an async iterator.
callbacks.streaming_aiter_final_only.AsyncFinalIteratorCallbackHandler(*)
Callback handler that returns an async iterator.
callbacks.streaming_stdout_final_only.FinalStreamingStdOutCallbackHandler(*)
Callback handler for streaming in agents.
callbacks.tracers.logging.LoggingCallbackHandler(logger)
Tracer that logs via the input Logger.
langchain.chains¶
Chains are easily reusable components linked together.
Chains encode a sequence of calls to components like models, document retrievers,
other Chains, etc., and provide a simple interface to this sequence.
The Chain interface makes it easy to create apps that are:
Stateful: add Memory to any Chain to give it state,
Observable: pass Callbacks to a Chain to execute additional functionality,
like logging, outside the main sequence of component calls,
Composable: combine Chains with other components, including other Chains.
Class hierarchy:
Chain --> <name>Chain # Examples: LLMChain, MapReduceChain, RouterChain
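A minimal sketch of composing two chains, assuming an OpenAI API key; the prompts are illustrative.
from langchain.chains import LLMChain, SimpleSequentialChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

llm = OpenAI(temperature=0)  # assumes OPENAI_API_KEY is set
synopsis_chain = LLMChain(
    llm=llm,
    prompt=PromptTemplate.from_template("Write a one-sentence synopsis for a play titled {title}."),
)
review_chain = LLMChain(
    llm=llm,
    prompt=PromptTemplate.from_template("Write a two-sentence review of this synopsis: {synopsis}"),
)

# The output of the first chain is fed as the single input of the second.
overall_chain = SimpleSequentialChain(chains=[synopsis_chain, review_chain], verbose=True)
print(overall_chain.run("Tragedy at Sunset on the Beach"))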
Classes¶
chains.api.base.APIChain
Chain that makes API calls and summarizes the responses to answer a question.
chains.api.openapi.chain.OpenAPIEndpointChain
Chain that interacts with an OpenAPI endpoint using natural language.
chains.api.openapi.requests_chain.APIRequesterChain
Get the request parser.
chains.api.openapi.requests_chain.APIRequesterOutputParser
Parse the request and error tags.
chains.api.openapi.response_chain.APIResponderChain
Get the response parser.
chains.api.openapi.response_chain.APIResponderOutputParser
Parse the response and error tags.
chains.base.Chain
Abstract base class for creating structured sequences of calls to components.
chains.combine_documents.base.AnalyzeDocumentChain
Chain that splits documents, then analyzes it in pieces.
chains.combine_documents.base.BaseCombineDocumentsChain
Base interface for chains combining documents.
chains.combine_documents.map_reduce.MapReduceDocumentsChain
Combining documents by mapping a chain over them, then combining results.
chains.combine_documents.map_rerank.MapRerankDocumentsChain
Combining documents by mapping a chain over them, then reranking results.
chains.combine_documents.reduce.AsyncCombineDocsProtocol(...)
Interface for the combine_docs method.
chains.combine_documents.reduce.CombineDocsProtocol(...)
Interface for the combine_docs method.
chains.combine_documents.reduce.ReduceDocumentsChain
Combine documents by recursively reducing them.
chains.combine_documents.refine.RefineDocumentsChain
Combine documents by doing a first pass and then refining on more documents.
chains.combine_documents.stuff.StuffDocumentsChain
Chain that combines documents by stuffing into context.
chains.constitutional_ai.base.ConstitutionalChain
Chain for applying constitutional principles.
chains.constitutional_ai.models.ConstitutionalPrinciple
Class for a constitutional principle.
chains.conversation.base.ConversationChain
Chain to have a conversation and load context from memory.
chains.conversational_retrieval.base.BaseConversationalRetrievalChain
Chain for chatting with an index.
chains.conversational_retrieval.base.ChatVectorDBChain
Chain for chatting with a vector database.
chains.conversational_retrieval.base.ConversationalRetrievalChain
Chain for having a conversation based on retrieved documents.
chains.conversational_retrieval.base.InputType
Create a new model by parsing and validating input data from keyword arguments.
chains.elasticsearch_database.base.ElasticsearchDatabaseChain
Chain for interacting with Elasticsearch Database.
chains.flare.base.FlareChain
Chain that combines a retriever, a question generator, and a response generator.
chains.flare.base.QuestionGeneratorChain
Chain that generates questions from uncertain spans.
chains.flare.prompts.FinishedOutputParser
Output parser that checks if the output is finished.
chains.graph_qa.arangodb.ArangoGraphQAChain
Chain for question-answering against a graph by generating AQL statements.
chains.graph_qa.base.GraphQAChain
Chain for question-answering against a graph.
chains.graph_qa.cypher.GraphCypherQAChain
Chain for question-answering against a graph by generating Cypher statements.
chains.graph_qa.cypher_utils.CypherQueryCorrector(schemas)
Used to correct relationship direction in generated Cypher statements.
chains.graph_qa.cypher_utils.Schema(...)
Create new instance of Schema(left_node, relation, right_node)
chains.graph_qa.falkordb.FalkorDBQAChain
Chain for question-answering against a graph by generating Cypher statements.
chains.graph_qa.hugegraph.HugeGraphQAChain
Chain for question-answering against a graph by generating gremlin statements.
chains.graph_qa.kuzu.KuzuQAChain
Question-answering against a graph by generating Cypher statements for Kùzu.
chains.graph_qa.nebulagraph.NebulaGraphQAChain
Chain for question-answering against a graph by generating nGQL statements.
chains.graph_qa.neptune_cypher.NeptuneOpenCypherQAChain
Chain for question-answering against a Neptune graph by generating openCypher statements.
chains.graph_qa.sparql.GraphSparqlQAChain
Question-answering against an RDF or OWL graph by generating SPARQL statements.
chains.hyde.base.HypotheticalDocumentEmbedder
Generate hypothetical document for query, and then embed that.
chains.llm.LLMChain
Chain to run queries against LLMs.
chains.llm_checker.base.LLMCheckerChain
Chain for question-answering with self-verification.
chains.llm_math.base.LLMMathChain
Chain that interprets a prompt and executes python code to do math.
chains.llm_requests.LLMRequestsChain
Chain that requests a URL and then uses an LLM to parse results.
chains.llm_summarization_checker.base.LLMSummarizationCheckerChain
Chain for question-answering with self-verification.
chains.mapreduce.MapReduceChain
Map-reduce chain.
chains.moderation.OpenAIModerationChain
Pass input through a moderation endpoint.
chains.natbot.base.NatBotChain
Implement an LLM driven browser.
chains.natbot.crawler.Crawler()
A crawler for web pages.
chains.natbot.crawler.ElementInViewPort
A typed dictionary containing information about elements in the viewport.
chains.openai_functions.citation_fuzzy_match.FactWithEvidence
Class representing a single statement.
chains.openai_functions.citation_fuzzy_match.QuestionAnswer
A question and its answer as a list of facts; each one should have a source.
chains.openai_functions.openapi.SimpleRequestChain
Chain for making a simple request to an API endpoint.
chains.openai_functions.qa_with_structure.AnswerWithSources
An answer to the question, with sources.
chains.prompt_selector.BasePromptSelector
Base class for prompt selectors.
chains.prompt_selector.ConditionalPromptSelector
Prompt collection that goes through conditionals.
chains.qa_generation.base.QAGenerationChain
Base class for question-answer generation chains.
chains.qa_with_sources.base.BaseQAWithSourcesChain
Question answering chain with sources over documents.
chains.qa_with_sources.base.QAWithSourcesChain
Question answering with sources over documents.
chains.qa_with_sources.loading.LoadingCallable(...)
Interface for loading the combine documents chain.
chains.qa_with_sources.retrieval.RetrievalQAWithSourcesChain
Question-answering with sources over an index.
chains.qa_with_sources.vector_db.VectorDBQAWithSourcesChain
Question-answering with sources over a vector database.
chains.query_constructor.base.StructuredQueryOutputParser
Output parser that parses a structured query.
chains.query_constructor.ir.Comparator(value)
Enumerator of the comparison operators.
chains.query_constructor.ir.Comparison
A comparison to a value.
chains.query_constructor.ir.Expr
Base class for all expressions.
chains.query_constructor.ir.FilterDirective
A filtering expression.
chains.query_constructor.ir.Operation
A logical operation over other directives.
chains.query_constructor.ir.Operator(value)
Enumerator of the operations.
chains.query_constructor.ir.StructuredQuery
A structured query.
chains.query_constructor.ir.Visitor()
Defines interface for IR translation using visitor pattern.
chains.query_constructor.parser.ISO8601Date
A date in ISO 8601 format (YYYY-MM-DD).
chains.query_constructor.schema.AttributeInfo
Information about a data source attribute.
chains.retrieval_qa.base.BaseRetrievalQA
Base class for question-answering chains.
chains.retrieval_qa.base.RetrievalQA
Chain for question-answering against an index.
chains.retrieval_qa.base.VectorDBQA
Chain for question-answering against a vector database.
chains.router.base.MultiRouteChain
Use a single chain to route an input to one of multiple candidate chains.
chains.router.base.Route(destination, ...)
Create new instance of Route(destination, next_inputs)
chains.router.base.RouterChain
Chain that outputs the name of a destination chain and the inputs to it.
chains.router.embedding_router.EmbeddingRouterChain
Chain that uses embeddings to route between options.
chains.router.llm_router.LLMRouterChain
A router chain that uses an LLM chain to perform routing.
chains.router.llm_router.RouterOutputParser
Parser for output of router chain in the multi-prompt chain.
chains.router.multi_prompt.MultiPromptChain
A multi-route chain that uses an LLM router chain to choose amongst prompts.
chains.router.multi_retrieval_qa.MultiRetrievalQAChain
A multi-route chain that uses an LLM router chain to choose amongst retrieval qa chains.
chains.sequential.SequentialChain
Chain where the outputs of one chain feed directly into next.
chains.sequential.SimpleSequentialChain
Simple chain where the outputs of one step feed directly into next.
chains.sql_database.query.SQLInput
Input for a SQL Chain.
chains.sql_database.query.SQLInputWithTables
Input for a SQL Chain.
chains.transform.TransformChain
Chain that transforms the chain output.
Functions¶
chains.combine_documents.reduce.acollapse_docs(...)
Execute a collapse function on a set of documents and merge their metadatas.
chains.combine_documents.reduce.collapse_docs(...)
Execute a collapse function on a set of documents and merge their metadatas.
chains.combine_documents.reduce.split_list_of_docs(...)
Split Documents into subsets that each meet a cumulative length constraint.
chains.ernie_functions.base.convert_python_function_to_ernie_function(...)
Convert a Python function to an Ernie function-calling API compatible dict.
chains.ernie_functions.base.convert_to_ernie_function(...)
Convert a raw function/class to an Ernie function.
chains.ernie_functions.base.create_ernie_fn_chain(...)
[Legacy] Create an LLM chain that uses Ernie functions.
chains.ernie_functions.base.create_ernie_fn_runnable(...)
Create a runnable sequence that uses Ernie functions.
chains.ernie_functions.base.create_structured_output_chain(...)
[Legacy] Create an LLMChain that uses an Ernie function to get a structured output.
chains.ernie_functions.base.create_structured_output_runnable(...)
Create a runnable that uses an Ernie function to get a structured output.
chains.ernie_functions.base.get_ernie_output_parser(...)
Get the appropriate function output parser given the user functions.
chains.example_generator.generate_example(...)
Return another example given a list of examples for a prompt.
chains.graph_qa.cypher.construct_schema(...)
Filter the schema based on included or excluded types
chains.graph_qa.cypher.extract_cypher(text)
Extract Cypher code from a text.
chains.graph_qa.falkordb.extract_cypher(text)
Extract Cypher code from a text.
chains.graph_qa.neptune_cypher.extract_cypher(text)
Extract Cypher code from text using Regex.
chains.graph_qa.neptune_cypher.trim_query(query)
Trim the query to only include Cypher keywords.
chains.graph_qa.neptune_cypher.use_simple_prompt(llm)
Decides whether to use the simple prompt
chains.loading.load_chain(path, **kwargs)
Unified method for loading a chain from LangChainHub or local fs.
chains.loading.load_chain_from_config(...)
Load chain from Config Dict.
chains.openai_functions.base.convert_python_function_to_openai_function(...)
Convert a Python function to an OpenAI function-calling API compatible dict.
chains.openai_functions.base.convert_to_openai_function(...)
Convert a raw function/class to an OpenAI function.
chains.openai_functions.base.create_openai_fn_chain(...)
[Legacy] Create an LLM chain that uses OpenAI functions.
chains.openai_functions.base.create_openai_fn_runnable(...)
Create a runnable sequence that uses OpenAI functions.
chains.openai_functions.base.create_structured_output_chain(...)
[Legacy] Create an LLMChain that uses an OpenAI function to get a structured output.
chains.openai_functions.base.create_structured_output_runnable(...)
Create a runnable that uses an OpenAI function to get a structured output.
chains.openai_functions.base.get_openai_output_parser(...)
Get the appropriate function output parser given the user functions.
chains.openai_functions.citation_fuzzy_match.create_citation_fuzzy_match_chain(llm)
Create a citation fuzzy match chain.
chains.openai_functions.extraction.create_extraction_chain(...)
Creates a chain that extracts information from a passage.
chains.openai_functions.extraction.create_extraction_chain_pydantic(...)
Creates a chain that extracts information from a passage using pydantic schema.
chains.openai_functions.openapi.get_openapi_chain(spec)
Create a chain for querying an API from a OpenAPI spec.
chains.openai_functions.openapi.openapi_spec_to_openai_fn(spec)
Convert a valid OpenAPI spec to the JSON Schema format expected for OpenAI functions.
chains.openai_functions.qa_with_structure.create_qa_with_sources_chain(llm)
Create a question answering chain that returns an answer with sources.
chains.openai_functions.qa_with_structure.create_qa_with_structure_chain(...)
Create a question answering chain that returns an answer with sources based on schema.
chains.openai_functions.tagging.create_tagging_chain(...)
Creates a chain that extracts information from a passage
chains.openai_functions.tagging.create_tagging_chain_pydantic(...)
Creates a chain that extracts information from a passage
chains.openai_functions.utils.get_llm_kwargs(...)
Returns the kwargs for the LLMChain constructor.
chains.openai_tools.extraction.create_extraction_chain_pydantic(...)
chains.prompt_selector.is_chat_model(llm)
Check if the language model is a chat model.
chains.prompt_selector.is_llm(llm)
Check if the language model is a LLM.
chains.qa_with_sources.loading.load_qa_with_sources_chain(llm)
Load a question answering with sources chain.
chains.query_constructor.base.construct_examples(...)
Construct examples from input-output pairs.
chains.query_constructor.base.fix_filter_directive(...)
Fix invalid filter directive.
chains.query_constructor.base.get_query_constructor_prompt(...)
Create query construction prompt.
chains.query_constructor.base.load_query_constructor_chain(...)
Load a query constructor chain.
chains.query_constructor.base.load_query_constructor_runnable(...)
Load a query constructor runnable chain.
chains.query_constructor.parser.get_parser([...])
Returns a parser for the query language.
chains.query_constructor.parser.v_args(...)
Dummy decorator for when lark is not installed.
chains.sql_database.query.create_sql_query_chain(llm, db)
Create a chain that generates SQL queries.
langchain.embeddings¶
Embedding models are wrappers around embedding models
from different APIs and services.
Embedding models can be LLMs or not.
Class hierarchy:
Embeddings --> <name>Embeddings # Examples: OpenAIEmbeddings, HuggingFaceEmbeddings
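A minimal sketch of wrapping an embedding model with the cache listed below, assuming an OpenAI API key; the cache directory is an arbitrary example.
from langchain.embeddings import CacheBackedEmbeddings, OpenAIEmbeddings
from langchain.storage import LocalFileStore

underlying = OpenAIEmbeddings()  # assumes OPENAI_API_KEY is set
store = LocalFileStore("./embedding_cache/")  # embedding vectors are persisted here on disk

cached_embedder = CacheBackedEmbeddings.from_bytes_store(
    underlying, store, namespace=underlying.model  # namespace avoids cross-model cache collisions
)
vectors = cached_embedder.embed_documents(["hello", "world"])
vectors_again = cached_embedder.embed_documents(["hello", "world"])  # served from the cache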
Classes¶
embeddings.cache.CacheBackedEmbeddings(...)
Interface for caching results from embedding models.
Functions¶
langchain.evaluation¶
Evaluation chains for grading LLM and Chain outputs.
This module contains off-the-shelf evaluation chains for grading the output of
LangChain primitives such as language models and chains.
Loading an evaluator
To load an evaluator, you can use the load_evaluators or
load_evaluator functions with the
names of the evaluators to load.
from langchain.evaluation import load_evaluator
evaluator = load_evaluator("qa")
evaluator.evaluate_strings(
prediction="We sold more than 40,000 units last week",
input="How many units did we sell last week?",
reference="We sold 32,378 units",
)
The evaluator must be one of EvaluatorType.
Datasets
To load one of the LangChain HuggingFace datasets, you can use the load_dataset function with the
name of the dataset to load.
from langchain.evaluation import load_dataset
ds = load_dataset("llm-math")
Some common use cases for evaluation include:
Grading the accuracy of a response against ground truth answers: QAEvalChain
Comparing the output of two models: PairwiseStringEvalChain or LabeledPairwiseStringEvalChain when there is additionally a reference label.
Judging the efficacy of an agent’s tool usage: TrajectoryEvalChain
Checking whether an output complies with a set of criteria: CriteriaEvalChain or LabeledCriteriaEvalChain when there is additionally a reference label.
Computing semantic difference between a prediction and reference: EmbeddingDistanceEvalChain or between two predictions: PairwiseEmbeddingDistanceEvalChain
Measuring the string distance between a prediction and reference StringDistanceEvalChain or between two predictions PairwiseStringDistanceEvalChain
Low-level API
These evaluators implement one of the following interfaces:
StringEvaluator: Evaluate a prediction string against a reference label and/or input context.
PairwiseStringEvaluator: Evaluate two prediction strings against each other. Useful for scoring preferences, measuring similarity between two chain or llm agents, or comparing outputs on similar inputs.
AgentTrajectoryEvaluator: Evaluate the full sequence of actions taken by an agent.
These interfaces enable easier composability and usage within a higher level evaluation framework.
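A minimal sketch of the pairwise interface, assuming an OpenAI API key for the default evaluation LLM; the predictions and input are illustrative.
from langchain.evaluation import load_evaluator

evaluator = load_evaluator("pairwise_string")  # a PairwiseStringEvalChain under the hood
result = evaluator.evaluate_string_pairs(
    prediction="Paris is the capital of France.",
    prediction_b="The capital of France is Lyon.",
    input="What is the capital of France?",
)
print(result["value"])      # which prediction was preferred
print(result["reasoning"])  # the grader's explanation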
Classes¶
evaluation.agents.trajectory_eval_chain.TrajectoryEval
A named tuple containing the score and reasoning for a trajectory.
evaluation.agents.trajectory_eval_chain.TrajectoryEvalChain
A chain for evaluating ReAct style agents.
evaluation.agents.trajectory_eval_chain.TrajectoryOutputParser
Trajectory output parser.
evaluation.comparison.eval_chain.LabeledPairwiseStringEvalChain
A chain for comparing two outputs, such as the outputs of two models.
evaluation.comparison.eval_chain.PairwiseStringEvalChain
A chain for comparing two outputs, such as the outputs of two models.
evaluation.comparison.eval_chain.PairwiseStringResultOutputParser
A parser for the output of the PairwiseStringEvalChain.
evaluation.criteria.eval_chain.Criteria(value)
A Criteria to evaluate.
evaluation.criteria.eval_chain.CriteriaEvalChain
LLM Chain for evaluating runs against criteria.
evaluation.criteria.eval_chain.CriteriaResultOutputParser
A parser for the output of the CriteriaEvalChain.
evaluation.criteria.eval_chain.LabeledCriteriaEvalChain
Criteria evaluation chain that requires references.
evaluation.embedding_distance.base.EmbeddingDistance(value)
Embedding Distance Metric.
evaluation.embedding_distance.base.EmbeddingDistanceEvalChain
Use embedding distances to score semantic difference between a prediction and reference.
evaluation.embedding_distance.base.PairwiseEmbeddingDistanceEvalChain
Use embedding distances to score semantic difference between two predictions.
evaluation.exact_match.base.ExactMatchStringEvaluator(*)
Compute an exact match between the prediction and the reference.
evaluation.parsing.base.JsonEqualityEvaluator([...])
Evaluates whether the prediction is equal to the reference after parsing.
evaluation.parsing.base.JsonValidityEvaluator(...)
Evaluates whether the prediction is valid JSON.
evaluation.parsing.json_distance.JsonEditDistanceEvaluator([...])
An evaluator that calculates the edit distance between JSON strings.
evaluation.parsing.json_schema.JsonSchemaEvaluator(...)
An evaluator that validates a JSON prediction against a JSON schema reference.
evaluation.qa.eval_chain.ContextQAEvalChain
LLM chain for evaluating QA without ground truth, based on context.
evaluation.qa.eval_chain.CotQAEvalChain
LLM Chain for evaluating QA using chain of thought reasoning.
evaluation.qa.eval_chain.QAEvalChain
LLM Chain for evaluating question answering.
evaluation.qa.generate_chain.QAGenerateChain
LLM Chain for generating examples for question answering.
evaluation.regex_match.base.RegexMatchStringEvaluator(*)
Compute a regex match between the prediction and the reference.
evaluation.schema.AgentTrajectoryEvaluator()
Interface for evaluating agent trajectories.
evaluation.schema.EvaluatorType(value[, ...])
The types of the evaluators.
evaluation.schema.LLMEvalChain
A base class for evaluators that use an LLM.
evaluation.schema.PairwiseStringEvaluator()
Compare the output of two models (or two outputs of the same model).
evaluation.schema.StringEvaluator()
Grade, tag, or otherwise evaluate predictions relative to their inputs and/or reference labels.
evaluation.scoring.eval_chain.LabeledScoreStringEvalChain
A chain for scoring the output of a model on a scale of 1-10.
evaluation.scoring.eval_chain.ScoreStringEvalChain
A chain for scoring the output of a model on a scale of 1-10.
evaluation.scoring.eval_chain.ScoreStringResultOutputParser
A parser for the output of the ScoreStringEvalChain.
evaluation.string_distance.base.PairwiseStringDistanceEvalChain
Compute string edit distances between two predictions.
evaluation.string_distance.base.StringDistance(value)
Distance metric to use.
evaluation.string_distance.base.StringDistanceEvalChain
Compute string distances between the prediction and the reference.
Functions¶
evaluation.comparison.eval_chain.resolve_pairwise_criteria(...)
Resolve the criteria for the pairwise evaluator.
evaluation.criteria.eval_chain.resolve_criteria(...)
Resolve the criteria to evaluate.
evaluation.loading.load_dataset(uri)
Load a dataset from the LangChainDatasets on HuggingFace.
evaluation.loading.load_evaluator(evaluator, *)
Load the requested evaluation chain specified by a string.
evaluation.loading.load_evaluators(evaluators, *)
Load evaluators specified by a list of evaluator types.
evaluation.scoring.eval_chain.resolve_criteria(...)
Resolve the criteria for the pairwise evaluator.
langchain.hub¶
Interface with the LangChain Hub.
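A minimal sketch, assuming the langchainhub client package is installed; the handle "rlm/rag-prompt" is an example public prompt handle.
from langchain import hub

# Pull an object by "owner/repo" handle (optionally pinned with ":commit").
prompt = hub.pull("rlm/rag-prompt")
print(prompt)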
Functions¶
hub.pull(owner_repo_commit, *[, api_url, ...])
Pulls an object from the hub and returns it as a LangChain object.
hub.push(repo_full_name, object, *[, ...])
Pushes an object to the hub and returns the URL it can be viewed at in a browser.
langchain.indexes¶
Code to support various indexing workflows.
Provides code to:
Create knowledge graphs from data.
Support indexing workflows from LangChain data loaders to vectorstores.
For indexing workflows, this code is used to avoid writing duplicated content
into the vectorstore and to avoid over-writing content if it’s unchanged.
Importantly, this keeps on working even if the content being written is derived
via a set of transformations from some source content (e.g., indexing children
documents that were derived from parent documents by chunking.)
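A minimal sketch of the vectorstore indexing path, assuming a local text file and the default backends (OpenAIEmbeddings plus a Chroma vectorstore, which require an OpenAI API key and chromadb); the file name and question are illustrative.
from langchain.document_loaders import TextLoader
from langchain.indexes import VectorstoreIndexCreator

loader = TextLoader("state_of_the_union.txt")  # hypothetical local text file

# Loads, splits, embeds, and stores the documents with the default components.
index = VectorstoreIndexCreator().from_loaders([loader])
print(index.query("What did the speaker say about the economy?"))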
Classes¶
indexes.base.RecordManager(namespace)
An abstract base class representing the interface for a record manager.
indexes.graph.GraphIndexCreator
Functionality to create graph index.
indexes.vectorstore.VectorStoreIndexWrapper
Wrapper around a vectorstore for easy access.
indexes.vectorstore.VectorstoreIndexCreator
Logic for creating indexes.
Functions¶
langchain.memory¶
Memory maintains Chain state, incorporating context from past runs.
Class hierarchy for Memory:
BaseMemory --> BaseChatMemory --> <name>Memory # Examples: ZepMemory, MotorheadMemory
Main helpers:
BaseChatMessageHistory
Chat Message History stores the chat message history in different stores.
Class hierarchy for ChatMessageHistory:
BaseChatMessageHistory --> <name>ChatMessageHistory # Example: ZepChatMessageHistory
Main helpers:
AIMessage, BaseMessage, HumanMessage
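A minimal sketch of giving a chain conversational state with the buffer memory listed below, assuming an OpenAI API key; the messages are illustrative.
from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory

llm = OpenAI(temperature=0)  # assumes OPENAI_API_KEY is set
conversation = ConversationChain(llm=llm, memory=ConversationBufferMemory())

conversation.run("Hi, my name is Ada.")
print(conversation.run("What is my name?"))  # the buffer replays the earlier turn in the prompt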
Classes¶
memory.buffer.ConversationBufferMemory
Buffer for storing conversation memory.
memory.buffer.ConversationStringBufferMemory
Buffer for storing conversation memory.
memory.buffer_window.ConversationBufferWindowMemory
Buffer for storing conversation memory inside a limited size window.
memory.chat_memory.BaseChatMemory
Abstract base class for chat memory.
memory.combined.CombinedMemory
Combining multiple memories' data together.
memory.entity.BaseEntityStore
Abstract base class for Entity store.
memory.entity.ConversationEntityMemory
Entity extractor & summarizer memory.
memory.entity.InMemoryEntityStore
In-memory Entity store.
memory.entity.RedisEntityStore
Redis-backed Entity store.
memory.entity.SQLiteEntityStore
SQLite-backed Entity store
memory.entity.UpstashRedisEntityStore
Upstash Redis backed Entity store.
memory.kg.ConversationKGMemory
Knowledge graph conversation memory.
memory.motorhead_memory.MotorheadMemory
Chat message memory backed by Motorhead service.
memory.readonly.ReadOnlySharedMemory
A memory wrapper that is read-only and cannot be changed.
memory.simple.SimpleMemory
Simple memory for storing context or other information that shouldn't ever change between prompts.
memory.summary.ConversationSummaryMemory
Conversation summarizer to chat memory.
memory.summary.SummarizerMixin
Mixin for summarizer.
memory.summary_buffer.ConversationSummaryBufferMemory
Buffer with summarizer for storing conversation memory.
memory.token_buffer.ConversationTokenBufferMemory
Conversation chat memory with token limit.
memory.vectorstore.VectorStoreRetrieverMemory
VectorStoreRetriever-backed memory.
memory.zep_memory.ZepMemory
Persist your chain history to the Zep MemoryStore.
Functions¶
memory.utils.get_prompt_input_key(inputs, ...)
Get the prompt input key.
langchain.model_laboratory¶
Experiment with different models.
Classes¶
model_laboratory.ModelLaboratory(chains[, names])
Experiment with different models.
langchain.output_parsers¶
OutputParser classes parse the output of an LLM call.
Class hierarchy:
BaseLLMOutputParser --> BaseOutputParser --> <name>OutputParser # ListOutputParser, PydanticOutputParser
Main helpers:
Serializable, Generation, PromptValue
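A short, hedged sketch of the parser interface using PydanticOutputParser (the Joke model and the raw LLM output string are illustrative assumptions):
from langchain.output_parsers import PydanticOutputParser
from langchain_core.pydantic_v1 import BaseModel, Field

class Joke(BaseModel):
    setup: str = Field(description="the question")
    punchline: str = Field(description="the answer")

parser = PydanticOutputParser(pydantic_object=Joke)
print(parser.get_format_instructions())  # schema text to place in the prompt
print(parser.parse('{"setup": "Why did the chicken cross the road?", "punchline": "To get to the other side."}'))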
Classes¶
output_parsers.boolean.BooleanOutputParser
Parse the output of an LLM call to a boolean.
output_parsers.combining.CombiningOutputParser
Combine multiple output parsers into one.
output_parsers.datetime.DatetimeOutputParser
Parse the output of an LLM call to a datetime.
output_parsers.enum.EnumOutputParser
Parse an output that is one of a set of values.
output_parsers.ernie_functions.JsonKeyOutputFunctionsParser
Parse an output as the element of the Json object.
output_parsers.ernie_functions.JsonOutputFunctionsParser
Parse an output as the Json object.
output_parsers.ernie_functions.OutputFunctionsParser
Parse an output that is one of sets of values.
output_parsers.ernie_functions.PydanticAttrOutputFunctionsParser
Parse an output as an attribute of a pydantic object.
output_parsers.ernie_functions.PydanticOutputFunctionsParser
Parse an output as a pydantic object.
output_parsers.fix.OutputFixingParser
Wraps a parser and tries to fix parsing errors.
output_parsers.json.SimpleJsonOutputParser
Parse the output of an LLM call to a JSON object.
output_parsers.openai_functions.JsonKeyOutputFunctionsParser
Parse an output as the element of the Json object.
output_parsers.openai_functions.JsonOutputFunctionsParser
Parse an output as the Json object.
output_parsers.openai_functions.OutputFunctionsParser
Parse an output that is one of sets of values.
output_parsers.openai_functions.PydanticAttrOutputFunctionsParser
Parse an output as an attribute of a pydantic object.
output_parsers.openai_functions.PydanticOutputFunctionsParser
Parse an output as a pydantic object.
output_parsers.openai_tools.JsonOutputKeyToolsParser
Parse tools from OpenAI response.
output_parsers.openai_tools.JsonOutputToolsParser
Parse tools from OpenAI response.
output_parsers.openai_tools.PydanticToolsParser
Parse tools from OpenAI response.
output_parsers.pandas_dataframe.PandasDataFrameOutputParser
Parse an output using Pandas DataFrame format.
output_parsers.pydantic.PydanticOutputParser
Parse an output using a pydantic model.
output_parsers.rail_parser.GuardrailsOutputParser
Parse the output of an LLM call using Guardrails.
output_parsers.regex.RegexParser
Parse the output of an LLM call using a regex.
output_parsers.regex_dict.RegexDictParser
Parse the output of an LLM call into a Dictionary using a regex.
output_parsers.retry.RetryOutputParser
Wraps a parser and tries to fix parsing errors.
output_parsers.retry.RetryWithErrorOutputParser
Wraps a parser and tries to fix parsing errors.
output_parsers.structured.ResponseSchema
A schema for a response from a structured output parser.
output_parsers.structured.StructuredOutputParser
Parse the output of an LLM call to a structured output.
output_parsers.xml.XMLOutputParser
Parse an output using xml format.
output_parsers.yaml.YamlOutputParser
Parse YAML output using a pydantic model.
Functions¶
output_parsers.json.parse_and_check_json_markdown(...)
Parse a JSON string from a Markdown string and check that it contains the expected keys.
output_parsers.json.parse_json_markdown(...)
Parse a JSON string from a Markdown string.
output_parsers.json.parse_partial_json(s, *)
Parse a JSON string that may be missing closing braces.
output_parsers.loading.load_output_parser(config)
Load an output parser.
langchain.prompts¶
Prompt is the input to the model.
Prompt is often constructed
from multiple components. Prompt classes and functions make constructing
and working with prompts easy.
Class hierarchy:
BasePromptTemplate --> PipelinePromptTemplate
StringPromptTemplate --> PromptTemplate
FewShotPromptTemplate
FewShotPromptWithTemplates
BaseChatPromptTemplate --> AutoGPTPrompt
ChatPromptTemplate --> AgentScratchPadChatPromptTemplate
BaseMessagePromptTemplate --> MessagesPlaceholder
BaseStringMessagePromptTemplate --> ChatMessagePromptTemplate
HumanMessagePromptTemplate
AIMessagePromptTemplate
SystemMessagePromptTemplate
PromptValue --> StringPromptValue
ChatPromptValue
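A minimal sketch of the two main branches of the hierarchy above (the template strings are illustrative):
from langchain.prompts import ChatPromptTemplate, PromptTemplate

# StringPromptTemplate branch: a plain template formatted to a string.
prompt = PromptTemplate.from_template("Summarize the following text:\n{text}")
print(prompt.format(text="LangChain helps developers build LLM applications."))

# BaseChatPromptTemplate branch: role-tagged messages formatted to a message list.
chat_prompt = ChatPromptTemplate.from_messages(
    [("system", "You are a concise assistant."), ("human", "{question}")]
)
print(chat_prompt.format_messages(question="What is a prompt template?"))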
Classes¶
prompts.example_selector.ngram_overlap.NGramOverlapExampleSelector
Select and order examples based on ngram overlap score (sentence_bleu score).
Functions¶
prompts.example_selector.ngram_overlap.ngram_overlap_score(...)
Compute ngram overlap score of source and example as sentence_bleu score.
langchain.retrievers¶
Retriever class returns Documents given a text query.
It is more general than a vector store. A retriever does not need to be able to
store documents, only to return (or retrieve) them. Vector stores can be used as
the backbone of a retriever, but there are other types of retrievers as well.
Class hierarchy:
BaseRetriever --> <name>Retriever # Examples: ArxivRetriever, MergerRetriever
Main helpers:
Document, Serializable, Callbacks,
CallbackManagerForRetrieverRun, AsyncCallbackManagerForRetrieverRun
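A toy custom retriever, sketched against the BaseRetriever interface and the helpers named above (the keyword-matching logic is purely illustrative, not a real retriever from this package):
from typing import List

from langchain_core.callbacks import CallbackManagerForRetrieverRun
from langchain_core.documents import Document
from langchain_core.retrievers import BaseRetriever

class KeywordRetriever(BaseRetriever):
    """Toy retriever: return documents whose text contains the query string."""

    documents: List[Document]

    def _get_relevant_documents(
        self, query: str, *, run_manager: CallbackManagerForRetrieverRun
    ) -> List[Document]:
        return [d for d in self.documents if query.lower() in d.page_content.lower()]

docs = [
    Document(page_content="Retrievers return documents for a text query."),
    Document(page_content="Vector stores persist embeddings."),
]
retriever = KeywordRetriever(documents=docs)
print(retriever.get_relevant_documents("vector"))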
Classes¶
retrievers.contextual_compression.ContextualCompressionRetriever
Retriever that wraps a base retriever and compresses the results.
retrievers.document_compressors.base.BaseDocumentCompressor
Base class for document compressors.
retrievers.document_compressors.base.DocumentCompressorPipeline
Document compressor that uses a pipeline of Transformers.
retrievers.document_compressors.chain_extract.LLMChainExtractor
Document compressor that uses an LLM chain to extract the relevant parts of documents.
retrievers.document_compressors.chain_extract.NoOutputParser
Parse outputs that could return a null string of some sort.
retrievers.document_compressors.chain_filter.LLMChainFilter
Filter that drops documents that aren't relevant to the query.
retrievers.document_compressors.cohere_rerank.CohereRerank
Document compressor that uses Cohere Rerank API.
retrievers.document_compressors.embeddings_filter.EmbeddingsFilter
Document compressor that uses embeddings to drop documents unrelated to the query.
retrievers.ensemble.EnsembleRetriever
Retriever that ensembles the multiple retrievers.
retrievers.merger_retriever.MergerRetriever
Retriever that merges the results of multiple retrievers.
retrievers.multi_query.LineList
List of lines.
retrievers.multi_query.LineListOutputParser
Output parser for a list of lines.
retrievers.multi_query.MultiQueryRetriever
Given a query, use an LLM to write a set of queries.
retrievers.multi_vector.MultiVectorRetriever
Retrieve from a set of multiple embeddings for the same document.
retrievers.multi_vector.SearchType(value[, ...])
Enumerator of the types of search to perform.
retrievers.parent_document_retriever.ParentDocumentRetriever
Retrieve small chunks then retrieve their parent documents.
retrievers.re_phraser.RePhraseQueryRetriever
Given a query, use an LLM to re-phrase it.
retrievers.self_query.base.SelfQueryRetriever
Retriever that uses a vector store and an LLM to generate the vector store queries.
retrievers.self_query.chroma.ChromaTranslator()
Translate Chroma internal query language elements to valid filters.
retrievers.self_query.dashvector.DashvectorTranslator()
Logic for converting internal query language elements to valid filters.
retrievers.self_query.deeplake.DeepLakeTranslator()
Translate DeepLake internal query language elements to valid filters.
retrievers.self_query.elasticsearch.ElasticsearchTranslator()
Translate Elasticsearch internal query language elements to valid filters.
retrievers.self_query.milvus.MilvusTranslator()
Translate Milvus internal query language elements to valid filters.
retrievers.self_query.mongodb_atlas.MongoDBAtlasTranslator()
Translate Mongo internal query language elements to valid filters.
retrievers.self_query.myscale.MyScaleTranslator([...])
Translate MyScale internal query language elements to valid filters.
retrievers.self_query.opensearch.OpenSearchTranslator()
Translate OpenSearch internal query domain-specific language elements to valid filters.
retrievers.self_query.pinecone.PineconeTranslator()
Translate Pinecone internal query language elements to valid filters.
retrievers.self_query.qdrant.QdrantTranslator(...)
Translate Qdrant internal query language elements to valid filters.
retrievers.self_query.redis.RedisTranslator(schema)
Translate Redis internal query language elements to valid filters.
retrievers.self_query.supabase.SupabaseVectorTranslator()
Translate Langchain filters to Supabase PostgREST filters.
retrievers.self_query.timescalevector.TimescaleVectorTranslator()
Translate the internal query language elements to valid filters.
retrievers.self_query.vectara.VectaraTranslator()
Translate Vectara internal query language elements to valid filters.
retrievers.self_query.weaviate.WeaviateTranslator()
Translate Weaviate internal query language elements to valid filters.
retrievers.time_weighted_retriever.TimeWeightedVectorStoreRetriever
Retriever that combines embedding similarity with recency in retrieving values.
retrievers.web_research.LineList
List of questions.
retrievers.web_research.QuestionListOutputParser
Output parser for a list of numbered questions.
retrievers.web_research.SearchQueries
Search queries to research for the user's goal.
retrievers.web_research.WebResearchRetriever
Google Search API retriever.
Functions¶
retrievers.document_compressors.chain_extract.default_get_input(...)
Return the compression chain input.
retrievers.document_compressors.chain_filter.default_get_input(...)
Return the compression chain input.
retrievers.self_query.deeplake.can_cast_to_float(string)
Check if a string can be cast to a float.
retrievers.self_query.milvus.process_value(value)
Convert a value to a string and add double quotes if it is a string.
retrievers.self_query.vectara.process_value(value)
Convert a value to a string and add single quotes if it is a string.
langchain.runnables¶
Classes¶
runnables.hub.HubRunnable
An instance of a runnable stored in the LangChain Hub.
runnables.openai_functions.OpenAIFunction
A function description for ChatOpenAI
runnables.openai_functions.OpenAIFunctionsRouter
A runnable that routes to the selected function.
langchain.smith¶
LangSmith utilities.
This module provides utilities for connecting to LangSmith. For more information on LangSmith, see the LangSmith documentation.
Evaluation
LangSmith helps you evaluate Chains and other language model application components using a number of LangChain evaluators.
An example of this is shown below, assuming you’ve created a LangSmith dataset called <my_dataset_name>:
from langsmith import Client
from langchain.chat_models import ChatOpenAI
from langchain.chains import LLMChain
from langchain.smith import RunEvalConfig, run_on_dataset
# Chains may have memory. Passing in a constructor function lets the
# evaluation framework avoid cross-contamination between runs.
def construct_chain():
    llm = ChatOpenAI(temperature=0)
    chain = LLMChain.from_string(
        llm,
        "What's the answer to {your_input_key}"
    )
    return chain
# Load off-the-shelf evaluators via config or the EvaluatorType (string or enum)
evaluation_config = RunEvalConfig(
    evaluators=[
        "qa",  # "Correctness" against a reference answer
        "embedding_distance",
        RunEvalConfig.Criteria("helpfulness"),
        RunEvalConfig.Criteria({
            "fifth-grader-score": "Do you have to be smarter than a fifth grader to answer this question?"
        }),
    ]
)
client = Client()
run_on_dataset(
    client,
    "<my_dataset_name>",
    construct_chain,
    evaluation=evaluation_config,
)
You can also create custom evaluators by subclassing the
StringEvaluator
or LangSmith’s RunEvaluator classes.
from typing import Optional
from langchain.evaluation import StringEvaluator
class MyStringEvaluator(StringEvaluator):
    @property
    def requires_input(self) -> bool:
        return False
    @property
    def requires_reference(self) -> bool:
        return True
    @property
    def evaluation_name(self) -> str:
        return "exact_match"
    def _evaluate_strings(self, prediction, reference=None, input=None, **kwargs) -> dict:
        return {"score": prediction == reference}
evaluation_config = RunEvalConfig(
    custom_evaluators = [MyStringEvaluator()],
)
run_on_dataset(
    client,
    "<my_dataset_name>",
    construct_chain,
    evaluation=evaluation_config,
)
Primary Functions
arun_on_dataset: Asynchronous function to evaluate a chain, agent, or other LangChain component over a dataset.
run_on_dataset: Function to evaluate a chain, agent, or other LangChain component over a dataset.
RunEvalConfig: Class representing the configuration for running evaluation. You can select evaluators by EvaluatorType or config, or you can pass in custom_evaluators
Classes¶
smith.evaluation.config.EvalConfig
Configuration for a given run evaluator.
smith.evaluation.config.RunEvalConfig
Configuration for a run evaluation.
smith.evaluation.config.SingleKeyEvalConfig
Create a new model by parsing and validating input data from keyword arguments.
smith.evaluation.progress.ProgressBarCallback(total)
A simple progress bar for the console.
smith.evaluation.runner_utils.EvalError(...)
Your architecture raised an error.
smith.evaluation.runner_utils.InputFormatError
Raised when the input format is invalid.
smith.evaluation.runner_utils.TestResult
A dictionary of the results of a single test run.
smith.evaluation.string_run_evaluator.ChainStringRunMapper
Extract items to evaluate from the run object from a chain.
smith.evaluation.string_run_evaluator.LLMStringRunMapper
Extract items to evaluate from the run object.
smith.evaluation.string_run_evaluator.StringExampleMapper
Map an example, or row in the dataset, to the inputs of an evaluation.
smith.evaluation.string_run_evaluator.StringRunEvaluatorChain
Evaluate Run and optional examples.
smith.evaluation.string_run_evaluator.StringRunMapper
Extract items to evaluate from the run object.
smith.evaluation.string_run_evaluator.ToolStringRunMapper
Map an input to the tool.
Functions¶
smith.evaluation.name_generation.random_name()
Generate a random name.
smith.evaluation.runner_utils.arun_on_dataset(...)
Run the Chain or language model on a dataset and store traces to the specified project name.
smith.evaluation.runner_utils.run_on_dataset(...)
Run the Chain or language model on a dataset and store traces to the specified project name.
langchain.storage¶
Implementations of key-value stores and storage helpers.
Module provides implementations of various key-value stores that conform
to a simple key-value interface.
The primary goal of these storages is to support implementation of caching.
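A hedged sketch of the shared key-value interface using LocalFileStore (the root path is hypothetical):
from langchain.storage import LocalFileStore

# The shared BaseStore interface: mset / mget / mdelete / yield_keys over bytes values.
store = LocalFileStore("/tmp/langchain_kv_demo")  # hypothetical root path
store.mset([("alpha", b"first value"), ("beta", b"second value")])
print(store.mget(["alpha", "beta", "missing"]))  # unknown keys come back as None
print(list(store.yield_keys()))
store.mdelete(["alpha"])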
Classes¶
storage.encoder_backed.EncoderBackedStore(...)
Wraps a store with key and value encoders/decoders.
storage.file_system.LocalFileStore(root_path)
BaseStore interface that works on the local file system.
storage.in_memory.InMemoryBaseStore()
In-memory implementation of the BaseStore using a dictionary.
langchain.text_splitter¶
Text Splitters are classes for splitting text.
Class hierarchy:
BaseDocumentTransformer --> TextSplitter --> <name>TextSplitter # Example: CharacterTextSplitter
RecursiveCharacterTextSplitter --> <name>TextSplitter
Note: MarkdownHeaderTextSplitter and HTMLHeaderTextSplitter do not derive from TextSplitter.
Main helpers:
Document, Tokenizer, Language, LineType, HeaderType
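A minimal sketch using RecursiveCharacterTextSplitter (the chunk sizes and sample text are illustrative):
from langchain.text_splitter import RecursiveCharacterTextSplitter

splitter = RecursiveCharacterTextSplitter(chunk_size=100, chunk_overlap=20)
text = (
    "Text splitters break long documents into overlapping chunks so that each "
    "piece fits comfortably within a model's context window."
)
for chunk in splitter.split_text(text):
    print(repr(chunk))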
Classes¶
text_splitter.CharacterTextSplitter([...])
Splitting text that looks at characters.
text_splitter.ElementType
Element type as typed dict.
text_splitter.HTMLHeaderTextSplitter(...[, ...])
Splitting HTML files based on specified headers.
text_splitter.HeaderType
Header type as typed dict.
text_splitter.Language(value[, names, ...])
Enum of the programming languages.
text_splitter.LatexTextSplitter(**kwargs)
Attempts to split the text along Latex-formatted layout elements.
text_splitter.LineType
Line type as typed dict.
text_splitter.MarkdownHeaderTextSplitter(...)
Splitting markdown files based on specified headers.
text_splitter.MarkdownTextSplitter(**kwargs)
Attempts to split the text along Markdown-formatted headings.
text_splitter.NLTKTextSplitter([separator, ...])
Splitting text using NLTK package.
text_splitter.PythonCodeTextSplitter(**kwargs)
Attempts to split the text along Python syntax.
text_splitter.RecursiveCharacterTextSplitter([...])
Splitting text by recursively looking at characters.
text_splitter.SentenceTransformersTokenTextSplitter([...])
Splitting text to tokens using sentence model tokenizer.
text_splitter.SpacyTextSplitter([separator, ...])
Splitting text using Spacy package.
text_splitter.TextSplitter(chunk_size, ...)
Interface for splitting text into chunks.
text_splitter.TokenTextSplitter([...])
Splitting text to tokens using model tokenizer.
text_splitter.Tokenizer(chunk_overlap, ...)
Tokenizer data class.
Functions¶
text_splitter.split_text_on_tokens(*, text, ...)
Split incoming text and return chunks using tokenizer.
langchain.tools¶
Tools are classes that an Agent uses to interact with the world.
Each tool has a description. Agent uses the description to choose the right
tool for the job.
Class hierarchy:
ToolMetaclass --> BaseTool --> <name>Tool # Examples: AIPluginTool, BaseGraphQLTool
<name> # Examples: BraveSearch, HumanInputRun
Main helpers:
CallbackManagerForToolRun, AsyncCallbackManagerForToolRun
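A hedged sketch of defining a tool and rendering its name and description for an agent prompt (the word_count tool is illustrative, not part of this package):
from langchain_core.tools import tool
from langchain.tools.render import render_text_description

@tool
def word_count(text: str) -> int:
    """Count the number of words in a piece of text."""
    return len(text.split())

# Agents choose tools from their names and descriptions; render_text_description
# produces the plain-text listing that gets interpolated into the agent prompt.
print(render_text_description([word_count]))
print(word_count.run("how many words are in this sentence"))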
Classes¶
tools.retriever.RetrieverInput
Create a new model by parsing and validating input data from keyword arguments.
Functions¶
tools.render.render_text_description(tools)
Render the tool name and description in plain text.
tools.render.render_text_description_and_args(tools)
Render the tool name, description, and args in plain text.
tools.retriever.create_retriever_tool(...)
Create a tool to do retrieval of documents.
langchain.utils¶
Utility functions for LangChain.
These functions do not depend on any other LangChain module.
Classes¶
utils.ernie_functions.FunctionDescription
Representation of a callable function to the Ernie API.
utils.ernie_functions.ToolDescription
Representation of a callable function to the Ernie API.
Functions¶
utils.ernie_functions.convert_pydantic_to_ernie_function(...)
Converts a Pydantic model to a function description for the Ernie API.
utils.ernie_functions.convert_pydantic_to_ernie_tool(...)
Converts a Pydantic model to a function description for the Ernie API.
langchain_community 0.0.3¶
langchain_community.adapters¶
Classes¶
adapters.openai.Chat()
adapters.openai.ChatCompletion()
Chat completion.
adapters.openai.ChatCompletionChunk
Create a new model by parsing and validating input data from keyword arguments.
adapters.openai.ChatCompletions
Create a new model by parsing and validating input data from keyword arguments.
adapters.openai.Choice
Create a new model by parsing and validating input data from keyword arguments.
adapters.openai.ChoiceChunk
Create a new model by parsing and validating input data from keyword arguments.
adapters.openai.Completions()
Completion.
adapters.openai.IndexableBaseModel
Allows a BaseModel to return its fields by string variable indexing
Functions¶
adapters.openai.aenumerate(iterable[, start])
Async version of enumerate function.
adapters.openai.convert_dict_to_message(_dict)
Convert a dictionary to a LangChain message.
adapters.openai.convert_message_to_dict(message)
Convert a LangChain message to a dictionary.
adapters.openai.convert_messages_for_finetuning(...)
Convert messages to a list of lists of dictionaries for fine-tuning.
adapters.openai.convert_openai_messages(messages)
Convert dictionaries representing OpenAI messages to LangChain format.
langchain_community.agent_toolkits¶
Agent toolkits contain integrations with various resources and services.
LangChain has a large ecosystem of integrations with various external resources
like local and remote file systems, APIs and databases.
These integrations allow developers to create versatile applications that combine the
power of LLMs with the ability to access, interact with and manipulate external
resources.
When developing an application, developers should inspect the capabilities and
permissions of the tools that underlie the given agent toolkit, and determine
whether permissions of the given toolkit are appropriate for the application.
See [Security](https://python.langchain.com/docs/security) for more information.
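For example, a hedged sketch of scoping a toolkit's permissions with FileManagementToolkit (the directory path is hypothetical):
from langchain_community.agent_toolkits import FileManagementToolkit

# Scope the toolkit to a scratch directory and expose only read-style tools,
# in line with the permissions guidance above (the directory path is hypothetical).
toolkit = FileManagementToolkit(
    root_dir="/tmp/agent_scratch",
    selected_tools=["read_file", "list_directory"],
)
for t in toolkit.get_tools():
    print(t.name, "-", t.description)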
Classes¶
agent_toolkits.ainetwork.toolkit.AINetworkToolkit
Toolkit for interacting with AINetwork Blockchain.
agent_toolkits.amadeus.toolkit.AmadeusToolkit
Toolkit for interacting with Amadeus which offers APIs for travel.
agent_toolkits.azure_cognitive_services.AzureCognitiveServicesToolkit
Toolkit for Azure Cognitive Services.
agent_toolkits.base.BaseToolkit
Base Toolkit representing a collection of related tools.
agent_toolkits.clickup.toolkit.ClickupToolkit
Clickup Toolkit.
agent_toolkits.file_management.toolkit.FileManagementToolkit
Toolkit for interacting with local files.
agent_toolkits.github.toolkit.BranchName
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.github.toolkit.CommentOnIssue
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.github.toolkit.CreateFile
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.github.toolkit.CreatePR
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.github.toolkit.CreateReviewRequest
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.github.toolkit.DeleteFile
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.github.toolkit.DirectoryPath
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.github.toolkit.GetIssue
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.github.toolkit.GetPR
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.github.toolkit.GitHubToolkit
GitHub Toolkit.
agent_toolkits.github.toolkit.NoInput
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.github.toolkit.ReadFile
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.github.toolkit.SearchCode
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.github.toolkit.SearchIssuesAndPRs
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.github.toolkit.UpdateFile
Create a new model by parsing and validating input data from keyword arguments.
agent_toolkits.gitlab.toolkit.GitLabToolkit
GitLab Toolkit.
agent_toolkits.gmail.toolkit.GmailToolkit
Toolkit for interacting with Gmail.
agent_toolkits.jira.toolkit.JiraToolkit
Jira Toolkit.
agent_toolkits.json.toolkit.JsonToolkit
Toolkit for interacting with a JSON spec.
agent_toolkits.multion.toolkit.MultionToolkit
Toolkit for interacting with the Browser Agent.
agent_toolkits.nasa.toolkit.NasaToolkit
Nasa Toolkit.
agent_toolkits.nla.tool.NLATool
Natural Language API Tool.
agent_toolkits.nla.toolkit.NLAToolkit
Natural Language API Toolkit.
agent_toolkits.office365.toolkit.O365Toolkit
Toolkit for interacting with Office 365.
agent_toolkits.openapi.planner.RequestsDeleteToolWithParsing
A tool that sends a DELETE request and parses the response.
agent_toolkits.openapi.planner.RequestsGetToolWithParsing
Requests GET tool with LLM-instructed extraction of truncated responses.
agent_toolkits.openapi.planner.RequestsPatchToolWithParsing
Requests PATCH tool with LLM-instructed extraction of truncated responses.
agent_toolkits.openapi.planner.RequestsPostToolWithParsing
Requests POST tool with LLM-instructed extraction of truncated responses.
agent_toolkits.openapi.planner.RequestsPutToolWithParsing
Requests PUT tool with LLM-instructed extraction of truncated responses.
agent_toolkits.openapi.spec.ReducedOpenAPISpec(...)
A reduced OpenAPI spec.
agent_toolkits.openapi.toolkit.OpenAPIToolkit
Toolkit for interacting with an OpenAPI API.
agent_toolkits.openapi.toolkit.RequestsToolkit
Toolkit for making REST requests.
agent_toolkits.playwright.toolkit.PlayWrightBrowserToolkit
Toolkit for PlayWright browser tools.
agent_toolkits.powerbi.toolkit.PowerBIToolkit
Toolkit for interacting with Power BI dataset.
agent_toolkits.slack.toolkit.SlackToolkit
Toolkit for interacting with Slack.
agent_toolkits.spark_sql.toolkit.SparkSQLToolkit
Toolkit for interacting with Spark SQL.
agent_toolkits.sql.toolkit.SQLDatabaseToolkit
Toolkit for interacting with SQL databases.
agent_toolkits.steam.toolkit.SteamToolkit
Steam Toolkit.
agent_toolkits.zapier.toolkit.ZapierToolkit
Zapier Toolkit.
Functions¶
agent_toolkits.json.base.create_json_agent(...)
Construct a json agent from an LLM and tools.
agent_toolkits.openapi.base.create_openapi_agent(...)
Construct an OpenAPI agent from an LLM and tools.
agent_toolkits.openapi.planner.create_openapi_agent(...)
Instantiate OpenAI API planner and controller for a given spec.
agent_toolkits.openapi.spec.reduce_openapi_spec(spec)
Simplify/distill/minify a spec somehow.
agent_toolkits.powerbi.base.create_pbi_agent(llm)
Construct a Power BI agent from an LLM and tools.
agent_toolkits.powerbi.chat_base.create_pbi_chat_agent(llm)
Construct a Power BI agent from a Chat LLM and tools.
agent_toolkits.spark_sql.base.create_spark_sql_agent(...)
Construct a Spark SQL agent from an LLM and tools.
agent_toolkits.sql.base.create_sql_agent(...)
Construct an SQL agent from an LLM and tools.
langchain_community.cache¶
Warning
Beta Feature!
Cache provides an optional caching layer for LLMs.
Cache is useful for two reasons:
It can save you money by reducing the number of API calls you make to the LLM
provider if you’re often requesting the same completion multiple times.
It can speed up your application by reducing the number of API calls you make
to the LLM provider.
Cache directly competes with Memory. See documentation for Pros and Cons.
Class hierarchy:
BaseCache --> <name>Cache # Examples: InMemoryCache, RedisCache, GPTCache
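A minimal sketch of enabling the simplest cache, assuming set_llm_cache lives in langchain.globals in this release; subsequent identical prompts to a configured LLM are then answered from memory:
from langchain.globals import set_llm_cache
from langchain_community.cache import InMemoryCache

# Register a process-wide cache; repeated identical prompts are served from memory
# instead of triggering another provider call.
set_llm_cache(InMemoryCache())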
Classes¶
cache.AstraDBCache(*[, collection_name, ...])
Cache that uses Astra DB as a backend.
cache.AstraDBSemanticCache(*[, ...])
Cache that uses Astra DB as a vector-store backend for semantic (i.e. similarity-based) lookup.
cache.CassandraCache([session, keyspace, ...])
Cache that uses Cassandra / Astra DB as a backend.
cache.CassandraSemanticCache(session, ...[, ...])
Cache that uses Cassandra as a vector-store backend for semantic (i.e. similarity-based) lookup.
cache.FullLLMCache(**kwargs)
SQLite table for full LLM Cache (all generations).
cache.FullMd5LLMCache(**kwargs)
SQLite table for full LLM Cache (all generations).
cache.GPTCache([init_func])
Cache that uses GPTCache as a backend.
cache.InMemoryCache()
Cache that stores things in memory.
cache.MomentoCache(cache_client, cache_name, *)
Cache that uses Momento as a backend.
cache.RedisCache(redis_, *[, ttl])
Cache that uses Redis as a backend.
cache.RedisSemanticCache(redis_url, embedding)
Cache that uses Redis as a vector-store backend.
cache.SQLAlchemyCache(engine, cache_schema)
Cache that uses SQLAlchemy as a backend.
cache.SQLAlchemyMd5Cache(engine, cache_schema)
Cache that uses SQLAlchemy as a backend.
cache.SQLiteCache([database_path])
Cache that uses SQLite as a backend.
cache.UpstashRedisCache(redis_, *[, ttl])
Cache that uses Upstash Redis as a backend.
Functions¶
langchain_community.callbacks¶
Callback handlers allow listening to events in LangChain.
Class hierarchy:
BaseCallbackHandler --> <name>CallbackHandler # Example: AimCallbackHandler
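A hedged sketch of the OpenAI token-tracking callback listed below (requires an OpenAI API key; the model name is an assumption):
from langchain_community.callbacks.manager import get_openai_callback
from langchain_community.chat_models import ChatOpenAI

# Track token usage and estimated cost for everything run inside the context manager.
llm = ChatOpenAI(model="gpt-3.5-turbo")
with get_openai_callback() as cb:
    llm.invoke("Tell me a one-line joke.")
    print(cb.total_tokens, cb.total_cost)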
Classes¶
callbacks.aim_callback.AimCallbackHandler([...])
Callback Handler that logs to Aim.
callbacks.aim_callback.BaseMetadataCallbackHandler()
This class handles the metadata and associated function states for callbacks.
callbacks.argilla_callback.ArgillaCallbackHandler(...)
Callback Handler that logs into Argilla.
callbacks.arize_callback.ArizeCallbackHandler([...])
Callback Handler that logs to Arize.
callbacks.arthur_callback.ArthurCallbackHandler(...)
Callback Handler that logs to Arthur platform.
callbacks.clearml_callback.ClearMLCallbackHandler([...])
Callback Handler that logs to ClearML.
callbacks.comet_ml_callback.CometCallbackHandler([...])
Callback Handler that logs to Comet.
callbacks.confident_callback.DeepEvalCallbackHandler(metrics)
Callback Handler that logs into deepeval.
callbacks.context_callback.ContextCallbackHandler([...])
Callback Handler that records transcripts to the Context service.
callbacks.flyte_callback.FlyteCallbackHandler()
Callback handler that is used within a Flyte task.
callbacks.human.AsyncHumanApprovalCallbackHandler(...)
Asynchronous callback for manually validating values.
callbacks.human.HumanApprovalCallbackHandler(...)
Callback for manually validating values.
callbacks.human.HumanRejectedException
Exception to raise when a person manually reviews and rejects a value.
callbacks.infino_callback.InfinoCallbackHandler([...])
Callback Handler that logs to Infino.
callbacks.labelstudio_callback.LabelStudioCallbackHandler([...])
Label Studio callback handler.
callbacks.labelstudio_callback.LabelStudioMode(value)
Label Studio mode enumerator.
callbacks.llmonitor_callback.LLMonitorCallbackHandler([...])
Callback Handler for LLMonitor.
callbacks.llmonitor_callback.UserContextManager(user_id)
Context manager for LLMonitor user context.
callbacks.mlflow_callback.MlflowCallbackHandler([...])
Callback Handler that logs metrics and artifacts to mlflow server.
callbacks.mlflow_callback.MlflowLogger(**kwargs)
Callback Handler that logs metrics and artifacts to mlflow server.
callbacks.openai_info.OpenAICallbackHandler()
Callback Handler that tracks OpenAI info.
callbacks.promptlayer_callback.PromptLayerCallbackHandler([...])
Callback handler for promptlayer.
callbacks.sagemaker_callback.SageMakerCallbackHandler(run)
Callback Handler that logs prompt artifacts and metrics to SageMaker Experiments.
callbacks.streamlit.mutable_expander.ChildRecord(...)
The child record as a NamedTuple.
callbacks.streamlit.mutable_expander.ChildType(value)
The enumerator of the child type.
callbacks.streamlit.mutable_expander.MutableExpander(...)
A Streamlit expander that can be renamed and dynamically expanded/collapsed.
callbacks.streamlit.streamlit_callback_handler.LLMThought(...)
A thought in the LLM's thought stream.
callbacks.streamlit.streamlit_callback_handler.LLMThoughtLabeler()
Generates markdown labels for LLMThought containers.
callbacks.streamlit.streamlit_callback_handler.LLMThoughtState(value)
Enumerator of the LLMThought state.
callbacks.streamlit.streamlit_callback_handler.StreamlitCallbackHandler(...)
A callback handler that writes to a Streamlit app.
callbacks.streamlit.streamlit_callback_handler.ToolRecord(...)
The tool record as a NamedTuple.
callbacks.tracers.comet.CometTracer(**kwargs)
callbacks.tracers.wandb.RunProcessor(...)
Handles the conversion of LangChain Runs into a WBTraceTree.
callbacks.tracers.wandb.WandbRunArgs
Arguments for the WandbTracer.
callbacks.tracers.wandb.WandbTracer([run_args])
Callback Handler that logs to Weights and Biases.
callbacks.trubrics_callback.TrubricsCallbackHandler([...])
Callback handler for Trubrics.
callbacks.utils.BaseMetadataCallbackHandler()
This class handles the metadata and associated function states for callbacks.
callbacks.wandb_callback.WandbCallbackHandler([...])
Callback Handler that logs to Weights and Biases.
callbacks.whylabs_callback.WhyLabsCallbackHandler(...)
Callback Handler for logging to WhyLabs.
Functions¶
callbacks.aim_callback.import_aim()
Import the aim python package and raise an error if it is not installed.
callbacks.clearml_callback.import_clearml()
Import the clearml python package and raise an error if it is not installed.
callbacks.comet_ml_callback.import_comet_ml()
Import comet_ml and raise an error if it is not installed.
callbacks.context_callback.import_context()
Import the getcontext package.
callbacks.flyte_callback.analyze_text(text)
Analyze text using textstat and spacy.
callbacks.flyte_callback.import_flytekit()
Import flytekit and flytekitplugins-deck-standard.
callbacks.infino_callback.get_num_tokens(...)
Calculate num tokens for OpenAI with tiktoken package.
callbacks.infino_callback.import_infino()
Import the infino client.
callbacks.infino_callback.import_tiktoken()
Import tiktoken for counting tokens for OpenAI models.
callbacks.labelstudio_callback.get_default_label_configs(mode)
Get default Label Studio configs for the given mode.
callbacks.llmonitor_callback.identify(user_id)
Builds an LLMonitor UserContextManager
callbacks.manager.get_openai_callback()
Get the OpenAI callback handler in a context manager.
callbacks.manager.wandb_tracing_enabled([...])
Get the WandbTracer in a context manager.
callbacks.mlflow_callback.analyze_text(text)
Analyze text using textstat and spacy.
callbacks.mlflow_callback.construct_html_from_prompt_and_generation(...)
Construct an html element from a prompt and a generation.
callbacks.mlflow_callback.import_mlflow()
Import the mlflow python package and raise an error if it is not installed.
callbacks.openai_info.get_openai_token_cost_for_model(...)
Get the cost in USD for a given model and number of tokens.
callbacks.openai_info.standardize_model_name(...)
Standardize the model name to a format that can be used in the OpenAI API.
callbacks.sagemaker_callback.save_json(data, ...)
Save dict to local file path.
callbacks.tracers.comet.import_comet_llm_api()
Import comet_llm api and raise an error if it is not installed.
callbacks.utils.flatten_dict(nested_dict[, ...])
Flattens a nested dictionary into a flat dictionary.
callbacks.utils.hash_string(s)
Hash a string using sha1.
callbacks.utils.import_pandas()
Import the pandas python package and raise an error if it is not installed.
callbacks.utils.import_spacy()
Import the spacy python package and raise an error if it is not installed.
callbacks.utils.import_textstat()
Import the textstat python package and raise an error if it is not installed.
callbacks.utils.load_json(json_path)
Load json file to a string.
callbacks.wandb_callback.analyze_text(text)
Analyze text using textstat and spacy.
callbacks.wandb_callback.construct_html_from_prompt_and_generation(...)
Construct an html element from a prompt and a generation.
callbacks.wandb_callback.import_wandb()
Import the wandb python package and raise an error if it is not installed.
callbacks.wandb_callback.load_json_to_dict(...)
Load json file to a dictionary.
callbacks.whylabs_callback.import_langkit([...])
Import the langkit python package and raise an error if it is not installed.
langchain_community.chat_loaders¶
Chat Loaders load chat messages from common communications platforms.
Load chat messages from various
communications platforms such as Facebook Messenger, Telegram, and
WhatsApp. The loaded chat messages can be used for fine-tuning models.
Class hierarchy:
BaseChatLoader --> <name>ChatLoader # Examples: WhatsAppChatLoader, IMessageChatLoader
Main helpers:
ChatSession
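A hedged sketch of preparing a WhatsApp export for fine-tuning with the helpers listed below (the file path and sender name are hypothetical):
from langchain_community.chat_loaders.utils import map_ai_messages, merge_chat_runs
from langchain_community.chat_loaders.whatsapp import WhatsAppChatLoader

# Load an exported WhatsApp chat, merge consecutive messages from the same sender,
# and tag one participant's messages as AI messages for fine-tuning.
loader = WhatsAppChatLoader(path="./whatsapp_export.txt")
sessions = merge_chat_runs(loader.lazy_load())
sessions = map_ai_messages(sessions, sender="Jane Doe")
for session in sessions:
    print(len(session["messages"]), "messages in session")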
Classes¶
chat_loaders.base.BaseChatLoader()
Base class for chat loaders.
chat_loaders.facebook_messenger.FolderFacebookMessengerChatLoader(path)
Load Facebook Messenger chat data from a folder.
chat_loaders.facebook_messenger.SingleFileFacebookMessengerChatLoader(path)
Load Facebook Messenger chat data from a single file.
chat_loaders.gmail.GMailLoader(creds[, n, ...])
Load data from GMail.
chat_loaders.imessage.IMessageChatLoader([path])
Load chat sessions from the iMessage chat.db SQLite file.
chat_loaders.langsmith.LangSmithDatasetChatLoader(*, ...)
Load chat sessions from a LangSmith dataset with the "chat" data type.
chat_loaders.langsmith.LangSmithRunChatLoader(runs)
Load chat sessions from a list of LangSmith "llm" runs.
chat_loaders.slack.SlackChatLoader(path)
Load Slack conversations from a dump zip file.
chat_loaders.telegram.TelegramChatLoader(path)
Load telegram conversations to LangChain chat messages.
chat_loaders.whatsapp.WhatsAppChatLoader(path)
Load WhatsApp conversations from a dump zip file or directory.
Functions¶
chat_loaders.utils.map_ai_messages(...)
Convert messages from the specified 'sender' to AI messages.
chat_loaders.utils.map_ai_messages_in_session(...)
Convert messages from the specified 'sender' to AI messages.
chat_loaders.utils.merge_chat_runs(chat_sessions)
Merge chat runs together.
chat_loaders.utils.merge_chat_runs_in_session(...)
Merge chat runs together in a chat session.
langchain_community.chat_message_histories¶
Classes¶
chat_message_histories.astradb.AstraDBChatMessageHistory(*, ...)
Chat message history that stores history in Astra DB.
chat_message_histories.cassandra.CassandraChatMessageHistory(...)
Chat message history that stores history in Cassandra.
chat_message_histories.cosmos_db.CosmosDBChatMessageHistory(...)
Chat message history backed by Azure CosmosDB.
chat_message_histories.dynamodb.DynamoDBChatMessageHistory(...)
Chat message history that stores history in AWS DynamoDB.
chat_message_histories.elasticsearch.ElasticsearchChatMessageHistory(...)
Chat message history that stores history in Elasticsearch.
chat_message_histories.file.FileChatMessageHistory(...)
Chat message history that stores history in a local file.
chat_message_histories.firestore.FirestoreChatMessageHistory(...)
Chat message history backed by Google Firestore.
chat_message_histories.in_memory.ChatMessageHistory
In memory implementation of chat message history.
chat_message_histories.momento.MomentoChatMessageHistory(...)
Chat message history cache that uses Momento as a backend.
chat_message_histories.mongodb.MongoDBChatMessageHistory(...)
Chat message history that stores history in MongoDB.
chat_message_histories.neo4j.Neo4jChatMessageHistory(...)
Chat message history stored in a Neo4j database.
chat_message_histories.postgres.PostgresChatMessageHistory(...)
Chat message history stored in a Postgres database.
chat_message_histories.redis.RedisChatMessageHistory(...)
Chat message history stored in a Redis database.
chat_message_histories.rocksetdb.RocksetChatMessageHistory(...)
Uses Rockset to store chat messages.
chat_message_histories.singlestoredb.SingleStoreDBChatMessageHistory(...)
Chat message history stored in a SingleStoreDB database.
chat_message_histories.sql.BaseMessageConverter()
The class responsible for converting BaseMessage to your SQLAlchemy model.
chat_message_histories.sql.DefaultMessageConverter(...)
The default message converter for SQLChatMessageHistory.
chat_message_histories.sql.SQLChatMessageHistory(...)
Chat message history stored in an SQL database.
chat_message_histories.streamlit.StreamlitChatMessageHistory([key])
Chat message history that stores messages in Streamlit session state.
chat_message_histories.upstash_redis.UpstashRedisChatMessageHistory(...)
Chat message history stored in an Upstash Redis database.
chat_message_histories.xata.XataChatMessageHistory(...)
Chat message history stored in a Xata database.
chat_message_histories.zep.ZepChatMessageHistory(...)
Chat message history that uses Zep as a backend.
Functions¶
chat_message_histories.sql.create_message_model(...)
Create a message model for a given table name.
langchain_community.chat_models¶
Chat Models are a variation on language models.
While Chat Models use language models under the hood, the interface they expose
is a bit different. Rather than expose a “text in, text out” API, they expose
an interface where “chat messages” are the inputs and outputs.
Class hierarchy:
BaseLanguageModel --> BaseChatModel --> <name> # Examples: ChatOpenAI, ChatGooglePalm
Main helpers:
AIMessage, BaseMessage, HumanMessage
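A brief sketch of the message-in, message-out interface using ChatOpenAI (requires an OpenAI API key; the model name is an assumption):
from langchain_core.messages import HumanMessage, SystemMessage
from langchain_community.chat_models import ChatOpenAI

# Chat models consume a list of role-tagged messages and return an AIMessage.
chat = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
reply = chat.invoke([
    SystemMessage(content="Answer in one short sentence."),
    HumanMessage(content="How do chat models differ from plain LLMs?"),
])
print(reply.content)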
Classes¶
chat_models.anthropic.ChatAnthropic
Anthropic chat large language models.
chat_models.anyscale.ChatAnyscale
Anyscale Chat large language models.
chat_models.azure_openai.AzureChatOpenAI
Azure OpenAI Chat Completion API.
chat_models.azureml_endpoint.AzureMLChatOnlineEndpoint
AzureML Chat models API.
chat_models.azureml_endpoint.LlamaContentFormatter()
Content formatter for LLaMA.
chat_models.baichuan.ChatBaichuan
Baichuan chat models API by Baichuan Intelligent Technology.
chat_models.baidu_qianfan_endpoint.QianfanChatEndpoint
Baidu Qianfan chat models.
chat_models.bedrock.BedrockChat
A chat model that uses the Bedrock API.
chat_models.bedrock.ChatPromptAdapter()
Adapter class to prepare the inputs from LangChain to a prompt format that the Chat model expects.
chat_models.cohere.ChatCohere
Cohere chat large language models.
chat_models.databricks.ChatDatabricks
Databricks chat models API.
chat_models.ernie.ErnieBotChat
ERNIE-Bot large language model.
chat_models.everlyai.ChatEverlyAI
EverlyAI Chat large language models.
chat_models.fake.FakeListChatModel
Fake ChatModel for testing purposes.
chat_models.fake.FakeMessagesListChatModel
Fake ChatModel for testing purposes.
chat_models.fireworks.ChatFireworks
Fireworks Chat models.
chat_models.gigachat.GigaChat
GigaChat large language models API.
chat_models.google_palm.ChatGooglePalm
Google PaLM Chat models API.
chat_models.google_palm.ChatGooglePalmError
Error with the Google PaLM API.
chat_models.human.HumanInputChatModel
ChatModel which returns user input as the response.
chat_models.hunyuan.ChatHunyuan
Tencent Hunyuan chat models API by Tencent.
chat_models.javelin_ai_gateway.ChatJavelinAIGateway
Javelin AI Gateway chat models API.
chat_models.javelin_ai_gateway.ChatParams
Parameters for the Javelin AI Gateway LLM.
chat_models.jinachat.JinaChat
Jina AI Chat models API.
chat_models.konko.ChatKonko
ChatKonko Chat large language models API.
chat_models.litellm.ChatLiteLLM
A chat model that uses the LiteLLM API.
chat_models.litellm.ChatLiteLLMException
Error with the LiteLLM I/O library
chat_models.minimax.MiniMaxChat
Wrapper around Minimax large language models.
chat_models.mlflow.ChatMlflow
MLflow chat models API.
chat_models.mlflow_ai_gateway.ChatMLflowAIGateway
MLflow AI Gateway chat models API.
chat_models.mlflow_ai_gateway.ChatParams
Parameters for the MLflow AI Gateway LLM.
chat_models.ollama.ChatOllama
Ollama locally runs large language models.
chat_models.openai.ChatOpenAI
OpenAI Chat large language models API.
chat_models.pai_eas_endpoint.PaiEasChatEndpoint
Eas LLM Service chat model API.
chat_models.promptlayer_openai.PromptLayerChatOpenAI
PromptLayer and OpenAI Chat large language models API.
chat_models.tongyi.ChatTongyi
Alibaba Tongyi Qwen chat models API.
chat_models.vertexai.ChatVertexAI
Vertex AI Chat large language models API.
chat_models.volcengine_maas.VolcEngineMaasChat
Volc Engine MaaS hosts a plethora of models.
chat_models.yandex.ChatYandexGPT
Wrapper around YandexGPT large language models.
Functions¶
chat_models.anthropic.convert_messages_to_prompt_anthropic(...)
Format a list of messages into a full prompt for the Anthropic model
chat_models.baidu_qianfan_endpoint.convert_message_to_dict(message)
Convert a message to a dictionary that can be passed to the API.
chat_models.cohere.get_cohere_chat_request(...)
Get the request for the Cohere chat API.
chat_models.cohere.get_role(message)
Get the role of the message.
chat_models.fireworks.acompletion_with_retry(...)
Use tenacity to retry the async completion call.
chat_models.fireworks.acompletion_with_retry_streaming(...)
Use tenacity to retry the completion call for streaming.
chat_models.fireworks.completion_with_retry(...)
Use tenacity to retry the completion call.
chat_models.fireworks.conditional_decorator(...)
chat_models.fireworks.convert_dict_to_message(_dict)
Convert a dict response to a message.
chat_models.google_palm.achat_with_retry(...)
Use tenacity to retry the async completion call.
chat_models.google_palm.chat_with_retry(llm, ...)
Use tenacity to retry the completion call.
chat_models.jinachat.acompletion_with_retry(...)
Use tenacity to retry the async completion call.
chat_models.litellm.acompletion_with_retry(llm)
Use tenacity to retry the async completion call.
chat_models.meta.convert_messages_to_prompt_llama(...)
chat_models.openai.acompletion_with_retry(llm)
Use tenacity to retry the async completion call.
chat_models.tongyi.convert_dict_to_message(_dict)
chat_models.tongyi.convert_message_to_dict(message)
chat_models.volcengine_maas.convert_dict_to_message(_dict)
langchain_community.docstore¶
Docstores are classes to store and load Documents.
The Docstore is a simplified version of the Document Loader.
Class hierarchy:
Docstore --> <name> # Examples: InMemoryDocstore, Wikipedia
Main helpers:
Document, AddableMixin
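A minimal sketch of the docstore interface using InMemoryDocstore (the stored texts are illustrative):
from langchain_core.documents import Document
from langchain_community.docstore.in_memory import InMemoryDocstore

# Docstores map string IDs to Documents; search() returns the Document,
# or an explanatory string when the ID is unknown.
store = InMemoryDocstore({"1": Document(page_content="Docstores store and load Documents.")})
store.add({"2": Document(page_content="InMemoryDocstore keeps everything in a dict.")})
print(store.search("2").page_content)
print(store.search("missing-id"))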
Classes¶
docstore.arbitrary_fn.DocstoreFn(lookup_fn)
Langchain Docstore via arbitrary lookup function.
docstore.base.AddableMixin()
Mixin class that supports adding texts.
docstore.base.Docstore()
Interface to access a place that stores documents.
docstore.in_memory.InMemoryDocstore([_dict])
Simple in memory docstore in the form of a dict.
docstore.wikipedia.Wikipedia()
Wrapper around wikipedia API.
langchain_community.document_loaders¶
Document Loaders are classes to load Documents.
Document Loaders are usually used to load a lot of Documents in a single run.
Class hierarchy:
BaseLoader --> <name>Loader # Examples: TextLoader, UnstructuredFileLoader
Main helpers:
Document, <name>TextSplitter
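A hedged sketch of the common loader pattern using TextLoader (the file path is hypothetical):
from langchain_community.document_loaders import TextLoader

# Every loader returns a list of Documents carrying page_content and metadata.
loader = TextLoader("./notes/meeting.txt", encoding="utf-8")  # hypothetical path
docs = loader.load()
print(len(docs), docs[0].metadata)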
Classes¶
document_loaders.acreom.AcreomLoader(path[, ...])
Load acreom vault from a directory.
document_loaders.airbyte.AirbyteCDKLoader(...)
Load with an Airbyte source connector implemented using the CDK.
document_loaders.airbyte.AirbyteGongLoader(...)
Load from Gong using an Airbyte source connector.
document_loaders.airbyte.AirbyteHubspotLoader(...)
Load from Hubspot using an Airbyte source connector.
document_loaders.airbyte.AirbyteSalesforceLoader(...)
Load from Salesforce using an Airbyte source connector.
document_loaders.airbyte.AirbyteShopifyLoader(...)
Load from Shopify using an Airbyte source connector.
document_loaders.airbyte.AirbyteStripeLoader(...)
Load from Stripe using an Airbyte source connector.
document_loaders.airbyte.AirbyteTypeformLoader(...)
Load from Typeform using an Airbyte source connector.
document_loaders.airbyte.AirbyteZendeskSupportLoader(...)
Load from Zendesk Support using an Airbyte source connector.
document_loaders.airbyte_json.AirbyteJSONLoader(...)
Load local Airbyte json files.
document_loaders.airtable.AirtableLoader(...)
Load the Airtable tables.
document_loaders.apify_dataset.ApifyDatasetLoader
Load datasets from Apify web scraping, crawling, and data extraction platform.
document_loaders.arcgis_loader.ArcGISLoader(layer)
Load records from an ArcGIS FeatureLayer.
document_loaders.arxiv.ArxivLoader(query[, ...])
Load a query result from Arxiv.
document_loaders.assemblyai.AssemblyAIAudioTranscriptLoader(...)
Loader for AssemblyAI audio transcripts.
document_loaders.assemblyai.TranscriptFormat(value)
Transcript format to use for the document loader.
document_loaders.async_html.AsyncHtmlLoader(...)
Load HTML asynchronously.
document_loaders.azlyrics.AZLyricsLoader([...])
Load AZLyrics webpages.
document_loaders.azure_ai_data.AzureAIDataLoader(url)
Load from Azure AI Data.
document_loaders.azure_blob_storage_container.AzureBlobStorageContainerLoader(...)
Load from Azure Blob Storage container.
document_loaders.azure_blob_storage_file.AzureBlobStorageFileLoader(...)
Load from Azure Blob Storage files.
document_loaders.baiducloud_bos_directory.BaiduBOSDirectoryLoader(...)
Load from Baidu BOS directory.
document_loaders.baiducloud_bos_file.BaiduBOSFileLoader(...)
Load from Baidu Cloud BOS file.
document_loaders.base.BaseBlobParser()
Abstract interface for blob parsers.
document_loaders.base.BaseLoader()
Interface for Document Loader.
document_loaders.base_o365.O365BaseLoader
Base class for all loaders that use the O365 package.
document_loaders.bibtex.BibtexLoader(...[, ...])
Load a bibtex file.
document_loaders.bigquery.BigQueryLoader(query)
Load from the Google Cloud Platform BigQuery.
document_loaders.bilibili.BiliBiliLoader(...)
Load BiliBili video transcripts.
document_loaders.blackboard.BlackboardLoader(...)
Load a Blackboard course.
document_loaders.blob_loaders.file_system.FileSystemBlobLoader(path, *)
Load blobs in the local file system.
document_loaders.blob_loaders.schema.Blob
Blob represents raw data by either reference or value.
document_loaders.blob_loaders.schema.BlobLoader()
Abstract interface for blob loaders implementation.
document_loaders.blob_loaders.youtube_audio.YoutubeAudioLoader(...)
Load YouTube urls as audio file(s).
document_loaders.blockchain.BlockchainDocumentLoader(...)
Load elements from a blockchain smart contract.
document_loaders.blockchain.BlockchainType(value)
Enumerator of the supported blockchains.
document_loaders.brave_search.BraveSearchLoader(...)
Load with Brave Search engine.
document_loaders.browserless.BrowserlessLoader(...)
Load webpages with Browserless /content endpoint.
document_loaders.chatgpt.ChatGPTLoader(log_file)
Load conversations from exported ChatGPT data.
document_loaders.chromium.AsyncChromiumLoader(urls)
Scrape HTML pages from URLs using a headless instance of Chromium.
document_loaders.college_confidential.CollegeConfidentialLoader([...])
Load College Confidential webpages.
document_loaders.concurrent.ConcurrentLoader(...)
Load and parse Documents concurrently.
document_loaders.confluence.ConfluenceLoader(url)
Load Confluence pages.
document_loaders.confluence.ContentFormat(value)
Enumerator of the content formats of Confluence page.
document_loaders.conllu.CoNLLULoader(file_path)
Load CoNLL-U files.
document_loaders.couchbase.CouchbaseLoader(...)
Load documents from Couchbase.
document_loaders.csv_loader.CSVLoader(file_path)
Load a CSV file into a list of Documents.
document_loaders.csv_loader.UnstructuredCSVLoader(...)
Load CSV files using Unstructured.
document_loaders.cube_semantic.CubeSemanticLoader(...)
Load Cube semantic layer metadata.
document_loaders.datadog_logs.DatadogLogsLoader(...)
Load Datadog logs.
document_loaders.dataframe.BaseDataFrameLoader(...)
Initialize with dataframe object.
document_loaders.dataframe.DataFrameLoader(...)
Load Pandas DataFrame.
document_loaders.diffbot.DiffbotLoader(...)
Load Diffbot json file.
document_loaders.directory.DirectoryLoader(...)
Load from a directory.
document_loaders.discord.DiscordChatLoader(...)
Load Discord chat logs.
document_loaders.docugami.DocugamiLoader
Load from Docugami.
document_loaders.docusaurus.DocusaurusLoader(url)
Loader that leverages the SitemapLoader to loop through the generated pages of a Docusaurus Documentation website and extracts the content by looking for specific HTML tags.
document_loaders.dropbox.DropboxLoader
Load files from Dropbox.
document_loaders.duckdb_loader.DuckDBLoader(query)
Load from DuckDB.
document_loaders.email.OutlookMessageLoader(...)
Loads Outlook Message files using extract_msg.
document_loaders.email.UnstructuredEmailLoader(...)
Load email files using Unstructured.
document_loaders.epub.UnstructuredEPubLoader(...)
Load EPub files using Unstructured.
document_loaders.etherscan.EtherscanLoader(...)
Load transactions from Ethereum mainnet.
document_loaders.evernote.EverNoteLoader(...)
Load from EverNote.
document_loaders.excel.UnstructuredExcelLoader(...)
Load Microsoft Excel files using Unstructured.
document_loaders.facebook_chat.FacebookChatLoader(path)
Load Facebook Chat messages directory dump.
document_loaders.fauna.FaunaLoader(query, ...)
Load from FaunaDB.
document_loaders.figma.FigmaFileLoader(...)
Load Figma file.
document_loaders.gcs_directory.GCSDirectoryLoader(...)
Load from GCS directory.
document_loaders.gcs_file.GCSFileLoader(...)
Load from GCS file.
document_loaders.generic.GenericLoader(...)
Generic Document Loader.
document_loaders.geodataframe.GeoDataFrameLoader(...)
Load geopandas Dataframe.
document_loaders.git.GitLoader(repo_path[, ...])
Load Git repository files.
document_loaders.gitbook.GitbookLoader(web_page)
Load GitBook data.
document_loaders.github.BaseGitHubLoader
Load GitHub repository Issues.
document_loaders.github.GitHubIssuesLoader
Load issues of a GitHub repository.
document_loaders.google_speech_to_text.GoogleSpeechToTextLoader(...)
Loader for Google Cloud Speech-to-Text audio transcripts.
document_loaders.googledrive.GoogleDriveLoader
Load Google Docs from Google Drive.
document_loaders.gutenberg.GutenbergLoader(...)
Load from Gutenberg.org.
document_loaders.helpers.FileEncoding(...)
File encoding as the NamedTuple.
document_loaders.hn.HNLoader([web_path, ...])
Load Hacker News data.
document_loaders.html.UnstructuredHTMLLoader(...)
Load HTML files using Unstructured.
document_loaders.html_bs.BSHTMLLoader(file_path)
Load HTML files and parse them with beautiful soup.
document_loaders.hugging_face_dataset.HuggingFaceDatasetLoader(path)
Load from Hugging Face Hub datasets.
document_loaders.ifixit.IFixitLoader(web_path)
Load iFixit repair guides, device wikis and answers.
document_loaders.image.UnstructuredImageLoader(...)
Load PNG and JPG files using Unstructured.
document_loaders.image_captions.ImageCaptionLoader(images)
Load image captions.
document_loaders.imsdb.IMSDbLoader([...])
Load IMSDb webpages.
document_loaders.iugu.IuguLoader(resource[, ...])
Load from IUGU.
document_loaders.joplin.JoplinLoader([...])
Load notes from Joplin.
document_loaders.json_loader.JSONLoader(...)
Load a JSON file using a jq schema.
document_loaders.lakefs.LakeFSClient(...)
document_loaders.lakefs.LakeFSLoader(...[, ...])
Load from lakeFS.
document_loaders.lakefs.UnstructuredLakeFSLoader(...)
Load lakeFS objects using Unstructured.
document_loaders.larksuite.LarkSuiteDocLoader(...)
Load from LarkSuite (FeiShu).
document_loaders.markdown.UnstructuredMarkdownLoader(...)
Load Markdown files using Unstructured.
document_loaders.mastodon.MastodonTootsLoader(...)
Load the Mastodon 'toots'.
document_loaders.max_compute.MaxComputeLoader(...)
Load from Alibaba Cloud MaxCompute table.
document_loaders.mediawikidump.MWDumpLoader(...)
Load MediaWiki dump from an XML file.
document_loaders.merge.MergedDataLoader(loaders)
Merge documents from a list of loaders
document_loaders.mhtml.MHTMLLoader(file_path)
Parse MHTML files with BeautifulSoup.
document_loaders.modern_treasury.ModernTreasuryLoader(...)
Load from Modern Treasury.
document_loaders.mongodb.MongodbLoader(...)
Load MongoDB documents.
document_loaders.news.NewsURLLoader(urls[, ...])
Load news articles from URLs using Unstructured.
document_loaders.notebook.NotebookLoader(path)
Load Jupyter notebook (.ipynb) files.
document_loaders.notion.NotionDirectoryLoader(path, *)
Load Notion directory dump.
document_loaders.notiondb.NotionDBLoader(...)
Load from Notion DB.
document_loaders.nuclia.NucliaLoader(path, ...)
Load from any file type using Nuclia Understanding API.
document_loaders.obs_directory.OBSDirectoryLoader(...)
Load from Huawei OBS directory. | https://api.python.langchain.com/en/latest/community_api_reference.html |
document_loaders.obs_file.OBSFileLoader(...)
Load from the Huawei OBS file.
document_loaders.obsidian.ObsidianLoader(path)
Load Obsidian files from directory.
document_loaders.odt.UnstructuredODTLoader(...)
Load OpenOffice ODT files using Unstructured.
document_loaders.onedrive.OneDriveLoader
Load from Microsoft OneDrive.
document_loaders.onedrive_file.OneDriveFileLoader
Load a file from Microsoft OneDrive.
document_loaders.onenote.OneNoteLoader
Load pages from OneNote notebooks.
document_loaders.open_city_data.OpenCityDataLoader(...)
Load from Open City.
document_loaders.org_mode.UnstructuredOrgModeLoader(...)
Load Org-Mode files using Unstructured.
document_loaders.parsers.audio.OpenAIWhisperParser([...])
Transcribe and parse audio files.
document_loaders.parsers.audio.OpenAIWhisperParserLocal([...])
Transcribe and parse audio files with OpenAI Whisper model.
document_loaders.parsers.audio.YandexSTTParser(*)
Transcribe and parse audio files.
document_loaders.parsers.docai.DocAIParser(*)
Google Cloud Document AI parser.
document_loaders.parsers.docai.DocAIParsingResults(...)
A dataclass to store Document AI parsing results.
document_loaders.parsers.generic.MimeTypeBasedParser(...)
Parser that uses mime-types to parse a blob.
document_loaders.parsers.grobid.GrobidParser(...)
Load article PDF files using Grobid.
document_loaders.parsers.grobid.ServerUnavailableException
Exception raised when the Grobid server is unavailable.
document_loaders.parsers.html.bs4.BS4HTMLParser(*)
Parse HTML files using Beautiful Soup.
document_loaders.parsers.language.cobol.CobolSegmenter(code) | https://api.python.langchain.com/en/latest/community_api_reference.html |
Code segmenter for COBOL.
document_loaders.parsers.language.code_segmenter.CodeSegmenter(code)
Abstract class for the code segmenter.
document_loaders.parsers.language.javascript.JavaScriptSegmenter(code)
Code segmenter for JavaScript.
document_loaders.parsers.language.language_parser.LanguageParser([...])
Parse using the respective programming language syntax.
document_loaders.parsers.language.python.PythonSegmenter(code)
Code segmenter for Python.
document_loaders.parsers.msword.MsWordParser()
Parse the Microsoft Word documents from a blob.
document_loaders.parsers.pdf.AmazonTextractPDFParser([...])
Send PDF files to Amazon Textract and parse them.
document_loaders.parsers.pdf.DocumentIntelligenceParser(...)
Loads a PDF with Azure Document Intelligence (formerly Forms Recognizer) and chunks at character level.
document_loaders.parsers.pdf.PDFMinerParser([...])
Parse PDF using PDFMiner.
document_loaders.parsers.pdf.PDFPlumberParser([...])
Parse PDF with PDFPlumber.
document_loaders.parsers.pdf.PyMuPDFParser([...])
Parse PDF using PyMuPDF.
document_loaders.parsers.pdf.PyPDFParser([...])
Load PDF using pypdf
document_loaders.parsers.pdf.PyPDFium2Parser([...])
Parse PDF with PyPDFium2.
document_loaders.parsers.txt.TextParser()
Parser for text blobs.
document_loaders.pdf.AmazonTextractPDFLoader(...)
Load PDF files from a local file system, HTTP or S3.
document_loaders.pdf.BasePDFLoader(file_path, *)
Base Loader class for PDF files.
document_loaders.pdf.DocumentIntelligenceLoader(...)
Loads a PDF with Azure Document Intelligence
document_loaders.pdf.MathpixPDFLoader(file_path)
Load PDF files using Mathpix service. | https://api.python.langchain.com/en/latest/community_api_reference.html |
document_loaders.pdf.OnlinePDFLoader(...[, ...])
Load online PDF.
document_loaders.pdf.PDFMinerLoader(file_path, *)
Load PDF files using PDFMiner.
document_loaders.pdf.PDFMinerPDFasHTMLLoader(...)
Load PDF files as HTML content using PDFMiner.
document_loaders.pdf.PDFPlumberLoader(file_path)
Load PDF files using pdfplumber.
document_loaders.pdf.PyMuPDFLoader(file_path, *)
Load PDF files using PyMuPDF.
document_loaders.pdf.PyPDFDirectoryLoader(path)
Load a directory with PDF files using pypdf and chunks at character level.
document_loaders.pdf.PyPDFLoader(file_path)
Load PDF using pypdf into list of documents.
document_loaders.pdf.PyPDFium2Loader(...[, ...])
Load PDF using pypdfium2 and chunks at character level.
document_loaders.pdf.UnstructuredPDFLoader(...)
Load PDF files using Unstructured.
document_loaders.polars_dataframe.PolarsDataFrameLoader(...)
Load Polars DataFrame.
document_loaders.powerpoint.UnstructuredPowerPointLoader(...)
Load Microsoft PowerPoint files using Unstructured.
document_loaders.psychic.PsychicLoader(...)
Load from Psychic.dev.
document_loaders.pubmed.PubMedLoader(query)
Load from the PubMed biomedical library.
document_loaders.pyspark_dataframe.PySparkDataFrameLoader([...])
Load PySpark DataFrames.
document_loaders.python.PythonLoader(file_path)
Load Python files, respecting any non-default encoding if specified.
document_loaders.quip.QuipLoader(api_url, ...)
Load Quip pages.
document_loaders.readthedocs.ReadTheDocsLoader(path)
Load ReadTheDocs documentation directory. | https://api.python.langchain.com/en/latest/community_api_reference.html |
document_loaders.recursive_url_loader.RecursiveUrlLoader(url)
Load all child links from a URL page.
document_loaders.reddit.RedditPostsLoader(...)
Load Reddit posts.
document_loaders.roam.RoamLoader(path)
Load Roam files from a directory.
document_loaders.rocksetdb.ColumnNotFoundError(...)
Column not found error.
document_loaders.rocksetdb.RocksetLoader(...)
Load from a Rockset database.
document_loaders.rspace.RSpaceLoader(global_id)
Loads content from RSpace notebooks, folders, documents or PDF Gallery files into Langchain documents.
document_loaders.rss.RSSFeedLoader([urls, ...])
Load news articles from RSS feeds using Unstructured.
document_loaders.rst.UnstructuredRSTLoader(...)
Load RST files using Unstructured.
document_loaders.rtf.UnstructuredRTFLoader(...)
Load RTF files using Unstructured.
document_loaders.s3_directory.S3DirectoryLoader(bucket)
Load from Amazon AWS S3 directory.
document_loaders.s3_file.S3FileLoader(...[, ...])
Load from Amazon AWS S3 file.
document_loaders.sharepoint.SharePointLoader
Load from SharePoint.
document_loaders.sitemap.SitemapLoader(web_path)
Load a sitemap and its URLs.
document_loaders.slack_directory.SlackDirectoryLoader(...)
Load from a Slack directory dump.
document_loaders.snowflake_loader.SnowflakeLoader(...)
Load from Snowflake API.
document_loaders.spreedly.SpreedlyLoader(...)
Load from Spreedly API.
document_loaders.srt.SRTLoader(file_path)
Load .srt (subtitle) files.
document_loaders.stripe.StripeLoader(resource)
Load from Stripe API. | https://api.python.langchain.com/en/latest/community_api_reference.html |
document_loaders.telegram.TelegramChatApiLoader([...])
Load Telegram chat json directory dump.
document_loaders.telegram.TelegramChatFileLoader(path)
Load from Telegram chat dump.
document_loaders.tencent_cos_directory.TencentCOSDirectoryLoader(...)
Load from Tencent Cloud COS directory.
document_loaders.tencent_cos_file.TencentCOSFileLoader(...)
Load from Tencent Cloud COS file.
document_loaders.tensorflow_datasets.TensorflowDatasetLoader(...)
Load from TensorFlow Dataset.
document_loaders.text.TextLoader(file_path)
Load text file.
document_loaders.tomarkdown.ToMarkdownLoader(...)
Load HTML using 2markdown API.
document_loaders.toml.TomlLoader(source)
Load TOML files.
document_loaders.trello.TrelloLoader(client, ...)
Load cards from a Trello board.
document_loaders.tsv.UnstructuredTSVLoader(...)
Load TSV files using Unstructured.
document_loaders.twitter.TwitterTweetLoader(...)
Load Twitter tweets.
document_loaders.unstructured.UnstructuredAPIFileIOLoader(file)
Load files using Unstructured API.
document_loaders.unstructured.UnstructuredAPIFileLoader([...])
Load files using Unstructured API.
document_loaders.unstructured.UnstructuredBaseLoader([...])
Base Loader that uses Unstructured.
document_loaders.unstructured.UnstructuredFileIOLoader(file)
Load files using Unstructured.
document_loaders.unstructured.UnstructuredFileLoader(...)
Load files using Unstructured.
document_loaders.url.UnstructuredURLLoader(urls)
Load files from remote URLs using Unstructured.
document_loaders.url_playwright.PlaywrightEvaluator()
Abstract base class for all evaluators.
document_loaders.url_playwright.PlaywrightURLLoader(urls)
Load HTML pages with Playwright and parse with Unstructured. | https://api.python.langchain.com/en/latest/community_api_reference.html |
document_loaders.url_playwright.UnstructuredHtmlEvaluator([...])
Evaluates the page HTML content using the unstructured library.
document_loaders.url_selenium.SeleniumURLLoader(urls)
Load HTML pages with Selenium and parse with Unstructured.
document_loaders.weather.WeatherDataLoader(...)
Load weather data with Open Weather Map API.
document_loaders.web_base.WebBaseLoader([...])
Load HTML pages using urllib and parse them with BeautifulSoup.
document_loaders.whatsapp_chat.WhatsAppChatLoader(path)
Load WhatsApp messages text file.
document_loaders.wikipedia.WikipediaLoader(query)
Load from Wikipedia.
document_loaders.word_document.Docx2txtLoader(...)
Load DOCX file using docx2txt and chunks at character level.
document_loaders.word_document.UnstructuredWordDocumentLoader(...)
Load Microsoft Word file using Unstructured.
document_loaders.xml.UnstructuredXMLLoader(...)
Load XML file using Unstructured.
document_loaders.xorbits.XorbitsLoader(...)
Load Xorbits DataFrame.
document_loaders.youtube.GoogleApiClient([...])
Generic Google API Client.
document_loaders.youtube.GoogleApiYoutubeLoader(...)
Load all Videos from a YouTube Channel.
document_loaders.youtube.YoutubeLoader(video_id)
Load YouTube transcripts.
Functions¶
document_loaders.base_o365.fetch_mime_types(...)
Fetch the mime types for the specified file types.
document_loaders.chatgpt.concatenate_rows(...)
Combine message information in a readable format ready to be used.
document_loaders.facebook_chat.concatenate_rows(row)
Combine message information in a readable format ready to be used.
document_loaders.helpers.detect_file_encodings(...)
Try to detect the file encoding.
document_loaders.notebook.concatenate_cells(...)
Combine cells information in a readable format ready to be used. | https://api.python.langchain.com/en/latest/community_api_reference.html |
document_loaders.notebook.remove_newlines(x)
Recursively remove newlines, no matter the data structure they are stored in.
document_loaders.parsers.pdf.extract_from_images_with_rapidocr(images)
Extract text from images with RapidOCR.
document_loaders.parsers.registry.get_parser(...)
Get a parser by parser name.
document_loaders.rocksetdb.default_joiner(docs)
Default joiner for content columns.
document_loaders.telegram.concatenate_rows(row)
Combine message information in a readable format ready to be used.
document_loaders.telegram.text_to_docs(text)
Convert a string or list of strings to a list of Documents with metadata.
document_loaders.unstructured.get_elements_from_api([...])
Retrieve a list of elements from the Unstructured API.
document_loaders.unstructured.satisfies_min_unstructured_version(...)
Check if the installed Unstructured version exceeds the minimum version for the feature in question.
document_loaders.unstructured.validate_unstructured_version(...)
Raise an error if the Unstructured version does not exceed the specified minimum.
document_loaders.whatsapp_chat.concatenate_rows(...)
Combine message information in a readable format ready to be used.
langchain_community.document_transformers¶
Document Transformers are classes to transform Documents.
Document Transformers are usually used to transform many Documents in a single run.
Class hierarchy:
BaseDocumentTransformer --> <name> # Examples: DoctranQATransformer, DoctranTextTranslator
Main helpers:
Document
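Example (an illustrative sketch, not part of the generated reference; Html2TextTransformer needs the optional html2text package, and the sample HTML is made up):
from langchain_core.documents import Document
from langchain_community.document_transformers import Html2TextTransformer

# Wrap raw HTML in Documents, then transform them in one batch call.
docs = [Document(page_content="<html><body><h1>Title</h1><p>Some text.</p></body></html>")]
transformer = Html2TextTransformer()
plain_docs = transformer.transform_documents(docs)  # returns new Documents with plain-text content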
Classes¶
document_transformers.beautiful_soup_transformer.BeautifulSoupTransformer()
Transform HTML content by extracting specific tags and removing unwanted ones.
document_transformers.doctran_text_extract.DoctranPropertyExtractor(...)
Extract properties from text documents using doctran.
document_transformers.doctran_text_qa.DoctranQATransformer([...]) | https://api.python.langchain.com/en/latest/community_api_reference.html |
Extract QA from text documents using doctran.
document_transformers.doctran_text_translate.DoctranTextTranslator([...])
Translate text documents using doctran.
document_transformers.embeddings_redundant_filter.EmbeddingsClusteringFilter
Perform K-means clustering on document vectors.
document_transformers.embeddings_redundant_filter.EmbeddingsRedundantFilter
Filter that drops redundant documents by comparing their embeddings.
document_transformers.google_translate.GoogleTranslateTransformer(...)
Translate text documents using Google Cloud Translation.
document_transformers.html2text.Html2TextTransformer([...])
Convert HTML documents to plain text using html2text.
document_transformers.long_context_reorder.LongContextReorder
Lost in the middle: Performance degrades when models must access relevant information in the middle of long contexts.
document_transformers.nuclia_text_transform.NucliaTextTransformer(nua)
The Nuclia Understanding API splits text into paragraphs and sentences, identifies entities, provides a summary of the text, and generates embeddings for all sentences.
document_transformers.openai_functions.OpenAIMetadataTagger
Extract metadata tags from document contents using OpenAI functions.
Functions¶
document_transformers.beautiful_soup_transformer.get_navigable_strings(element)
document_transformers.embeddings_redundant_filter.get_stateful_documents(...)
Convert a list of documents to a list of documents with state.
document_transformers.openai_functions.create_metadata_tagger(...)
Create a DocumentTransformer that uses an OpenAI function chain to automatically tag documents with metadata.
langchain_community.embeddings¶
Embedding model classes wrap embedding models from different APIs and services.
Embedding models can be LLMs or not.
Class hierarchy:
Embeddings --> <name>Embeddings # Examples: OpenAIEmbeddings, HuggingFaceEmbeddings
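Example (an illustrative sketch, not part of the generated reference; FakeEmbeddings is dependency-free and the vector size and texts are arbitrary):
from langchain_community.embeddings import FakeEmbeddings

embeddings = FakeEmbeddings(size=256)                                    # any Embeddings subclass exposes this interface
doc_vectors = embeddings.embed_documents(["first text", "second text"])  # one vector per text
query_vector = embeddings.embed_query("a search query")                  # single vector for a query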
Classes¶ | https://api.python.langchain.com/en/latest/community_api_reference.html |
embeddings.aleph_alpha.AlephAlphaAsymmetricSemanticEmbedding
Aleph Alpha's asymmetric semantic embedding.
embeddings.aleph_alpha.AlephAlphaSymmetricSemanticEmbedding
The symmetric version of the Aleph Alpha's semantic embeddings.
embeddings.awa.AwaEmbeddings
Embedding documents and queries with Awa DB.
embeddings.azure_openai.AzureOpenAIEmbeddings
Azure OpenAI Embeddings API.
embeddings.baidu_qianfan_endpoint.QianfanEmbeddingsEndpoint
Baidu Qianfan Embeddings embedding models.
embeddings.bedrock.BedrockEmbeddings
Bedrock embedding models.
embeddings.bookend.BookendEmbeddings
Bookend AI sentence_transformers embedding models.
embeddings.clarifai.ClarifaiEmbeddings
Clarifai embedding models.
embeddings.cloudflare_workersai.CloudflareWorkersAIEmbeddings
Cloudflare Workers AI embedding model.
embeddings.cohere.CohereEmbeddings
Cohere embedding models.
embeddings.dashscope.DashScopeEmbeddings
DashScope embedding models.
embeddings.databricks.DatabricksEmbeddings
Wrapper around embeddings LLMs in Databricks.
embeddings.deepinfra.DeepInfraEmbeddings
Deep Infra's embedding inference service.
embeddings.edenai.EdenAiEmbeddings
EdenAI embedding.
embeddings.elasticsearch.ElasticsearchEmbeddings(...)
Elasticsearch embedding models.
embeddings.embaas.EmbaasEmbeddings
Embaas's embedding service.
embeddings.embaas.EmbaasEmbeddingsPayload
Payload for the Embaas embeddings API.
embeddings.ernie.ErnieEmbeddings
Ernie Embeddings V1 embedding models.
embeddings.fake.DeterministicFakeEmbedding
Fake embedding model that always returns the same embedding vector for the same text. | https://api.python.langchain.com/en/latest/community_api_reference.html |
embeddings.fake.FakeEmbeddings
Fake embedding model.
embeddings.fastembed.FastEmbedEmbeddings
Qdrant FastEmbedding models.
embeddings.google_palm.GooglePalmEmbeddings
Google's PaLM Embeddings APIs.
embeddings.gpt4all.GPT4AllEmbeddings
GPT4All embedding models.
embeddings.gradient_ai.GradientEmbeddings
Gradient.ai Embedding models.
embeddings.gradient_ai.TinyAsyncGradientEmbeddingClient([...])
A helper tool to embed Gradient.
embeddings.huggingface.HuggingFaceBgeEmbeddings
HuggingFace BGE sentence_transformers embedding models.
embeddings.huggingface.HuggingFaceEmbeddings
HuggingFace sentence_transformers embedding models.
embeddings.huggingface.HuggingFaceInferenceAPIEmbeddings
Embed texts using the HuggingFace API.
embeddings.huggingface.HuggingFaceInstructEmbeddings
Wrapper around sentence_transformers embedding models.
embeddings.huggingface_hub.HuggingFaceHubEmbeddings
HuggingFaceHub embedding models.
embeddings.infinity.InfinityEmbeddings
Embedding models for self-hosted https://github.com/michaelfeil/infinity. This should also work for text-embeddings-inference and other self-hosted OpenAI-compatible servers.
embeddings.infinity.TinyAsyncOpenAIInfinityEmbeddingClient([...])
A helper tool to embed Infinity.
embeddings.javelin_ai_gateway.JavelinAIGatewayEmbeddings
Wrapper around embeddings LLMs in the Javelin AI Gateway.
embeddings.jina.JinaEmbeddings
Jina embedding models.
embeddings.johnsnowlabs.JohnSnowLabsEmbeddings
JohnSnowLabs embedding models
embeddings.llamacpp.LlamaCppEmbeddings
llama.cpp embedding models. | https://api.python.langchain.com/en/latest/community_api_reference.html |
embeddings.llm_rails.LLMRailsEmbeddings
LLMRails embedding models.
embeddings.localai.LocalAIEmbeddings
LocalAI embedding models.
embeddings.minimax.MiniMaxEmbeddings
MiniMax's embedding service.
embeddings.mlflow.MlflowEmbeddings
Wrapper around embeddings LLMs in MLflow.
embeddings.mlflow_gateway.MlflowAIGatewayEmbeddings
Wrapper around embeddings LLMs in the MLflow AI Gateway.
embeddings.modelscope_hub.ModelScopeEmbeddings
ModelScopeHub embedding models.
embeddings.mosaicml.MosaicMLInstructorEmbeddings
MosaicML embedding service.
embeddings.nlpcloud.NLPCloudEmbeddings
NLP Cloud embedding models.
embeddings.octoai_embeddings.OctoAIEmbeddings
OctoAI Compute Service embedding models.
embeddings.ollama.OllamaEmbeddings
Ollama locally runs large language models.
embeddings.openai.OpenAIEmbeddings
OpenAI embedding models.
embeddings.sagemaker_endpoint.EmbeddingsContentHandler()
Content handler for LLM class.
embeddings.sagemaker_endpoint.SagemakerEndpointEmbeddings
Custom Sagemaker Inference Endpoints.
embeddings.self_hosted.SelfHostedEmbeddings
Custom embedding models on self-hosted remote hardware.
embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceEmbeddings
HuggingFace embedding models on self-hosted remote hardware.
embeddings.self_hosted_hugging_face.SelfHostedHuggingFaceInstructEmbeddings
HuggingFace InstructEmbedding models on self-hosted remote hardware.
embeddings.spacy_embeddings.SpacyEmbeddings
Embeddings by SpaCy models.
embeddings.tensorflow_hub.TensorflowHubEmbeddings
TensorflowHub embedding models.
embeddings.vertexai.VertexAIEmbeddings
Google Cloud VertexAI embedding models.
embeddings.voyageai.VoyageEmbeddings
Voyage embedding models.
embeddings.xinference.XinferenceEmbeddings([...])
Xinference embedding models.
Functions¶
embeddings.dashscope.embed_with_retry(...)
Use tenacity to retry the embedding call.
embeddings.google_palm.embed_with_retry(...)
Use tenacity to retry the completion call.
embeddings.localai.async_embed_with_retry(...)
Use tenacity to retry the embedding call.
embeddings.localai.embed_with_retry(...)
Use tenacity to retry the embedding call.
embeddings.minimax.embed_with_retry(...)
Use tenacity to retry the completion call.
embeddings.openai.async_embed_with_retry(...)
Use tenacity to retry the embedding call.
embeddings.openai.embed_with_retry(...)
Use tenacity to retry the embedding call.
embeddings.self_hosted_hugging_face.load_embedding_model(...)
Load the embedding model.
embeddings.voyageai.embed_with_retry(...)
Use tenacity to retry the embedding call.
langchain_community.graphs¶
Graphs provide a natural language interface to graph databases.
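Example (an illustrative sketch, not part of the generated reference; NetworkxEntityGraph needs the networkx package, and the triple contents are made up):
from langchain_community.graphs import NetworkxEntityGraph
from langchain_community.graphs.networkx_graph import KnowledgeTriple

graph = NetworkxEntityGraph()   # in-memory entity graph
graph.add_triple(KnowledgeTriple(subject="LangChain", predicate="is written in", object_="Python"))
print(graph.get_triples())      # inspect the stored triples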
Classes¶
graphs.arangodb_graph.ArangoGraph(db)
ArangoDB wrapper for graph operations.
graphs.falkordb_graph.FalkorDBGraph(database)
FalkorDB wrapper for graph operations.
graphs.graph_document.GraphDocument
Represents a graph document consisting of nodes and relationships.
graphs.graph_document.Node
Represents a node in a graph with associated properties.
graphs.graph_document.Relationship
Represents a directed relationship between two nodes in a graph.
graphs.graph_store.GraphStore()
An abstract class wrapper for graph operations. | https://api.python.langchain.com/en/latest/community_api_reference.html |
graphs.hugegraph.HugeGraph([username, ...])
HugeGraph wrapper for graph operations.
graphs.kuzu_graph.KuzuGraph(db[, database])
Kùzu wrapper for graph operations.
graphs.memgraph_graph.MemgraphGraph(url, ...)
Memgraph wrapper for graph operations.
graphs.nebula_graph.NebulaGraph(space[, ...])
NebulaGraph wrapper for graph operations.
graphs.neo4j_graph.Neo4jGraph([url, ...])
Neo4j wrapper for graph operations.
graphs.neptune_graph.NeptuneGraph(host[, ...])
Neptune wrapper for graph operations.
graphs.neptune_graph.NeptuneQueryException(...)
A class to handle queries that fail to execute
graphs.networkx_graph.KnowledgeTriple(...)
A triple in the graph.
graphs.networkx_graph.NetworkxEntityGraph([graph])
Networkx wrapper for entity graph operations.
graphs.rdf_graph.RdfGraph([source_file, ...])
RDFlib wrapper for graph operations.
Functions¶
graphs.arangodb_graph.get_arangodb_client([...])
Get the Arango DB client from credentials.
graphs.networkx_graph.get_entities(entity_str)
Extract entities from entity string.
graphs.networkx_graph.parse_triples(...)
Parse knowledge triples from the knowledge string.
langchain_community.indexes¶
Classes¶
indexes.base.RecordManager(namespace)
An abstract base class representing the interface for a record manager.
langchain_community.llms¶
LLM classes provide
access to the large language model (LLM) APIs and services.
Class hierarchy:
BaseLanguageModel --> BaseLLM --> LLM --> <name> # Examples: AI21, HuggingFaceHub, OpenAI
Main helpers:
LLMResult, PromptValue,
CallbackManagerForLLMRun, AsyncCallbackManagerForLLMRun,
CallbackManager, AsyncCallbackManager,
AIMessage, BaseMessage
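Example (an illustrative sketch, not part of the generated reference; FakeListLLM is dependency-free and the canned response is made up):
from langchain_community.llms.fake import FakeListLLM

llm = FakeListLLM(responses=["Paris"])                # any LLM subclass exposes the same interface
print(llm.invoke("What is the capital of France?"))   # -> "Paris"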
Classes¶
llms.ai21.AI21
AI21 large language models.
llms.ai21.AI21PenaltyData
Parameters for AI21 penalty data.
llms.aleph_alpha.AlephAlpha
Aleph Alpha large language models.
llms.amazon_api_gateway.AmazonAPIGateway
Amazon API Gateway to access LLM models hosted on AWS.
llms.amazon_api_gateway.ContentHandlerAmazonAPIGateway()
Adapter to prepare the inputs from LangChain to a format that the LLM model expects.
llms.anthropic.Anthropic
Anthropic large language models.
llms.anyscale.Anyscale
Anyscale large language models.
llms.arcee.Arcee
Arcee's Domain Adapted Language Models (DALMs).
llms.aviary.Aviary
Aviary hosted models.
llms.aviary.AviaryBackend(backend_url, bearer)
Aviary backend.
llms.azureml_endpoint.AzureMLEndpointClient(...)
AzureML Managed Endpoint client.
llms.azureml_endpoint.AzureMLOnlineEndpoint
Azure ML Online Endpoint models.
llms.azureml_endpoint.ContentFormatterBase()
Transform request and response of AzureML endpoint to match with required schema.
llms.azureml_endpoint.DollyContentFormatter()
Content handler for the Dolly-v2-12b model
llms.azureml_endpoint.GPT2ContentFormatter()
Content handler for GPT2
llms.azureml_endpoint.HFContentFormatter()
Content handler for LLMs from the HuggingFace catalog.
llms.azureml_endpoint.LlamaContentFormatter()
Content formatter for LLaMa
llms.azureml_endpoint.OSSContentFormatter()
Deprecated: Kept for backwards compatibility
llms.baidu_qianfan_endpoint.QianfanLLMEndpoint
Baidu Qianfan hosted open source or customized models.
llms.bananadev.Banana
Banana large language models.
llms.baseten.Baseten
Baseten models.
llms.beam.Beam
Beam API for gpt2 large language model.
llms.bedrock.Bedrock
Bedrock models.
llms.bedrock.BedrockBase
Base class for Bedrock models.
llms.bedrock.LLMInputOutputAdapter()
Adapter class to prepare the inputs from LangChain to a format that the LLM model expects.
llms.bittensor.NIBittensorLLM
NIBittensor LLMs
llms.cerebriumai.CerebriumAI
CerebriumAI large language models.
llms.chatglm.ChatGLM
ChatGLM LLM service.
llms.clarifai.Clarifai
Clarifai large language models.
llms.cloudflare_workersai.CloudflareWorkersAI
LangChain LLM class for accessing the Cloudflare Workers AI service.
llms.cohere.BaseCohere
Base class for Cohere models.
llms.cohere.Cohere
Cohere large language models.
llms.ctransformers.CTransformers
C Transformers LLM models.
llms.ctranslate2.CTranslate2
CTranslate2 language model.
llms.databricks.Databricks
Databricks serving endpoint or a cluster driver proxy app for LLM.
llms.deepinfra.DeepInfra
DeepInfra models.
llms.deepsparse.DeepSparse
Neural Magic DeepSparse LLM interface. | https://api.python.langchain.com/en/latest/community_api_reference.html |
llms.edenai.EdenAI
Wrapper around edenai models.
llms.fake.FakeListLLM
Fake LLM for testing purposes.
llms.fake.FakeStreamingListLLM
Fake streaming list LLM for testing purposes.
llms.fireworks.Fireworks
Fireworks models.
llms.forefrontai.ForefrontAI
ForefrontAI large language models.
llms.gigachat.GigaChat
GigaChat large language models API.
llms.google_palm.GooglePalm
[Deprecated] DEPRECATED: Use langchain_google_genai.GoogleGenerativeAI instead.
llms.gooseai.GooseAI
GooseAI large language models.
llms.gpt4all.GPT4All
GPT4All language models.
llms.gradient_ai.GradientLLM
Gradient.ai LLM Endpoints.
llms.gradient_ai.TrainResult
Train result.
llms.huggingface_endpoint.HuggingFaceEndpoint
HuggingFace Endpoint models.
llms.huggingface_hub.HuggingFaceHub
HuggingFaceHub models.
llms.huggingface_pipeline.HuggingFacePipeline
HuggingFace Pipeline API.
llms.huggingface_text_gen_inference.HuggingFaceTextGenInference
HuggingFace text generation API.
llms.human.HumanInputLLM
It returns user input as the response.
llms.javelin_ai_gateway.JavelinAIGateway
Javelin AI Gateway LLMs.
llms.javelin_ai_gateway.Params
Parameters for the Javelin AI Gateway LLM.
llms.koboldai.KoboldApiLLM
Kobold API language model.
llms.llamacpp.LlamaCpp
llama.cpp model.
llms.manifest.ManifestWrapper
HazyResearch's Manifest library.
llms.minimax.Minimax
Wrapper around Minimax large language models.
llms.minimax.MinimaxCommon
Common parameters for Minimax large language models.
llms.mlflow.Mlflow
Wrapper around completions LLMs in MLflow.
llms.mlflow.Params
Parameters for MLflow
llms.mlflow_ai_gateway.MlflowAIGateway
Wrapper around completions LLMs in the MLflow AI Gateway.
llms.mlflow_ai_gateway.Params
Parameters for the MLflow AI Gateway LLM.
llms.modal.Modal
Modal large language models.
llms.mosaicml.MosaicML
MosaicML LLM service.
llms.nlpcloud.NLPCloud
NLPCloud large language models.
llms.octoai_endpoint.OctoAIEndpoint
OctoAI LLM Endpoints.
llms.ollama.Ollama
Ollama locally runs large language models.
llms.ollama.OllamaEndpointNotFoundError
Raised when the Ollama endpoint is not found.
llms.opaqueprompts.OpaquePrompts
An LLM wrapper that uses OpaquePrompts to sanitize prompts.
llms.openai.AzureOpenAI
Azure-specific OpenAI large language models.
llms.openai.BaseOpenAI
Base OpenAI large language model class.
llms.openai.OpenAI
OpenAI large language models.
llms.openai.OpenAIChat
OpenAI Chat large language models.
llms.openllm.IdentifyingParams
Parameters for identifying a model as a typed dict.
llms.openllm.OpenLLM
OpenLLM, supporting both in-process model instance and remote OpenLLM servers.
llms.openlm.OpenLM
OpenLM models. | https://api.python.langchain.com/en/latest/community_api_reference.html |
llms.pai_eas_endpoint.PaiEasEndpoint
LangChain LLM class for accessing the Alibaba Cloud PAI-EAS LLM service.
llms.petals.Petals
Petals Bloom models.
llms.pipelineai.PipelineAI
PipelineAI large language models.
llms.predibase.Predibase
Use your Predibase models with Langchain.
llms.predictionguard.PredictionGuard
Prediction Guard large language models.
llms.promptlayer_openai.PromptLayerOpenAI
PromptLayer OpenAI large language models.
llms.promptlayer_openai.PromptLayerOpenAIChat
Wrapper around OpenAI large language models.
llms.replicate.Replicate
Replicate models.
llms.rwkv.RWKV
RWKV language models.
llms.sagemaker_endpoint.ContentHandlerBase()
A handler class to transform input from LLM to a format that SageMaker endpoint expects.
llms.sagemaker_endpoint.LLMContentHandler()
Content handler for LLM class.
llms.sagemaker_endpoint.LineIterator(stream)
A helper class for parsing the byte stream input.
llms.sagemaker_endpoint.SagemakerEndpoint
Sagemaker Inference Endpoint models.
llms.self_hosted.SelfHostedPipeline
Model inference on self-hosted remote hardware.
llms.self_hosted_hugging_face.SelfHostedHuggingFaceLLM
HuggingFace Pipeline API to run on self-hosted remote hardware.
llms.stochasticai.StochasticAI
StochasticAI large language models.
llms.symblai_nebula.Nebula
Nebula Service models.
llms.textgen.TextGen
text-generation-webui models.
llms.titan_takeoff.TitanTakeoff
Wrapper around Titan Takeoff APIs. | https://api.python.langchain.com/en/latest/community_api_reference.html |
llms.titan_takeoff_pro.TitanTakeoffPro
Create a new model by parsing and validating input data from keyword arguments.
llms.together.Together
Wrapper around Together AI models.
llms.tongyi.Tongyi
Tongyi Qwen large language models.
llms.vertexai.VertexAI
Google Vertex AI large language models.
llms.vertexai.VertexAIModelGarden
Large language models served from Vertex AI Model Garden.
llms.vllm.VLLM
VLLM language model.
llms.vllm.VLLMOpenAI
vLLM OpenAI-compatible API client
llms.volcengine_maas.VolcEngineMaasBase
Base class for VolcEngineMaas models.
llms.volcengine_maas.VolcEngineMaasLLM
Volc Engine MaaS hosts a plethora of models.
llms.watsonxllm.WatsonxLLM
IBM watsonx.ai large language models.
llms.writer.Writer
Writer large language models.
llms.xinference.Xinference
Wrapper for accessing Xinference's large-scale model inference service.
llms.yandex.YandexGPT
Yandex large language models.
Functions¶
llms.anyscale.create_llm_result(choices, ...)
Create the LLMResult from the choices and prompts.
llms.anyscale.update_token_usage(keys, ...)
Update token usage.
llms.aviary.get_completions(model, prompt[, ...])
Get completions from Aviary models.
llms.aviary.get_models()
List available models
llms.cohere.acompletion_with_retry(llm, **kwargs)
Use tenacity to retry the completion call. | https://api.python.langchain.com/en/latest/community_api_reference.html |
llms.cohere.completion_with_retry(llm, **kwargs)
Use tenacity to retry the completion call.
llms.databricks.get_default_api_token()
Gets the default Databricks personal access token.
llms.databricks.get_default_host()
Gets the default Databricks workspace hostname.
llms.databricks.get_repl_context()
Gets the notebook REPL context if running inside a Databricks notebook.
llms.fireworks.acompletion_with_retry(llm, ...)
Use tenacity to retry the completion call.
llms.fireworks.acompletion_with_retry_batching(...)
Use tenacity to retry the completion call.
llms.fireworks.acompletion_with_retry_streaming(...)
Use tenacity to retry the completion call for streaming.
llms.fireworks.completion_with_retry(llm, ...)
Use tenacity to retry the completion call.
llms.fireworks.completion_with_retry_batching(...)
Use tenacity to retry the completion call.
llms.fireworks.conditional_decorator(...)
llms.google_palm.completion_with_retry(llm, ...)
Use tenacity to retry the completion call.
llms.koboldai.clean_url(url)
Remove trailing slash and /api from url if present.
llms.loading.load_llm(file)
Load LLM from file.
llms.loading.load_llm_from_config(config)
Load LLM from Config Dict.
llms.openai.acompletion_with_retry(llm[, ...])
Use tenacity to retry the async completion call.
llms.openai.completion_with_retry(llm[, ...])
Use tenacity to retry the completion call.
llms.openai.update_token_usage(keys, ...)
Update token usage.
llms.symblai_nebula.completion_with_retry(...)
Use tenacity to retry the completion call.
llms.symblai_nebula.make_request(self, prompt)
Generate text from the model.
llms.tongyi.generate_with_retry(llm, **kwargs)
Use tenacity to retry the completion call.
llms.tongyi.stream_generate_with_retry(llm, ...)
Use tenacity to retry the completion call.
llms.utils.enforce_stop_tokens(text, stop)
Cut off the text as soon as any stop words occur.
llms.vertexai.acompletion_with_retry(llm, prompt)
Use tenacity to retry the completion call.
llms.vertexai.completion_with_retry(llm, prompt)
Use tenacity to retry the completion call.
llms.vertexai.is_codey_model(model_name)
Returns True if the model name is a Codey model.
llms.vertexai.is_gemini_model(model_name)
Returns True if the model name is a Gemini model.
langchain_community.retrievers¶
Retriever class returns Documents given a text query.
It is more general than a vector store. A retriever does not need to be able to
store documents, only to return (or retrieve) them. Vector stores can be used as
the backbone of a retriever, but there are other types of retrievers as well.
Class hierarchy:
BaseRetriever --> <name>Retriever # Examples: ArxivRetriever, MergerRetriever
Main helpers:
Document, Serializable, Callbacks,
CallbackManagerForRetrieverRun, AsyncCallbackManagerForRetrieverRun
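Example (an illustrative sketch, not part of the generated reference; BM25Retriever needs the optional rank_bm25 package, and the texts and query are made up):
from langchain_community.retrievers import BM25Retriever

retriever = BM25Retriever.from_texts(["foo", "bar", "hello world", "foo bar"])
docs = retriever.get_relevant_documents("foo")   # returns a list of Documents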
Classes¶
retrievers.arcee.ArceeRetriever
Document retriever for Arcee's Domain Adapted Language Models (DALMs). | https://api.python.langchain.com/en/latest/community_api_reference.html |
retrievers.arxiv.ArxivRetriever
Arxiv retriever.
retrievers.azure_cognitive_search.AzureCognitiveSearchRetriever
Azure Cognitive Search service retriever.
retrievers.bedrock.AmazonKnowledgeBasesRetriever
A retriever class for Amazon Bedrock Knowledge Bases.
retrievers.bedrock.RetrievalConfig
Create a new model by parsing and validating input data from keyword arguments.
retrievers.bedrock.VectorSearchConfig
Create a new model by parsing and validating input data from keyword arguments.
retrievers.bm25.BM25Retriever
BM25 retriever without Elasticsearch.
retrievers.chaindesk.ChaindeskRetriever
Chaindesk API retriever.
retrievers.chatgpt_plugin_retriever.ChatGPTPluginRetriever
ChatGPT plugin retriever.
retrievers.cohere_rag_retriever.CohereRagRetriever
Cohere Chat API with RAG.
retrievers.databerry.DataberryRetriever
Databerry API retriever.
retrievers.docarray.DocArrayRetriever
DocArray Document Indices retriever.
retrievers.docarray.SearchType(value[, ...])
Enumerator of the types of search to perform.
retrievers.elastic_search_bm25.ElasticSearchBM25Retriever
Elasticsearch retriever that uses BM25.
retrievers.embedchain.EmbedchainRetriever
Embedchain retriever.
retrievers.google_cloud_documentai_warehouse.GoogleDocumentAIWarehouseRetriever
A retriever based on Document AI Warehouse.
retrievers.google_vertex_ai_search.GoogleCloudEnterpriseSearchRetriever
Google Vertex Search API retriever alias for backwards compatibility. | https://api.python.langchain.com/en/latest/community_api_reference.html |
retrievers.google_vertex_ai_search.GoogleVertexAIMultiTurnSearchRetriever
Google Vertex AI Search retriever for multi-turn conversations.
retrievers.google_vertex_ai_search.GoogleVertexAISearchRetriever
Google Vertex AI Search retriever.
retrievers.kay.KayAiRetriever
Retriever for Kay.ai datasets.
retrievers.kendra.AdditionalResultAttribute
Additional result attribute.
retrievers.kendra.AdditionalResultAttributeValue
Value of an additional result attribute.
retrievers.kendra.AmazonKendraRetriever
Amazon Kendra Index retriever.
retrievers.kendra.DocumentAttribute
Document attribute.
retrievers.kendra.DocumentAttributeValue
Value of a document attribute.
retrievers.kendra.Highlight
Information that highlights the keywords in the excerpt.
retrievers.kendra.QueryResult
Amazon Kendra Query API search result.
retrievers.kendra.QueryResultItem
Query API result item.
retrievers.kendra.ResultItem
Base class of a result item.
retrievers.kendra.RetrieveResult
Amazon Kendra Retrieve API search result.
retrievers.kendra.RetrieveResultItem
Retrieve API result item.
retrievers.kendra.TextWithHighLights
Text with highlights.
retrievers.knn.KNNRetriever
KNN retriever.
retrievers.llama_index.LlamaIndexGraphRetriever
LlamaIndex graph data structure retriever.
retrievers.llama_index.LlamaIndexRetriever
LlamaIndex retriever.
retrievers.metal.MetalRetriever
Metal API retriever.
retrievers.milvus.MilvusRetriever
Milvus API retriever.
retrievers.outline.OutlineRetriever
Retriever for Outline API.
retrievers.pinecone_hybrid_search.PineconeHybridSearchRetriever
Pinecone Hybrid Search retriever.
retrievers.pubmed.PubMedRetriever
PubMed API retriever.
retrievers.remote_retriever.RemoteLangChainRetriever
LangChain API retriever.
retrievers.svm.SVMRetriever
SVM retriever.
retrievers.tavily_search_api.SearchDepth(value)
Search depth as enumerator.
retrievers.tavily_search_api.TavilySearchAPIRetriever
Tavily Search API retriever.
retrievers.tfidf.TFIDFRetriever
TF-IDF retriever.
retrievers.vespa_retriever.VespaRetriever
Vespa retriever.
retrievers.weaviate_hybrid_search.WeaviateHybridSearchRetriever
Weaviate hybrid search retriever.
retrievers.wikipedia.WikipediaRetriever
Wikipedia API retriever.
retrievers.you.YouRetriever
You retriever that uses You.com's search API.
retrievers.zep.SearchScope(value[, names, ...])
Which documents to search.
retrievers.zep.SearchType(value[, names, ...])
Enumerator of the types of search to perform.
retrievers.zep.ZepRetriever
Zep MemoryStore Retriever.
retrievers.zilliz.ZillizRetriever
Zilliz API retriever.
Functions¶
retrievers.bm25.default_preprocessing_func(text)
retrievers.kendra.clean_excerpt(excerpt)
Clean an excerpt from Kendra. | https://api.python.langchain.com/en/latest/community_api_reference.html |
retrievers.kendra.combined_text(item)
Combine a ResultItem title and excerpt into a single string.
retrievers.knn.create_index(contexts, embeddings)
Create an index of embeddings for a list of contexts.
retrievers.milvus.MilvusRetreiver(*args, ...)
Deprecated MilvusRetreiver.
retrievers.pinecone_hybrid_search.create_index(...)
Create an index from a list of contexts.
retrievers.pinecone_hybrid_search.hash_text(text)
Hash a text using SHA256.
retrievers.svm.create_index(contexts, embeddings)
Create an index of embeddings for a list of contexts.
retrievers.zilliz.ZillizRetreiver(*args, ...)
Deprecated ZillizRetreiver.
langchain_community.storage¶
Implementations of key-value stores and storage helpers.
Module provides implementations of various key-value stores that conform
to a simple key-value interface.
The primary goal of these storages is to support implementation of caching.
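Example (an illustrative sketch, not part of the generated reference; RedisStore needs the redis package and a reachable Redis server, and the URL, keys, and values are made up):
from langchain_community.storage import RedisStore

store = RedisStore(redis_url="redis://localhost:6379")
store.mset([("alpha", b"1"), ("beta", b"2")])   # the BaseStore interface works on (key, bytes) pairs
print(store.mget(["alpha", "beta"]))            # [b"1", b"2"]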
Classes¶
storage.exceptions.InvalidKeyException
Raised when a key is invalid; e.g., uses incorrect characters.
storage.redis.RedisStore(*[, client, ...])
BaseStore implementation using Redis as the underlying store.
storage.upstash_redis.UpstashRedisByteStore(*)
BaseStore implementation using Upstash Redis as the underlying store to store raw bytes.
storage.upstash_redis.UpstashRedisStore(*[, ...])
[Deprecated] BaseStore implementation using Upstash Redis as the underlying store to store strings.
langchain_community.tools¶
Tools are classes that an Agent uses to interact with the world.
Each tool has a description. Agent uses the description to choose the right | https://api.python.langchain.com/en/latest/community_api_reference.html |
tool for the job.
Class hierarchy:
ToolMetaclass --> BaseTool --> <name>Tool # Examples: AIPluginTool, BaseGraphQLTool
<name> # Examples: BraveSearch, HumanInputRun
Main helpers:
CallbackManagerForToolRun, AsyncCallbackManagerForToolRun
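Example (an illustrative sketch, not part of the generated reference; DuckDuckGoSearchRun needs the optional duckduckgo-search package, and the query is made up):
from langchain_community.tools import DuckDuckGoSearchRun

search = DuckDuckGoSearchRun()
print(search.name, "-", search.description)   # agents pick tools based on this description
result = search.run("LangChain")              # run the tool directly with a query string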
Classes¶
tools.ainetwork.app.AINAppOps
Tool for app operations.
tools.ainetwork.app.AppOperationType(value)
Type of app operation as enumerator.
tools.ainetwork.app.AppSchema
Schema for app operations.
tools.ainetwork.base.AINBaseTool
Base class for the AINetwork tools.
tools.ainetwork.base.OperationType(value[, ...])
Type of operation as enumerator.
tools.ainetwork.owner.AINOwnerOps
Tool for owner operations.
tools.ainetwork.owner.RuleSchema
Schema for owner operations.
tools.ainetwork.rule.AINRuleOps
Tool for rule operations.
tools.ainetwork.rule.RuleSchema
Schema for rule operations.
tools.ainetwork.transfer.AINTransfer
Tool for transfer operations.
tools.ainetwork.transfer.TransferSchema
Schema for transfer operations.
tools.ainetwork.value.AINValueOps
Tool for value operations.
tools.ainetwork.value.ValueSchema
Schema for value operations.
tools.amadeus.base.AmadeusBaseTool
Base Tool for Amadeus.
tools.amadeus.closest_airport.AmadeusClosestAirport
Tool for finding the closest airport to a particular location.
tools.amadeus.closest_airport.ClosestAirportSchema
Schema for the AmadeusClosestAirport tool.
tools.amadeus.flight_search.AmadeusFlightSearch
Tool for searching for a single flight between two airports.
tools.amadeus.flight_search.FlightSearchSchema | https://api.python.langchain.com/en/latest/community_api_reference.html |
Schema for the AmadeusFlightSearch tool.
tools.arxiv.tool.ArxivInput
Create a new model by parsing and validating input data from keyword arguments.
tools.arxiv.tool.ArxivQueryRun
Tool that searches the Arxiv API.
tools.azure_cognitive_services.form_recognizer.AzureCogsFormRecognizerTool
Tool that queries the Azure Cognitive Services Form Recognizer API.
tools.azure_cognitive_services.image_analysis.AzureCogsImageAnalysisTool
Tool that queries the Azure Cognitive Services Image Analysis API.
tools.azure_cognitive_services.speech2text.AzureCogsSpeech2TextTool
Tool that queries the Azure Cognitive Services Speech2Text API.
tools.azure_cognitive_services.text2speech.AzureCogsText2SpeechTool
Tool that queries the Azure Cognitive Services Text2Speech API.
tools.azure_cognitive_services.text_analytics_health.AzureCogsTextAnalyticsHealthTool
Tool that queries the Azure Cognitive Services Text Analytics for Health API.
tools.bearly.tool.BearlyInterpreterTool(api_key)
Tool for evaluating python code in a sandbox environment.
tools.bearly.tool.BearlyInterpreterToolArguments
Arguments for the BearlyInterpreterTool.
tools.bearly.tool.FileInfo
Information about a file to be uploaded.
tools.bing_search.tool.BingSearchResults
Tool that queries the Bing Search API and gets back json.
tools.bing_search.tool.BingSearchRun
Tool that queries the Bing search API.
tools.brave_search.tool.BraveSearch
Tool that queries the BraveSearch.
tools.clickup.tool.ClickupAction
Tool that queries the Clickup API.
tools.dataforseo_api_search.tool.DataForSeoAPISearchResults
Tool that queries the DataForSeo Google Search API and get back json.
tools.dataforseo_api_search.tool.DataForSeoAPISearchRun | https://api.python.langchain.com/en/latest/community_api_reference.html |
Tool that queries the DataForSeo Google search API.
tools.ddg_search.tool.DDGInput
Create a new model by parsing and validating input data from keyword arguments.
tools.ddg_search.tool.DuckDuckGoSearchResults
Tool that queries the DuckDuckGo search API and gets back json.
tools.ddg_search.tool.DuckDuckGoSearchRun
Tool that queries the DuckDuckGo search API.
tools.e2b_data_analysis.tool.E2BDataAnalysisTool
Tool for running python code in a sandboxed environment for data analysis.
tools.e2b_data_analysis.tool.E2BDataAnalysisToolArguments
Arguments for the E2BDataAnalysisTool.
tools.e2b_data_analysis.tool.UploadedFile
Description of the uploaded path with its remote path.
tools.e2b_data_analysis.unparse.Unparser(tree)
Methods in this class recursively traverse an AST and output source code for the abstract syntax; original formatting is disregarded.
tools.edenai.audio_speech_to_text.EdenAiSpeechToTextTool
Tool that queries the Eden AI Speech To Text API.
tools.edenai.audio_text_to_speech.EdenAiTextToSpeechTool
Tool that queries the Eden AI Text to speech API.
tools.edenai.edenai_base_tool.EdenaiTool
The base tool for all EdenAI tools.
tools.edenai.image_explicitcontent.EdenAiExplicitImageTool
Tool that queries the Eden AI Explicit image detection.
tools.edenai.image_objectdetection.EdenAiObjectDetectionTool
Tool that queries the Eden AI Object detection API.
tools.edenai.ocr_identityparser.EdenAiParsingIDTool
Tool that queries the Eden AI Identity parsing API.
tools.edenai.ocr_invoiceparser.EdenAiParsingInvoiceTool | https://api.python.langchain.com/en/latest/community_api_reference.html |
Tool that queries the Eden AI Invoice parsing API.
tools.edenai.text_moderation.EdenAiTextModerationTool
Tool that queries the Eden AI Explicit text detection.
tools.eleven_labs.models.ElevenLabsModel(value)
Models available for Eleven Labs Text2Speech.
tools.eleven_labs.text2speech.ElevenLabsModel(value)
Models available for Eleven Labs Text2Speech.
tools.eleven_labs.text2speech.ElevenLabsText2SpeechTool
Tool that queries the Eleven Labs Text2Speech API.
tools.file_management.copy.CopyFileTool
Tool that copies a file.
tools.file_management.copy.FileCopyInput
Input for CopyFileTool.
tools.file_management.delete.DeleteFileTool
Tool that deletes a file.
tools.file_management.delete.FileDeleteInput
Input for DeleteFileTool.
tools.file_management.file_search.FileSearchInput
Input for FileSearchTool.
tools.file_management.file_search.FileSearchTool
Tool that searches for files in a subdirectory that match a regex pattern.
tools.file_management.list_dir.DirectoryListingInput
Input for ListDirectoryTool.
tools.file_management.list_dir.ListDirectoryTool
Tool that lists files and directories in a specified folder.
tools.file_management.move.FileMoveInput
Input for MoveFileTool.
tools.file_management.move.MoveFileTool
Tool that moves a file.
tools.file_management.read.ReadFileInput
Input for ReadFileTool.
tools.file_management.read.ReadFileTool
Tool that reads a file.
tools.file_management.utils.BaseFileToolMixin
Mixin for file system tools.
tools.file_management.utils.FileValidationError
Error for paths outside the root directory.
tools.file_management.write.WriteFileInput
Input for WriteFileTool.
tools.file_management.write.WriteFileTool
Tool that writes a file to disk. | https://api.python.langchain.com/en/latest/community_api_reference.html |
tools.github.tool.GitHubAction
Tool for interacting with the GitHub API.
tools.gitlab.tool.GitLabAction
Tool for interacting with the GitLab API.
tools.gmail.base.GmailBaseTool
Base class for Gmail tools.
tools.gmail.create_draft.CreateDraftSchema
Input for CreateDraftTool.
tools.gmail.create_draft.GmailCreateDraft
Tool that creates a draft email for Gmail.
tools.gmail.get_message.GmailGetMessage
Tool that gets a message by ID from Gmail.
tools.gmail.get_message.SearchArgsSchema
Input for GetMessageTool.
tools.gmail.get_thread.GetThreadSchema
Input for GetMessageTool.
tools.gmail.get_thread.GmailGetThread
Tool that gets a thread by ID from Gmail.
tools.gmail.search.GmailSearch
Tool that searches for messages or threads in Gmail.
tools.gmail.search.Resource(value[, names, ...])
Enumerator of Resources to search.
tools.gmail.search.SearchArgsSchema
Input for SearchGmailTool.
tools.gmail.send_message.GmailSendMessage
Tool that sends a message to Gmail.
tools.gmail.send_message.SendMessageSchema
Input for SendMessageTool.
tools.golden_query.tool.GoldenQueryRun
Tool that adds the capability to query using the Golden API and get back JSON.
tools.google_cloud.texttospeech.GoogleCloudTextToSpeechTool
Tool that queries the Google Cloud Text to Speech API.
tools.google_finance.tool.GoogleFinanceQueryRun
Tool that queries the Google Finance API.
tools.google_jobs.tool.GoogleJobsQueryRun
Tool that queries the Google Jobs API.
tools.google_lens.tool.GoogleLensQueryRun
Tool that queries the Google Lens API.
tools.google_places.tool.GooglePlacesSchema
Input for GooglePlacesTool.
tools.google_places.tool.GooglePlacesTool
Tool that queries the Google places API. | https://api.python.langchain.com/en/latest/community_api_reference.html |
tools.google_scholar.tool.GoogleScholarQueryRun
Tool that queries the Google search API.
tools.google_search.tool.GoogleSearchResults
Tool that queries the Google Search API and gets back json.
tools.google_search.tool.GoogleSearchRun
Tool that queries the Google search API.
tools.google_serper.tool.GoogleSerperResults
Tool that queries the Serper.dev Google Search API and get back json.
tools.google_serper.tool.GoogleSerperRun
Tool that queries the Serper.dev Google search API.
tools.google_trends.tool.GoogleTrendsQueryRun
Tool that queries the Google trends API.
tools.graphql.tool.BaseGraphQLTool
Base tool for querying a GraphQL API.
tools.human.tool.HumanInputRun
Tool that asks user for input.
tools.ifttt.IFTTTWebhook
IFTTT Webhook.
tools.jira.tool.JiraAction
Tool that queries the Atlassian Jira API.
tools.json.tool.JsonGetValueTool
Tool for getting a value in a JSON spec.
tools.json.tool.JsonListKeysTool
Tool for listing keys in a JSON spec.
tools.json.tool.JsonSpec
Base class for JSON spec.
tools.memorize.tool.Memorize
Create a new model by parsing and validating input data from keyword arguments.
tools.memorize.tool.TrainableLLM(*args, **kwargs)
tools.merriam_webster.tool.MerriamWebsterQueryRun
Tool that searches the Merriam-Webster API.
tools.metaphor_search.tool.MetaphorSearchResults
Tool that queries the Metaphor Search API and gets back json.
tools.multion.close_session.CloseSessionSchema
Input for UpdateSessionTool.
tools.multion.close_session.MultionCloseSession
Tool that closes an existing Multion Browser Window with provided fields.
tools.multion.create_session.CreateSessionSchema
Input for CreateSessionTool.
tools.multion.create_session.MultionCreateSession
Tool that creates a new Multion Browser Window with provided fields.
tools.multion.update_session.MultionUpdateSession
Tool that updates an existing Multion Browser Window with provided fields.
tools.multion.update_session.UpdateSessionSchema
Input for UpdateSessionTool.
tools.nasa.tool.NasaAction
Tool that queries the NASA API.
tools.nuclia.tool.NUASchema
Input for Nuclia Understanding API.
tools.nuclia.tool.NucliaUnderstandingAPI
Tool to process files with the Nuclia Understanding API.
tools.office365.base.O365BaseTool
Base class for the Office 365 tools.
tools.office365.create_draft_message.CreateDraftMessageSchema
Input for SendMessageTool.
tools.office365.create_draft_message.O365CreateDraftMessage
Tool for creating a draft email in Office 365.
tools.office365.events_search.O365SearchEvents
Class for searching calendar events in Office 365
tools.office365.events_search.SearchEventsInput
Input for SearchEmails Tool.
tools.office365.messages_search.O365SearchEmails
Class for searching email messages in Office 365
tools.office365.messages_search.SearchEmailsInput
Input for SearchEmails Tool.
tools.office365.send_event.O365SendEvent
Tool for sending calendar events in Office 365.
tools.office365.send_event.SendEventSchema
Input for CreateEvent Tool.
tools.office365.send_message.O365SendMessage
Tool for sending an email in Office 365.
tools.office365.send_message.SendMessageSchema
Input for SendMessageTool.
tools.openapi.utils.api_models.APIOperation
A model for a single API operation.
tools.openapi.utils.api_models.APIProperty
A model for a property in the query, path, header, or cookie params.
tools.openapi.utils.api_models.APIPropertyBase
Base model for an API property.
tools.openapi.utils.api_models.APIPropertyLocation(value)
The location of the property.
tools.openapi.utils.api_models.APIRequestBody
A model for a request body.
tools.openapi.utils.api_models.APIRequestBodyProperty
A model for a request body property.
tools.openweathermap.tool.OpenWeatherMapQueryRun
Tool that queries the OpenWeatherMap API.
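A hedged sketch for OpenWeatherMapQueryRun; it assumes the pyowm package is installed and an OPENWEATHERMAP_API_KEY environment variable is set (the location string is illustrative):

from langchain_community.tools.openweathermap.tool import OpenWeatherMapQueryRun
from langchain_community.utilities.openweathermap import OpenWeatherMapAPIWrapper

weather = OpenWeatherMapQueryRun(api_wrapper=OpenWeatherMapAPIWrapper())
print(weather.run("London,GB"))  # formatted summary: conditions, temperature, wind, ...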
tools.playwright.base.BaseBrowserTool
Base class for browser tools.
tools.playwright.click.ClickTool
Tool for clicking on an element with the given CSS selector.
tools.playwright.click.ClickToolInput
Input for ClickTool.
tools.playwright.current_page.CurrentWebPageTool
Tool for getting the URL of the current webpage.
tools.playwright.extract_hyperlinks.ExtractHyperlinksTool
Extract all hyperlinks on the page.
tools.playwright.extract_hyperlinks.ExtractHyperlinksToolInput
Input for ExtractHyperlinksTool.
tools.playwright.extract_text.ExtractTextTool
Tool for extracting all the text on the current webpage.
tools.playwright.get_elements.GetElementsTool
Tool for getting elements in the current web page matching a CSS selector.
tools.playwright.get_elements.GetElementsToolInput
Input for GetElementsTool.
tools.playwright.navigate.NavigateTool
Tool for navigating a browser to a URL.
tools.playwright.navigate.NavigateToolInput
Input for NavigateTool.
tools.playwright.navigate_back.NavigateBackTool
Navigate back to the previous page in the browser history.
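A hedged sketch tying the browser tools above together; it assumes the playwright and beautifulsoup4 packages are installed (plus `playwright install` for a browser binary) and relies on the shared from_browser constructor as I understand it:

from langchain_community.tools.playwright import ExtractTextTool, NavigateTool
from langchain_community.tools.playwright.utils import create_sync_playwright_browser

browser = create_sync_playwright_browser()  # headless Chromium instance
navigate = NavigateTool.from_browser(sync_browser=browser)
extract_text = ExtractTextTool.from_browser(sync_browser=browser)

navigate.run({"url": "https://example.com"})
print(extract_text.run({}))  # visible text of the page the browser is currently on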
tools.plugin.AIPlugin
AI Plugin Definition.
tools.plugin.AIPluginTool
Tool for getting the OpenAPI spec for an AI Plugin.
tools.plugin.AIPluginToolSchema
Schema for AIPluginTool.
tools.plugin.ApiConfig
API Configuration.
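A hedged sketch for AIPluginTool; the manifest URL below is hypothetical, and the tool's output is the plugin's OpenAPI spec rather than a live API call:

from langchain_community.tools.plugin import AIPluginTool

# from_plugin_url fetches the ai-plugin.json manifest over HTTP.
plugin_tool = AIPluginTool.from_plugin_url("https://example.com/.well-known/ai-plugin.json")
print(plugin_tool.name)          # derived from the plugin manifest
print(plugin_tool.run("usage"))  # returns the spec text for an agent to read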
tools.powerbi.tool.InfoPowerBITool
Tool for getting metadata about a PowerBI Dataset.
tools.powerbi.tool.ListPowerBITool
Tool for getting table names.
tools.powerbi.tool.QueryPowerBITool
Tool for querying a Power BI Dataset.
tools.pubmed.tool.PubmedQueryRun
Tool that searches the PubMed API.
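A sketch for PubmedQueryRun, which hits the public PubMed E-utilities endpoint; the underlying wrapper needs the xmltodict package, and the query is illustrative:

from langchain_community.tools.pubmed.tool import PubmedQueryRun

pubmed = PubmedQueryRun()  # default api_wrapper, no API key required
print(pubmed.run("mRNA vaccine efficacy"))  # titles and abstracts of top matches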
tools.reddit_search.tool.RedditSearchRun
Tool that queries for posts on a subreddit.
tools.reddit_search.tool.RedditSearchSchema
Input for Reddit search.
tools.requests.tool.BaseRequestsTool
Base class for requests tools.
tools.requests.tool.RequestsDeleteTool
Tool for making a DELETE request to an API endpoint.
tools.requests.tool.RequestsGetTool
Tool for making a GET request to an API endpoint.
tools.requests.tool.RequestsPatchTool
Tool for making a PATCH request to an API endpoint.
tools.requests.tool.RequestsPostTool
Tool for making a POST request to an API endpoint.
tools.requests.tool.RequestsPutTool
Tool for making a PUT request to an API endpoint.
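A hedged sketch for the requests tools above, each of which wraps a TextRequestsWrapper; newer releases of the package may additionally require an explicit opt-in flag for these potentially dangerous tools, and the httpbin URLs are illustrative:

from langchain_community.tools.requests.tool import RequestsGetTool, RequestsPostTool
from langchain_community.utilities.requests import TextRequestsWrapper

wrapper = TextRequestsWrapper()
get_tool = RequestsGetTool(requests_wrapper=wrapper)
post_tool = RequestsPostTool(requests_wrapper=wrapper)

print(get_tool.run("https://httpbin.org/get"))
# POST input is a JSON string containing "url" and "data" keys.
print(post_tool.run('{"url": "https://httpbin.org/post", "data": {"hello": "world"}}'))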
tools.scenexplain.tool.SceneXplainInput
Input for SceneXplain.
tools.scenexplain.tool.SceneXplainTool
Tool that explains images.
tools.searchapi.tool.SearchAPIResults
Tool that queries the SearchApi.io search API and returns JSON.
tools.searchapi.tool.SearchAPIRun
Tool that queries the SearchApi.io search API.
tools.searx_search.tool.SearxSearchResults
Tool that queries a Searx instance and gets back json.
tools.searx_search.tool.SearxSearchRun
Tool that queries a Searx instance.
tools.shell.tool.ShellInput
Commands for the Bash Shell tool.
tools.shell.tool.ShellTool
Tool to run shell commands.
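A sketch for ShellTool; it executes commands directly on the host machine, so it should only be exposed to trusted agents (the commands are illustrative):

from langchain_community.tools.shell.tool import ShellTool

shell = ShellTool()
output = shell.run({"commands": ["echo 'Hello from ShellTool'", "pwd"]})
print(output)  # combined stdout of the commands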
tools.slack.base.SlackBaseTool
Base class for Slack tools.
tools.slack.get_channel.SlackGetChannel
Create a new model by parsing and validating input data from keyword arguments.
tools.slack.get_message.SlackGetMessage
Create a new model by parsing and validating input data from keyword arguments.
tools.slack.get_message.SlackGetMessageSchema
Input schema for SlackGetMessages.
tools.slack.schedule_message.ScheduleMessageSchema
Input for ScheduleMessageTool.
tools.slack.schedule_message.SlackScheduleMessage
Tool for scheduling a message in Slack.
tools.slack.send_message.SendMessageSchema
Input for SendMessageTool.
tools.slack.send_message.SlackSendMessage
Tool for sending a message in Slack.
tools.sleep.tool.SleepInput
Input for SleepTool.
tools.sleep.tool.SleepTool
Tool that adds the capability to sleep.
tools.spark_sql.tool.BaseSparkSQLTool
Base tool for interacting with Spark SQL.
tools.spark_sql.tool.InfoSparkSQLTool
Tool for getting metadata about a Spark SQL.
tools.spark_sql.tool.ListSparkSQLTool
Tool for getting table names.
tools.spark_sql.tool.QueryCheckerTool
Use an LLM to check if a query is correct.
tools.spark_sql.tool.QuerySparkSQLTool
Tool for querying a Spark SQL.
tools.sql_database.tool.BaseSQLDatabaseTool
Base tool for interacting with a SQL database.
tools.sql_database.tool.InfoSQLDatabaseTool
Tool for getting metadata about a SQL database.
tools.sql_database.tool.ListSQLDatabaseTool
Tool for getting table names.
tools.sql_database.tool.QuerySQLCheckerTool
Use an LLM to check if a query is correct.
tools.sql_database.tool.QuerySQLDataBaseTool
Tool for querying a SQL database.
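A hedged sketch showing how the SQL database tools above share a single SQLDatabase handle; the sqlite URI and table name are placeholders, and QuerySQLCheckerTool (not shown) additionally needs an LLM:

from langchain_community.tools.sql_database.tool import (
    InfoSQLDatabaseTool,
    ListSQLDatabaseTool,
    QuerySQLDataBaseTool,
)
from langchain_community.utilities.sql_database import SQLDatabase

db = SQLDatabase.from_uri("sqlite:///example.db")  # placeholder database

print(ListSQLDatabaseTool(db=db).run(""))           # comma-separated table names
print(InfoSQLDatabaseTool(db=db).run("users"))      # schema and sample rows
print(QuerySQLDataBaseTool(db=db).run("SELECT COUNT(*) FROM users"))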
tools.stackexchange.tool.StackExchangeTool