| id | text | source |
|---|---|---|
ade01f84a537-20 | Construct a SQL agent from an LLM and tools. | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/agent_toolkits.html |
ade01f84a537-21 | langchain.agents.agent_toolkits.create_sql_agent(llm: langchain.base_language.BaseLanguageModel, toolkit: langchain.agents.agent_toolkits.sql.toolkit.SQLDatabaseToolkit, agent_type: langchain.agents.agent_types.AgentType = AgentType.ZERO_SHOT_REACT_DESCRIPTION, callback_manager: Optional[langchain.callbacks.base.BaseCa... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/agent_toolkits.html |
ade01f84a537-22 | result of the action\n... (this Thought/Action/Action Input/Observation can repeat N times)\nThought: I now know the final answer\nFinal Answer: the final answer to the original input question', input_variables: Optional[List[str]] = None, top_k: int = 10, max_iterations: Optional[int] = 15, max_execution_time: Optiona... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/agent_toolkits.html |
ade01f84a537-23 | Construct a SQL agent from an LLM and tools.
langchain.agents.agent_toolkits.create_vectorstore_agent(llm: langchain.base_language.BaseLanguageModel, toolkit: langchain.agents.agent_toolkits.vectorstore.toolkit.VectorStoreToolkit, callback_manager: Optional[langchain.callbacks.base.BaseCallbackManager] = None, prefix: ... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/agent_toolkits.html |
ade01f84a537-24 | Construct a vectorstore router agent from an LLM and tools.
By Harrison Chase
© Copyright 2023, Harrison Chase.
Last updated on Jun 16, 2023. | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/agent_toolkits.html |
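The prompt template in the `create_sql_agent` signature above follows the ReAct loop: Thought/Action/Action Input/Observation repeated N times, then a Final Answer. As a rough illustration of how such a trace is structured, here is a hypothetical sketch of a parser for that text format (this is not langchain's actual output parser):

```python
import re

def parse_react_output(text: str):
    """Toy parser for a ReAct-style trace: returns ('final', answer) once
    the model emits a Final Answer, else ('action', tool, tool_input)."""
    final = re.search(r"Final Answer:\s*(.*)", text, re.DOTALL)
    if final:
        return ("final", final.group(1).strip())
    action = re.search(r"Action:\s*(.*?)\nAction Input:\s*(.*)", text, re.DOTALL)
    if action:
        return ("action", action.group(1).strip(), action.group(2).strip())
    raise ValueError("no Action or Final Answer found in trace")
```

A Final Answer ends the loop, so it is checked first even when earlier Action steps appear in the same trace.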
212e130ca2e9-0 | Utilities#
General utilities.
pydantic model langchain.utilities.ApifyWrapper[source]#
Wrapper around Apify.
To use, you should have the apify-client python package installed,
and the environment variable APIFY_API_TOKEN set with your API key, or pass
apify_api_token as a named parameter to the cons... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
212e130ca2e9-1 | Return type
ApifyDatasetLoader
call_actor(actor_id: str, run_input: Dict, dataset_mapping_function: Callable[[Dict], langchain.schema.Document], *, build: Optional[str] = None, memory_mbytes: Optional[int] = None, timeout_secs: Optional[int] = None) → langchain.document_loaders.apify_dataset.ApifyDatasetLoader[source]#... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
212e130ca2e9-2 | Set doc_content_chars_max=None if you don’t want to limit the content size.
Parameters
top_k_results – the number of top-scored documents used for the arxiv tool
ARXIV_MAX_QUERY_LENGTH – the maximum length to which queries are truncated for the arxiv tool.
load_max_docs – a limit to the number of loaded documents
load_all_available_meta ... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
212e130ca2e9-3 | Run commands and return final output.
pydantic model langchain.utilities.BingSearchAPIWrapper[source]#
Wrapper for Bing Search API.
In order to set this up, follow instructions at:
https://levelup.gitconnected.com/api-tutorial-how-to-use-bing-web-search-api-in-python-4165d5592a7e
field bing_search_url: str [Required]#
... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
212e130ca2e9-4 | num_results – The number of results to return.
Returns
snippet - The description of the result.
title - The title of the result.
link - The link to the result.
Return type
A list of dictionaries with the following keys
run(query: str) → str[source]#
pydantic model langchain.utilities.GooglePlacesAPIWrapper[source]#
Wra... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
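The return type documented above is a list of dictionaries with `snippet`, `title`, and `link` keys. A minimal sketch of that normalization step (the raw field names `description`/`name`/`url` are assumptions about the upstream payload, not the documented API):

```python
def format_results(raw_results, num_results):
    """Normalize raw search hits into the documented return shape:
    a list of dicts with 'snippet', 'title', and 'link' keys. The raw
    field names below are assumptions about the upstream payload."""
    return [
        {
            "snippet": hit.get("description", ""),
            "title": hit.get("name", ""),
            "link": hit.get("url", ""),
        }
        for hit in raw_results[:num_results]
    ]
```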
212e130ca2e9-5 | read the Managing Projects page and create a project in the Google API Console.
- Install the library using pip install google-api-python-client
The current version of the library is 2.70.0 at this time
2. To create an API key:
- Navigate to the APIs & Services→Credentials panel in Cloud Console.
- Select Create creden... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
212e130ca2e9-6 | Returns
snippet - The description of the result.
title - The title of the result.
link - The link to the result.
Return type
A list of dictionaries with the following keys
run(query: str) → str[source]#
Run query through GoogleSearch and parse result.
pydantic model langchain.utilities.GoogleSerperAPIWrapper[source]#
W... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
212e130ca2e9-7 | Wrapper around GraphQL API.
To use, you should have the gql python package installed.
This wrapper will use the GraphQL API to conduct queries.
field custom_headers: Optional[Dict[str, str]] = None#
field graphql_endpoint: str [Required]#
run(query: str) → str[source]#
Run a GraphQL query and get the results.
pydantic ... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
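A minimal sketch of how a GraphQL query might be packaged against `graphql_endpoint` with `custom_headers` merged in (illustrative only; the real wrapper uses the `gql` package rather than hand-built requests):

```python
import json

def build_graphql_request(endpoint, query, custom_headers=None):
    """Package a GraphQL query as (url, headers, body); custom_headers
    are merged over the defaults, mirroring the field above."""
    headers = {"Content-Type": "application/json"}
    if custom_headers:
        headers.update(custom_headers)
    return endpoint, headers, json.dumps({"query": query})
```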
212e130ca2e9-8 | pydantic model langchain.utilities.OpenWeatherMapAPIWrapper[source]#
Wrapper for OpenWeatherMap API using PyOWM.
Docs for using:
Go to OpenWeatherMap and sign up for an API key
Save your API KEY into OPENWEATHERMAP_API_KEY env variable
pip install pyowm
field openweathermap_api_key: Optional[str] = None#
field owm: Any... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
212e130ca2e9-9 | Execute a DAX command and return the result asynchronously.
get_schemas() → str[source]#
Get the available schemas.
get_table_info(table_names: Optional[Union[List[str], str]] = None) → str[source]#
Get information about specified tables.
get_table_names() → Iterable[str][source]#
Get names of tables available.
run(co... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
212e130ca2e9-10 | Search PubMed for documents matching the query.
Return a list of dictionaries containing the document metadata.
load_docs(query: str) → List[langchain.schema.Document][source]#
retrieve_article(uid: str, webenv: str) → dict[source]#
run(query: str) → str[source]#
Run PubMed search and get the article meta information.
... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
212e130ca2e9-11 | unsecure=True)
Validators
disable_ssl_warnings » unsecure
validate_params » all fields
field aiosession: Optional[Any] = None#
field categories: Optional[List[str]] = []#
field engines: Optional[List[str]] = []#
field headers: Optional[dict] = None#
field k: int = 10#
field params: dict [Optional]#
field query_suffix: ... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
212e130ca2e9-12 | title: The title of the result.
link: The link to the result.
engines: The engines used for the result.
category: Searx category of the result.
}
Return type
Dict with the following keys
run(query: str, engines: Optional[List[str]] = None, categories: Optional[List[str]] = None, query_suffix: Optional[str] = '', **kwa... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
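Based on the `run` signature above, a sketch of how `query`, `query_suffix`, `engines`, and `categories` could be assembled into request parameters for a SearxNG instance (the exact parameter names sent to the instance are assumptions for illustration):

```python
def build_searx_params(query, engines=None, categories=None, query_suffix=""):
    """Assemble request parameters for a SearxNG query; the parameter
    names sent to the instance are assumptions for illustration."""
    params = {"q": (query + " " + query_suffix).strip(), "format": "json"}
    if engines:
        params["engines"] = ",".join(engines)
    if categories:
        params["categories"] = ",".join(categories)
    return params
```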
212e130ca2e9-13 | serpapi_api_key as a named parameter to the constructor.
Example
from langchain import SerpAPIWrapper
serpapi = SerpAPIWrapper()
field aiosession: Optional[aiohttp.client.ClientSession] = None#
field params: dict = {'engine': 'google', 'gl': 'us', 'google_domain': 'google.com', 'hl': 'en'}#
field serpapi_api_key: Optio... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
212e130ca2e9-14 | get_table_info(table_names: Optional[List[str]] = None) → str[source]#
get_table_info_no_throw(table_names: Optional[List[str]] = None) → str[source]#
Get information about specified tables.
Follows best practices as specified in: Rajkumar et al, 2022
(https://arxiv.org/abs/2204.00498)
If sample_rows_in_table_info, the... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
212e130ca2e9-15 | PATCH the URL and return the text asynchronously.
async apost(url: str, data: Dict[str, Any], **kwargs: Any) → str[source]#
POST to the URL and return the text asynchronously.
async aput(url: str, data: Dict[str, Any], **kwargs: Any) → str[source]#
PUT the URL and return the text asynchronously.
delete(url: str, **kwar... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
212e130ca2e9-16 | field account_sid: Optional[str] = None#
Twilio account string identifier.
field auth_token: Optional[str] = None#
Twilio auth token.
field from_number: Optional[str] = None#
A Twilio phone number in [E.164](https://www.twilio.com/docs/glossary/what-e164)
format, an
[alphanumeric sender ID](https://www.twilio.com/docs/... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
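A small, simplified check of the E.164 shape mentioned for `from_number` (a `+`, a non-zero leading digit, at most 15 digits total; real validation is stricter and usually delegated to a library such as `phonenumbers`):

```python
import re

E164_PATTERN = re.compile(r"^\+[1-9]\d{1,14}$")

def is_e164(number: str) -> bool:
    """Simplified E.164 shape check: '+', a non-zero leading digit,
    and at most 15 digits total."""
    return bool(E164_PATTERN.match(number))
```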
212e130ca2e9-17 | of the top-k results.
It limits the Document content by doc_content_chars_max.
field doc_content_chars_max: int = 4000#
field lang: str = 'en'#
field load_all_available_meta: bool = False#
field top_k_results: int = 3#
load(query: str) → List[langchain.schema.Document][source]#
Run Wikipedia search and get the article ... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/utilities.html |
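The truncation implied by `doc_content_chars_max` (here and in the arxiv wrapper above) can be sketched as a simple character cut, with `None` disabling the limit:

```python
def truncate_content(text, doc_content_chars_max=4000):
    """Cap Document content at doc_content_chars_max characters; pass
    None to disable the limit."""
    if doc_content_chars_max is None:
        return text
    return text[:doc_content_chars_max]
```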
c39d6ab62898-0 | Embeddings#
Wrappers around embedding modules.
pydantic model langchain.embeddings.AlephAlphaAsymmetricSemanticEmbedding[source]#
Wrapper for Aleph Alpha’s Asymmetric Embeddings
AA provides you with an endpoint to embed a document and a query.
The models were optimized to make the embeddings of doc... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-1 | embed_documents(texts: List[str]) → List[List[float]][source]#
Call out to Aleph Alpha’s asymmetric Document endpoint.
Parameters
texts – The list of texts to embed.
Returns
List of embeddings, one for each text.
embed_query(text: str) → List[float][source]#
Call out to Aleph Alpha’s asymmetric, query embedding endpoin... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
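The `embed_documents`/`embed_query` pair above is the interface contract every embeddings wrapper in this section implements: one vector per input document, and one vector for a query. A toy stand-in (not part of langchain) satisfying the same shape:

```python
from typing import List

class FakeEmbeddings:
    """Toy stand-in (not part of langchain) satisfying the documented
    embed_documents / embed_query interface."""

    def __init__(self, size: int = 4):
        self.size = size

    def embed_documents(self, texts: List[str]) -> List[List[float]]:
        # One fixed-size vector per input text (length-based, for illustration).
        return [[float(len(t))] * self.size for t in texts]

    def embed_query(self, text: str) -> List[float]:
        return [float(len(text))] * self.size
```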
c39d6ab62898-2 | If a specific credential profile should be used, you must pass
the name of the profile from the ~/.aws/credentials file that is to be used.
Make sure the credentials / roles used have the required policies to
access the Bedrock service.
field credentials_profile_name: Optional[str] = None#
The name of the profile in th... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-3 | Parameters
text – The text to embed.
Returns
Embeddings for the text.
pydantic model langchain.embeddings.CohereEmbeddings[source]#
Wrapper around Cohere embedding models.
To use, you should have the cohere python package installed, and the
environment variable COHERE_API_KEY set with your API key or pass it
as a named... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-4 | os.environ["DASHSCOPE_API_KEY"] = "your DashScope API KEY"
from langchain.embeddings.dashscope import DashScopeEmbeddings
embeddings = DashScopeEmbeddings(
model="text-embedding-v1",
)
text = "This is a test query."
query_result = embeddings.embed_query(text)
field dashscope_api_key: Optional[str] = None#
Maximum n... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-5 | "Beta is the second letter of Greek alphabet",
]
)
r2 = deepinfra_emb.embed_query(
"What is the second letter of Greek alphabet"
)
field embed_instruction: str = 'passage: '#
Instruction used to embed documents.
field model_id: str = 'sentence-transformers/clip-ViT-B-32'#
Embeddings model to use.
field model_kw... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-6 | Generate embeddings for a list of documents.
Parameters
texts (List[str]) – A list of document text strings to generate embeddings
for.
Returns
A list of embeddings, one for each document in the input list.
Return type
List[List[float]]
embed_query(text: str) → List[float][source]#
Generate an embedding for a single que... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-7 | model_id,
input_field=input_field,
# es_cloud_id="foo",
# es_user="bar",
# es_password="baz",
)
documents = [
"This is an example document.",
"Another example document to generate embeddings for.",
]
embeddings_generator.embed_documents(documents)
classmethod from_es_connection(model_id: str, es... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-8 | )
documents = [
"This is an example document.",
"Another example document to generate embeddings for.",
]
embeddings_generator.embed_documents(documents)
pydantic model langchain.embeddings.EmbaasEmbeddings[source]#
Wrapper around embaas’s embedding service.
To use, you should have the
environment variable EMBA... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-9 | Embed search docs.
embed_query(text: str) → List[float][source]#
Embed query text.
pydantic model langchain.embeddings.HuggingFaceEmbeddings[source]#
Wrapper around sentence_transformers embedding models.
To use, you should have the sentence_transformers python package installed.
Example
from langchain.embeddings impor... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-10 | To use, you should have the huggingface_hub python package installed, and the
environment variable HUGGINGFACEHUB_API_TOKEN set with your API token, or pass
it as a named parameter to the constructor.
Example
from langchain.embeddings import HuggingFaceHubEmbeddings
repo_id = "sentence-transformers/all-mpnet-base-v2"
h... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-11 | hf = HuggingFaceInstructEmbeddings(
model_name=model_name,
model_kwargs=model_kwargs,
encode_kwargs=encode_kwargs
)
field cache_folder: Optional[str] = None#
Path to store models.
Can be also set by SENTENCE_TRANSFORMERS_HOME environment variable.
field embed_instruction: str = 'Represent the document for r... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-12 | field f16_kv: bool = False#
Use half-precision for key/value cache.
field logits_all: bool = False#
Return logits for all tokens, not just the last token.
field n_batch: Optional[int] = 8#
Number of tokens to process in parallel.
Should be a number between 1 and n_ctx.
field n_ctx: int = 512#
Token context window.
fiel... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-13 | MINIMAX_API_KEY set with your API token, or pass it as a named parameter to
the constructor.
Example
from langchain.embeddings import MiniMaxEmbeddings
embeddings = MiniMaxEmbeddings()
query_text = "This is a test query."
query_result = embeddings.embed_query(query_text)
document_text = "This is a test document."
docum... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-14 | embed = ModelScopeEmbeddings(model_id=model_id)
field model_id: str = 'damo/nlp_corom_sentence-embedding_english-base'#
Model name to use.
embed_documents(texts: List[str]) → List[List[float]][source]#
Compute doc embeddings using a modelscope embedding model.
Parameters
texts – The list of texts to embed.
Returns
List... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-15 | How long to try sleeping for if a rate limit is encountered
embed_documents(texts: List[str]) → List[List[float]][source]#
Embed documents using a MosaicML deployed instructor embedding model.
Parameters
texts – The list of texts to embed.
Returns
List of embeddings, one for each text.
embed_query(text: str) → List[flo... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-16 | from langchain.embeddings.openai import OpenAIEmbeddings
embeddings = OpenAIEmbeddings(
deployment="your-embeddings-deployment-name",
model="your-embeddings-model-name",
openai_api_base="https://your-endpoint.openai.azure.com/",
openai_api_type="azure",
)
text = "This is a test query."
query_result = em... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-17 | https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html
If a specific credential profile should be used, you must pass
the name of the profile from the ~/.aws/credentials file that is to be used.
Make sure the credentials / roles used have the required policies to
access the Sagemaker endpoint.
S... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-18 | Compute doc embeddings using a SageMaker Inference Endpoint.
Parameters
texts – The list of texts to embed.
chunk_size – The chunk size defines how many input texts will
be grouped together as a request. If None, the chunk size
specified by the class will be used.
Returns
List of embeddings, one for each text.
embed_query(te... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
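The `chunk_size` batching described above can be sketched as a plain slicing helper (illustrative; the real wrapper issues one endpoint request per batch):

```python
def chunk_texts(texts, chunk_size):
    """Group input texts into batches of at most chunk_size items,
    one request per batch."""
    return [texts[i:i + chunk_size] for i in range(0, len(texts), chunk_size)]
```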
c39d6ab62898-19 | import runhouse as rh
from transformers import pipeline
gpu = rh.cluster(name="rh-a10x", instance_type="A100:1")
pipeline = pipeline(model="bert-base-uncased", task="feature-extraction")
rh.blob(pickle.dumps(pipeline),
path="models/pipeline.pkl").save().to(gpu, path="models")
embeddings = SelfHostedHFEmbeddings.fro... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-20 | Example
from langchain.embeddings import SelfHostedHuggingFaceEmbeddings
import runhouse as rh
model_name = "sentence-transformers/all-mpnet-base-v2"
gpu = rh.cluster(name="rh-a10x", instance_type="A100:1")
hf = SelfHostedHuggingFaceEmbeddings(model_name=model_name, hardware=gpu)
Validators
raise_deprecation » all fiel... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-21 | import runhouse as rh
model_name = "hkunlp/instructor-large"
gpu = rh.cluster(name='rh-a10x', instance_type='A100:1')
hf = SelfHostedHuggingFaceInstructEmbeddings(
model_name=model_name, hardware=gpu)
Validators
raise_deprecation » all fields
set_verbose » verbose
field embed_instruction: str = 'Represent the docum... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
c39d6ab62898-22 | tf = TensorflowHubEmbeddings(model_url=url)
field model_url: str = 'https://tfhub.dev/google/universal-sentence-encoder-multilingual/3'#
Model name to use.
embed_documents(texts: List[str]) → List[List[float]][source]#
Compute doc embeddings using a TensorflowHub embedding model.
Parameters
texts – The list of texts to... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/embeddings.html |
433a678170da-0 | Example Selector#
Logic for selecting examples to include in prompts.
pydantic model langchain.prompts.example_selector.LengthBasedExampleSelector[source]#
Select examples based on length.
Validators
calculate_example_text_lengths » example_text_lengths
field example_prompt: langchain.prompts... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/example_selector.html |
433a678170da-1 | Create k-shot example selector using example list and embeddings.
Reshuffles examples dynamically based on query similarity.
Parameters
examples – List of examples to use in the prompt.
embeddings – An initialized embedding API interface, e.g. OpenAIEmbeddings().
vectorstore_cls – A vector store DB interface class, e.g.... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/example_selector.html |
433a678170da-2 | Create k-shot example selector using example list and embeddings.
Reshuffles examples dynamically based on query similarity.
Parameters
examples – List of examples to use in the prompt.
embeddings – An initialized embedding API interface, e.g. OpenAIEmbeddings().
vectorstore_cls – A vector store DB interface class, e.g... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/example_selector.html |
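As a rough illustration of the length-based variant (`LengthBasedExampleSelector`, above): keep examples while a running length stays under a budget. The word-count measure below is a stand-in for the selector's actual text-length function:

```python
def select_by_length(examples, max_length, text_length=lambda e: len(e.split())):
    """Toy length-based selection: keep examples in order while the
    running word count stays within max_length."""
    selected, total = [], 0
    for ex in examples:
        n = text_length(ex)
        if total + n > max_length:
            break
        selected.append(ex)
        total += n
    return selected
```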
3fba0a126828-0 | Tools#
Core toolkit implementations.
pydantic model langchain.tools.AIPluginTool[source]#
field api_spec: str [Required]#
field args_schema: Type[AIPluginToolSchema] = <class 'langchain.tools.plugin.AIPluginToolSchema'>#
Pydantic model class to validate and parse the tool’s input arguments.
field plugin... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-1 | to_typescript() → str[source]#
Get typescript string representation of the operation.
static ts_type_from_python(type_: Union[str, Type, tuple, None, enum.Enum]) → str[source]#
property body_params: List[str]#
property path_params: List[str]#
property query_params: List[str]#
pydantic model langchain.tools.AzureCogsFor... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-2 | Interface LangChain tools must implement.
field args_schema: Optional[Type[pydantic.main.BaseModel]] = None#
Pydantic model class to validate and parse the tool’s input arguments.
field callback_manager: Optional[langchain.callbacks.base.BaseCallbackManager] = None#
Deprecated. Please use callbacks instead.
field callb... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-3 | Run the tool asynchronously.
run(tool_input: Union[str, Dict], verbose: Optional[bool] = None, start_color: Optional[str] = 'green', color: Optional[str] = 'green', callbacks: Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]] = None, **kwargs: Any) → Any[s... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-4 | field name: str = 'click_element'#
The unique name of the tool that clearly communicates its purpose.
field playwright_strict: bool = False#
Whether to employ Playwright’s strict mode when clicking on elements.
field playwright_timeout: float = 1000#
Timeout (in ms) for Playwright to wait for element to be ready.
field... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-5 | Pydantic model class to validate and parse the tool’s input arguments.
field description: str = 'Delete a file'#
Used to tell the model how/when/why to use the tool.
You can provide few-shot examples as a part of the description.
field name: str = 'file_delete'#
The unique name of the tool that clearly communicates its... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-6 | pydantic model langchain.tools.ExtractTextTool[source]#
field args_schema: Type[BaseModel] = <class 'pydantic.main.BaseModel'>#
Pydantic model class to validate and parse the tool’s input arguments.
field description: str = 'Extract all the text on the current webpage'#
Used to tell the model how/when/why to use the to... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-7 | The unique name of the tool that clearly communicates its purpose.
pydantic model langchain.tools.GmailCreateDraft[source]#
field args_schema: Type[langchain.tools.gmail.create_draft.CreateDraftSchema] = <class 'langchain.tools.gmail.create_draft.CreateDraftSchema'>#
Pydantic model class to validate and parse the tool’... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-8 | Pydantic model class to validate and parse the tool’s input arguments.
field description: str = 'Use this tool to search for email messages. The input must be a valid Gmail query. The output is a JSON list of messages.'#
Used to tell the model how/when/why to use the tool.
You can provide few-shot examples as a part of... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-9 | field api_wrapper: langchain.utilities.google_places_api.GooglePlacesAPIWrapper [Optional]#
field args_schema: Type[pydantic.main.BaseModel] = <class 'langchain.tools.google_places.tool.GooglePlacesSchema'>#
Pydantic model class to validate and parse the tool’s input arguments.
pydantic model langchain.tools.GoogleSear... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-10 | pydantic model langchain.tools.InfoPowerBITool[source]#
Tool for getting metadata about a PowerBI Dataset.
field powerbi: langchain.utilities.powerbi.PowerBIDataset [Required]#
pydantic model langchain.tools.ListDirectoryTool[source]#
field args_schema: Type[pydantic.main.BaseModel] = <class 'langchain.tools.file_manag... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-11 | field name: str = 'move_file'#
The unique name of the tool that clearly communicates its purpose.
pydantic model langchain.tools.NavigateBackTool[source]#
Navigate back to the previous page in the browser history.
field args_schema: Type[BaseModel] = <class 'pydantic.main.BaseModel'>#
Pydantic model class to validate a... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-12 | Get an OpenAPI spec from a dict.
classmethod from_text(text: str) → langchain.tools.openapi.utils.openapi_utils.OpenAPISpec[source]#
Get an OpenAPI spec from a text.
classmethod from_url(url: str) → langchain.tools.openapi.utils.openapi_utils.OpenAPISpec[source]#
Get an OpenAPI spec from a URL.
static get_cleaned_opera... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-13 | property base_url: str#
Get the base url.
pydantic model langchain.tools.OpenWeatherMapQueryRun[source]#
Tool that adds the capability to query using the OpenWeatherMap API.
field api_wrapper: langchain.utilities.openweathermap.OpenWeatherMapAPIWrapper [Optional]#
pydantic model langchain.tools.PubmedQueryRun[source]#
... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-14 | field template: Optional[str] = '\nAnswer the question below with a DAX query that can be sent to Power BI. DAX queries have a simple syntax comprised of just one required keyword, EVALUATE, and several optional keywords: ORDER BY, START AT, DEFINE, MEASURE, VAR, TABLE, and COLUMN. Each keyword defines a statement used... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-15 | ORDER BY <expression> ASC or DESC START AT <value> or <parameter> - The optional START AT keyword is used inside an ORDER BY clause. It defines the value at which the query results begin.\nDEFINE MEASURE | VAR; EVALUATE <table> - The optional DEFINE keyword introduces one or more calculated entity definitions that exis... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-16 | you nest the DISTINCT function within a formula, to get a list of distinct values that can be passed to another function and then counted, summed, or used for other operations.\nDISTINCT(<table>) - Returns a table by removing duplicate rows from another table or expression.\n\nAggregation functions, names with a A in i... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-17 | date2, <interval>) - Returns the difference between two date values, in the specified interval, that can be SECOND, MINUTE, HOUR, DAY, WEEK, MONTH, QUARTER, YEAR.\nDATEVALUE(<date_text>) - Returns a date value that represents the specified date.\nYEAR(<date>), QUARTER(<date>), MONTH(<date>), DAY(<date>), HOUR(<date>), ... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-18 | pydantic model langchain.tools.ReadFileTool[source]#
field args_schema: Type[pydantic.main.BaseModel] = <class 'langchain.tools.file_management.read.ReadFileInput'>#
Pydantic model class to validate and parse the tool’s input arguments.
field description: str = 'Read file from disk'#
Used to tell the model how/when/why... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-19 | The input arguments’ schema.
The tool schema.
field coroutine: Optional[Callable[[...], Awaitable[Any]]] = None#
The asynchronous version of the function.
field description: str = ''#
Used to tell the model how/when/why to use the tool.
You can provide few-shot examples as a part of the description.
field func: Callabl... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-20 | Tool that takes in function or coroutine directly.
field args_schema: Optional[Type[pydantic.main.BaseModel]] = None#
Pydantic model class to validate and parse the tool’s input arguments.
field callback_manager: Optional[langchain.callbacks.base.BaseCallbackManager] = None#
Deprecated. Please use callbacks instead.
fi... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-21 | Initialize tool from a function.
property args: dict#
The tool’s input arguments.
pydantic model langchain.tools.VectorStoreQATool[source]#
Tool for the VectorDBQA chain. To be initialized with name and chain.
static get_description(name: str, description: str) → str[source]#
pydantic model langchain.tools.VectorStoreQ... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-22 | actions here: https://nla.zapier.com/demo/start/
The returned list can be empty if no actions are exposed; otherwise it will contain
a list of action objects:
[{“id”: str,
“description”: str,
“params”: Dict[str, str]
}]
params will always contain an instructions key, the only required
param. All others are optional and, if provided, will ... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-23 | field base_prompt: str = 'A wrapper around Zapier NLA actions. The input to this tool is a natural language instruction, for example "get the latest email from my bank" or "send a slack message to the #general channel". Each tool will have params associated with it that are specified as a list. You MUST take into accou... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
3fba0a126828-24 | Parameters
*args – The arguments to the tool.
return_direct – Whether to return directly from the tool rather
than continuing the agent loop.
args_schema – optional argument schema for user to specify
infer_schema – Whether to infer the schema of the arguments from
the function’s signature. This also makes the resultan... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/tools.html |
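The `infer_schema` behavior described above — deriving an argument schema from the function's signature — can be sketched with `inspect`. This is a simplification: the real implementation builds a pydantic model, and falling back to `str` for untyped parameters is an assumption for illustration:

```python
import inspect

def infer_args(func):
    """Sketch of argument-schema inference from a function signature;
    untyped parameters fall back to str (an assumption)."""
    sig = inspect.signature(func)
    return {
        name: (p.annotation if p.annotation is not inspect.Parameter.empty else str)
        for name, p in sig.parameters.items()
    }
```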
6ba78a3bb651-0 | Chat Models#
pydantic model langchain.chat_models.AzureChatOpenAI[source]#
Wrapper around Azure OpenAI Chat Completion API. To use this class you
must have a deployed model on Azure OpenAI. Use deployment_name in the
constructor to refer to the “Model deployment name” in the Azure portal.
In addit... | rtdocs_stable/api.python.langchain.com/en/stable/reference/modules/chat_models.html |
6ba78a3bb651-1 | environment variable ANTHROPIC_API_KEY set with your API key, or pass
it as a named parameter to the constructor.
Example
import anthropic
from langchain.llms import Anthropic
model = ChatAnthropic(model="<model_name>", anthropic_api_key="my-api-key")
get_num_tokens(text: str) → int[source]#
Calculate number of tokens....
pydantic model langchain.chat_models.ChatOpenAI[source]#
Wrapper around OpenAI Chat large language models.
To use, you should have the openai python package installed, and the
environment variable OPENAI_API_KEY set with your API key.
Any parameters that are valid to be passed to the openai.create call can be passed
in...
Use tenacity to retry the completion call.
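The retry behaviour this method delegates to the tenacity library (exponential backoff on transient errors) can be sketched in plain Python. This is a simplified stand-in with a hypothetical helper name and a fixed exception type, not the actual implementation:

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")


def completion_with_retry(call: Callable[[], T], max_retries: int = 6,
                          base_delay: float = 1.0) -> T:
    # Simplified stand-in for tenacity's retry/wait_exponential combination.
    for attempt in range(max_retries):
        try:
            return call()
        except ConnectionError:
            if attempt == max_retries - 1:
                raise
            # Exponential backoff between attempts: base, 2x, 4x, ...
            time.sleep(base_delay * 2 ** attempt)


attempts = 0

def flaky_call() -> str:
    global attempts
    attempts += 1
    if attempts < 3:
        raise ConnectionError("transient network error")
    return "completion"

print(completion_with_retry(flaky_call, base_delay=0.001))  # completion
```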
get_num_tokens_from_messages(messages: List[langchain.schema.BaseMessage]) → int[source]#
Calculate num tokens for gpt-3.5-turbo and gpt-4 with tiktoken package.
Official documentation: openai/openai-cookbook
main/examples/How_to_format_inputs_to_ChatGPT_models.ipynb
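The counting scheme described in that cookbook entry can be sketched as follows; note this uses a trivial whitespace splitter in place of tiktoken, so the counts are illustrative only, and the per-message constants shown are those documented for gpt-3.5-turbo-0301:

```python
# Sketch of the OpenAI cookbook counting scheme, with a stub "tokenizer"
# standing in for tiktoken so the example is self-contained.
from typing import Dict, List


def stub_encode(text: str) -> List[str]:
    return text.split()


def num_tokens_from_messages(messages: List[Dict[str, str]]) -> int:
    tokens_per_message = 4   # every message is wrapped with start/role/end markers
    tokens_per_name = -1     # "name" replaces "role" when present
    num_tokens = 0
    for message in messages:
        num_tokens += tokens_per_message
        for key, value in message.items():
            num_tokens += len(stub_encode(value))
            if key == "name":
                num_tokens += tokens_per_name
    num_tokens += 3          # every reply is primed with an assistant marker
    return num_tokens


msgs = [{"role": "user", "content": "Hello there"}]
print(num_tokens_from_messages(msgs))  # 10
```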
get_token...
previous
Models
next
Embeddings
By Harrison Chase
© Copyright 2023, Harrison Chase.
Last updated on Jun 16, 2023.
.rst
.pdf
LLMs
LLMs#
Wrappers on top of large language models APIs.
pydantic model langchain.llms.AI21[source]#
Wrapper around AI21 large language models.
To use, you should have the environment variable AI21_API_KEY
set with your API key.
Example
from langchain.llms import AI21
ai21 = AI21(model="j2-jumbo-instruct")
V...
How many completions to generate for each prompt.
field presencePenalty: langchain.llms.ai21.AI21PenaltyData = AI21PenaltyData(scale=0, applyToWhitespaces=True, applyToPunctuations=True, applyToNumbers=True, applyToStopwords=True, applyToEmojis=True)#
Penalizes repeated tokens.
field tags: Optional[List[str]] = None#
T...
Take in a list of prompt values and return an LLMResult.
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str#
Predict text from text.
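Each synchronous method has an async twin like this. The pairing can be sketched with a hypothetical stand-in class (`DummyLLM` is illustrative only, not part of langchain):

```python
import asyncio
from typing import Optional, Sequence


class DummyLLM:
    """Hypothetical stand-in showing the predict/apredict pairing."""

    def predict(self, text: str, *, stop: Optional[Sequence[str]] = None) -> str:
        out = text.upper()
        if stop:
            # Truncate at the first stop sequence, mirroring the stop parameter.
            for s in stop:
                out = out.split(s)[0]
        return out

    async def apredict(self, text: str, *, stop: Optional[Sequence[str]] = None) -> str:
        # Async twin of predict(); a real model would await a network call here.
        return self.predict(text, stop=stop)


llm = DummyLLM()
print(llm.predict("hello world"))                # HELLO WORLD
print(asyncio.run(llm.apredict("hello world")))  # HELLO WORLD
```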
async apredict_messages(messages: List[langchain.schema.BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → langchain.schema.BaseM...
dict(**kwargs: Any) → Dict#
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → langchain....
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str#
Predict text from text.
predict_messages(messages: List[langchain.schema.BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → langchain.schema.BaseMessage#
Predict message from messages.
save(file_path: Union[pathlib.Pa...
Example
from langchain.llms import AlephAlpha
aleph_alpha = AlephAlpha(aleph_alpha_api_key="my-api-key")
Validators
raise_deprecation » all fields
set_verbose » verbose
validate_environment » all fields
field aleph_alpha_api_key: Optional[str] = None#
API key for Aleph Alpha API.
field best_of: Optional[int] = None#
re...
Model name to use.
field n: int = 1#
How many completions to generate for each prompt.
field penalty_bias: Optional[str] = None#
Penalty bias for the completion.
field penalty_exceptions: Optional[List[str]] = None#
List of strings that may be generated without penalty,
regardless of other penalty settings
field penalt...
field verbose: bool [Optional]#
Whether to print out response text.
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]] = None, **kwargs: Any) → str#
Check Cache and run the LLM on the given ...
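The check-cache-then-run flow can be sketched as follows. This is a simplified stand-in (`CachedLLM` is hypothetical; the real cache also keys on model parameters):

```python
from typing import Callable, Dict, Optional, Tuple


class CachedLLM:
    """Simplified sketch of __call__'s check-cache-then-generate behaviour."""

    def __init__(self, generate: Callable[[str], str]) -> None:
        self._generate = generate
        self._cache: Dict[Tuple[str, Optional[tuple]], str] = {}
        self.calls = 0

    def __call__(self, prompt: str, stop: Optional[list] = None) -> str:
        key = (prompt, tuple(stop) if stop else None)
        if key in self._cache:          # cache hit: skip the model call
            return self._cache[key]
        self.calls += 1                 # cache miss: run the LLM once
        result = self._generate(prompt)
        self._cache[key] = result
        return result


llm = CachedLLM(lambda p: f"echo: {p}")
assert llm("hi") == llm("hi")   # second call is served from the cache
print(llm.calls)                # 1
```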
Behaves as if Config.extra = ‘allow’ was set since it adds all passed values
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model#
Duplicate a model, optionally...
get_num_tokens_from_messages(messages: List[langchain.schema.BaseMessage]) → int#
Get the number of tokens in the message.
get_token_ids(text: str) → List[int]#
Get the token IDs present in the text.
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, Map...
serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]#
Return the namespace of the langchain object.
eg. ["langchain", "llms", "openai"]
property lc_secrets: Dict[str, str]#
Return a map of constructor argument names to secret ids.
eg. {"openai_api_key": "OPENAI_API_K...
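A hypothetical helper shows how such a mapping of constructor arguments to secret ids could be resolved against the environment (`resolve_secrets` is illustrative only, not a langchain API):

```python
import os
from typing import Dict


def resolve_secrets(lc_secrets: Dict[str, str]) -> Dict[str, str]:
    """Hypothetical helper: look up each secret id in the environment."""
    resolved = {}
    for arg_name, env_var in lc_secrets.items():
        value = os.environ.get(env_var)
        if value is None:
            raise KeyError(f"missing environment variable {env_var}")
        resolved[arg_name] = value
    return resolved


os.environ["OPENAI_API_KEY"] = "sk-example"  # stand-in value for the demo
print(resolve_secrets({"openai_api_key": "OPENAI_API_KEY"}))
# {'openai_api_key': 'sk-example'}
```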
field max_tokens_to_sample: int = 256#
Denotes the number of tokens to predict per generation.
field model: str = 'claude-v1'#
Model name to use.
field streaming: bool = False#
Whether to stream the results.
field tags: Optional[List[str]] = None#
Tags to add to the run trace.
field temperature: Optional[float] = None#...
Take in a list of prompt values and return an LLMResult.
async apredict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str#
Predict text from text.
async apredict_messages(messages: List[langchain.schema.BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → langchain.schema.BaseM...
dict(**kwargs: Any) → Dict#
Return a dictionary of the LLM.
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → langchain....
predict(text: str, *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → str#
Predict text from text.
predict_messages(messages: List[langchain.schema.BaseMessage], *, stop: Optional[Sequence[str]] = None, **kwargs: Any) → langchain.schema.BaseMessage#
Predict message from messages.
save(file_path: Union[pathlib.Pa...
property lc_secrets: Dict[str, str]#
Return a map of constructor argument names to secret ids.
eg. {"openai_api_key": "OPENAI_API_KEY"}
property lc_serializable: bool#
Return whether or not the class is serializable.
pydantic model langchain.llms.Anyscale[source]#
Wrapper around Anyscale Services.
To use, you should ha...
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → lang...
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creat...
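Under those rules, the composition of include, exclude, and update in copy() can be sketched on plain dicts — a simplified model of the pydantic behaviour, not its implementation:

```python
# Sketch of pydantic copy() semantics using plain dicts; the real method
# operates on model fields, but the parameters compose the same way.
from typing import Dict, Optional, Set


def copy_model(data: Dict, include: Optional[Set[str]] = None,
               exclude: Optional[Set[str]] = None,
               update: Optional[Dict] = None) -> Dict:
    fields = dict(data)
    if include is not None:
        fields = {k: v for k, v in fields.items() if k in include}
    if exclude is not None:               # exclude takes precedence over include
        fields = {k: v for k, v in fields.items() if k not in exclude}
    if update:
        fields.update(update)             # update values are NOT validated
    return fields


model = {"model": "j2-jumbo-instruct", "temperature": 0.7, "n": 1}
print(copy_model(model, include={"model", "temperature"}, update={"temperature": 0.0}))
# {'model': 'j2-jumbo-instruct', 'temperature': 0.0}
```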
Get the token IDs present in the text.
json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, by_alias: bool = False, skip_defaults: Optional[bool] = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: ...
eg. ["langchain", "llms", "openai"]
property lc_secrets: Dict[str, str]#
Return a map of constructor argument names to secret ids.
eg. {"openai_api_key": "OPENAI_API_KEY"}
property lc_serializable: bool#
Return whether or not the class is serializable.
pydantic model langchain.llms.Aviary[source]#
Allow you to use an A...
Check Cache and run the LLM on the given prompt and input.
async agenerate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]] = None, *, tags: Optional[List[str]] = None, **kwargs: Any) → lang...