Columns: id (string, 14–16 chars), source (string, 49–117 chars), text (string, 16–2.73k chars)
1ae35bf65e45-149
https://python.langchain.com/en/latest/reference/modules/llms.html
__call__(prompt: str, stop: Optional[List[str]] = None, callbacks: Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]] = None) → str# Check Cache and run the LLM on the given prompt and input. async agenerate(prompts: List[str], stop: Optional[List[str]] = N...
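The `__call__` signature above accepts a `stop` list; generation is cut at the first occurrence of any stop sequence. A minimal sketch of that truncation step, assuming a hypothetical `enforce_stop` helper rather than LangChain's internals:

```python
from typing import List, Optional

def enforce_stop(text: str, stop: Optional[List[str]] = None) -> str:
    """Truncate `text` at the earliest occurrence of any stop sequence."""
    if not stop:
        return text
    cut = len(text)
    for seq in stop:
        idx = text.find(seq)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut]
```

Agents rely on exactly this mechanism, stopping generation at markers like `"Observation:"` so the framework can run the tool itself.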
1ae35bf65e45-150
https://python.langchain.com/en/latest/reference/modules/llms.html
copy(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, update: Optional[DictStrAny] = None, deep: bool = False) → Model# Duplicate a model, optionally choose which fields to include, exclude and change. Parameters include – fie...
1ae35bf65e45-151
https://python.langchain.com/en/latest/reference/modules/llms.html
get_num_tokens_from_messages(messages: List[langchain.schema.BaseMessage]) → int# Get the number of tokens in the message. get_token_ids(text: str) → List[int]# Get the tokens present in the text. json(*, include: Optional[Union[AbstractSetIntStr, MappingIntStrAny]] = None, exclude: Optional[Union[AbstractSetIntStr, Map...
1ae35bf65e45-152
https://python.langchain.com/en/latest/reference/modules/llms.html
set with your API key. Example from langchain.llms import StochasticAI stochasticai = StochasticAI(api_url="") Validators build_extra » all fields raise_deprecation » all fields set_verbose » verbose validate_environment » all fields field api_url: str = ''# The API URL to use. field model_kwargs: Dict[str, Any] [Option...
1ae35bf65e45-153
https://python.langchain.com/en/latest/reference/modules/llms.html
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model# Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = ‘allow’ was set since it adds all passed values...
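`construct` builds a model from trusted, pre-validated data without running validators. A plain-Python stand-in (not pydantic itself) showing the bypass-`__init__` idea:

```python
class Model:
    """Minimal stand-in for a pydantic model with one validated field."""

    def __init__(self, temperature: float):
        if not 0.0 <= temperature <= 2.0:
            raise ValueError("temperature out of range")
        self.temperature = temperature

    @classmethod
    def construct(cls, **values):
        # Bypass __init__ (and its validation) entirely; trust the caller.
        obj = cls.__new__(cls)
        obj.__dict__.update(values)
        return obj
```

Because validation is skipped, `Model.construct(temperature=5.0)` succeeds where `Model(temperature=5.0)` would raise; this is why `construct` is documented as safe only for trusted data.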
1ae35bf65e45-154
https://python.langchain.com/en/latest/reference/modules/llms.html
get_num_tokens(text: str) → int# Get the number of tokens present in the text. get_num_tokens_from_messages(messages: List[langchain.schema.BaseMessage]) → int# Get the number of tokens in the message. get_token_ids(text: str) → List[int]# Get the tokens present in the text. json(*, include: Optional[Union[AbstractSetIn...
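`get_num_tokens` counts the ids returned by `get_token_ids`. Real implementations use a trained tokenizer (e.g. tiktoken for OpenAI models); the whitespace tokenizer below is only a stand-in to show the relationship between the two methods:

```python
from typing import List

def get_token_ids(text: str) -> List[int]:
    """Toy tokenizer: one 'token' per whitespace-separated word, mapped to a
    stable integer id. A stand-in for a real BPE/WordPiece tokenizer."""
    vocab: dict = {}
    ids: List[int] = []
    for word in text.split():
        ids.append(vocab.setdefault(word, len(vocab)))
    return ids

def get_num_tokens(text: str) -> int:
    """Token count is just the length of the id list."""
    return len(get_token_ids(text))
```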
1ae35bf65e45-155
https://python.langchain.com/en/latest/reference/modules/llms.html
Wrapper around Google Vertex AI large language models. Validators raise_deprecation » all fields set_verbose » verbose validate_environment » all fields field credentials: Any = None# The default custom credentials (google.auth.credentials.Credentials) to use field location: str = 'us-central1'# The default location to...
1ae35bf65e45-156
https://python.langchain.com/en/latest/reference/modules/llms.html
async agenerate_prompt(prompts: List[langchain.schema.PromptValue], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]] = None) → langchain.schema.LLMResult# Take in a list of prompt values and return an LLMResult...
1ae35bf65e45-157
https://python.langchain.com/en/latest/reference/modules/llms.html
generate(prompts: List[str], stop: Optional[List[str]] = None, callbacks: Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]] = None) → langchain.schema.LLMResult# Run the LLM on the given prompt and input. generate_prompt(prompts: List[langchain.schema.Prom...
1ae35bf65e45-158
https://python.langchain.com/en/latest/reference/modules/llms.html
predict_messages(messages: List[langchain.schema.BaseMessage], *, stop: Optional[Sequence[str]] = None) → langchain.schema.BaseMessage# Predict message from messages. save(file_path: Union[pathlib.Path, str]) → None# Save the LLM. Parameters file_path – Path to file to save the LLM to. Example: .. code-block:: python l...
1ae35bf65e45-159
https://python.langchain.com/en/latest/reference/modules/llms.html
Penalizes repeated tokens according to frequency. field stop: Optional[List[str]] = None# Sequences when completion generation will stop. field temperature: Optional[float] = None# What sampling temperature to use. field top_p: Optional[float] = None# Total probability mass of tokens to consider at each step. field ver...
1ae35bf65e45-160
https://python.langchain.com/en/latest/reference/modules/llms.html
classmethod construct(_fields_set: Optional[SetStr] = None, **values: Any) → Model# Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = ‘allow’ was set since it adds all passed values...
1ae35bf65e45-161
https://python.langchain.com/en/latest/reference/modules/llms.html
get_num_tokens(text: str) → int# Get the number of tokens present in the text. get_num_tokens_from_messages(messages: List[langchain.schema.BaseMessage]) → int# Get the number of tokens in the message. get_token_ids(text: str) → List[int]# Get the tokens present in the text. json(*, include: Optional[Union[AbstractSetIn...
1ae35bf65e45-162
https://python.langchain.com/en/latest/reference/modules/llms.html
© Copyright 2023, Harrison Chase. Last updated on Jun 04, 2023.
9c5eb9c561e7-0
https://python.langchain.com/en/latest/reference/modules/embeddings.html
Embeddings# Wrappers around embedding modules. pydantic model langchain.embeddings.AlephAlphaAsymmetricSemanticEmbedding[source]# Wrapper for Aleph Alpha’s Asymmetric Embeddings AA provides you with an endpoint to embed a document and a query. The models were optimized to make the embeddings of doc...
9c5eb9c561e7-1
https://python.langchain.com/en/latest/reference/modules/embeddings.html
Call out to Aleph Alpha’s asymmetric Document endpoint. Parameters texts – The list of texts to embed. Returns List of embeddings, one for each text. embed_query(text: str) → List[float][source]# Call out to Aleph Alpha’s asymmetric, query embedding endpoint :param text: The text to embed. Returns Embeddings for the te...
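The embeddings interface is two methods: `embed_documents` for a batch of texts and `embed_query` for a single string. The hash-based embedder below is a deterministic toy stand-in for the remote endpoint; note that the real Aleph Alpha models are asymmetric (documents and queries are embedded differently), while this sketch is symmetric:

```python
import hashlib
from typing import List

def _embed(text: str, dim: int = 4) -> List[float]:
    """Deterministic toy embedding: hash bytes mapped to [0, 1] floats.
    A stand-in for a call to a real embedding endpoint."""
    digest = hashlib.sha256(text.encode()).digest()
    return [b / 255.0 for b in digest[:dim]]

def embed_documents(texts: List[str]) -> List[List[float]]:
    """One embedding per input text."""
    return [_embed(t) for t in texts]

def embed_query(text: str) -> List[float]:
    """Embedding for a single query string."""
    return _embed(text)
```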
9c5eb9c561e7-2
https://python.langchain.com/en/latest/reference/modules/embeddings.html
Make sure the credentials / roles used have the required policies to access the Bedrock service. field credentials_profile_name: Optional[str] = None# The name of the profile in the ~/.aws/credentials or ~/.aws/config files, which has either access keys or role information specified. If not specified, the default crede...
9c5eb9c561e7-3
https://python.langchain.com/en/latest/reference/modules/embeddings.html
To use, you should have the cohere python package installed, and the environment variable COHERE_API_KEY set with your API key or pass it as a named parameter to the constructor. Example from langchain.embeddings import CohereEmbeddings cohere = CohereEmbeddings( model="embed-english-light-v2.0", cohere_api_key="my...
9c5eb9c561e7-4
https://python.langchain.com/en/latest/reference/modules/embeddings.html
texts (List[str]) – A list of document text strings to generate embeddings for. Returns A list of embeddings, one for each document in the input list. Return type List[List[float]] embed_query(text: str) → List[float][source]# Generate an embedding for a single query text. Parameters text (str) – The query text to gener...
9c5eb9c561e7-5
https://python.langchain.com/en/latest/reference/modules/embeddings.html
# es_user="bar", # es_password="baz", ) documents = [ "This is an example document.", "Another example document to generate embeddings for.", ] embeddings_generator.embed_documents(documents) classmethod from_es_connection(model_id: str, es_connection: Elasticsearch, input_field: str = 'text_field') → Elast...
9c5eb9c561e7-6
https://python.langchain.com/en/latest/reference/modules/embeddings.html
pydantic model langchain.embeddings.FakeEmbeddings[source]# embed_documents(texts: List[str]) → List[List[float]][source]# Embed search docs. embed_query(text: str) → List[float][source]# Embed query text. pydantic model langchain.embeddings.HuggingFaceEmbeddings[source]# Wrapper around sentence_transformers embedding ...
9c5eb9c561e7-7
https://python.langchain.com/en/latest/reference/modules/embeddings.html
pydantic model langchain.embeddings.HuggingFaceHubEmbeddings[source]# Wrapper around HuggingFaceHub embedding models. To use, you should have the huggingface_hub python package installed, and the environment variable HUGGINGFACEHUB_API_TOKEN set with your API token, or pass it as a named parameter to the constructor. E...
9c5eb9c561e7-8
https://python.langchain.com/en/latest/reference/modules/embeddings.html
model_kwargs = {'device': 'cpu'} encode_kwargs = {'normalize_embeddings': True} hf = HuggingFaceInstructEmbeddings( model_name=model_name, model_kwargs=model_kwargs, encode_kwargs=encode_kwargs ) field cache_folder: Optional[str] = None# Path to store models. Can be also set by SENTENCE_TRANSFORMERS_HOME en...
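The `encode_kwargs = {'normalize_embeddings': True}` setting asks sentence-transformers to L2-normalise each vector, so dot product equals cosine similarity. A sketch of what that normalisation does:

```python
import math
from typing import List

def normalize_embedding(vec: List[float]) -> List[float]:
    """L2-normalise a vector to unit length, as normalize_embeddings=True
    requests; unit vectors make dot product equal cosine similarity."""
    norm = math.sqrt(sum(x * x for x in vec))
    return [x / norm for x in vec]
```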
9c5eb9c561e7-9
https://python.langchain.com/en/latest/reference/modules/embeddings.html
llama = LlamaCppEmbeddings(model_path="/path/to/model.bin") field f16_kv: bool = False# Use half-precision for key/value cache. field logits_all: bool = False# Return logits for all tokens, not just the last token. field n_batch: Optional[int] = 8# Number of tokens to process in parallel. Should be a number between 1 a...
9c5eb9c561e7-10
https://python.langchain.com/en/latest/reference/modules/embeddings.html
MINIMAX_API_KEY set with your API token, or pass it as a named parameter to the constructor. Example from langchain.embeddings import MiniMaxEmbeddings embeddings = MiniMaxEmbeddings() query_text = "This is a test query." query_result = embeddings.embed_query(query_text) document_text = "This is a test document." docum...
9c5eb9c561e7-11
https://python.langchain.com/en/latest/reference/modules/embeddings.html
field model_id: str = 'damo/nlp_corom_sentence-embedding_english-base'# Model name to use. embed_documents(texts: List[str]) → List[List[float]][source]# Compute doc embeddings using a modelscope embedding model. Parameters texts – The list of texts to embed. Returns List of embeddings, one for each text. embed_query(t...
9c5eb9c561e7-12
https://python.langchain.com/en/latest/reference/modules/embeddings.html
Embed documents using a MosaicML deployed instructor embedding model. Parameters texts – The list of texts to embed. Returns List of embeddings, one for each text. embed_query(text: str) → List[float][source]# Embed a query using a MosaicML deployed instructor embedding model. Parameters text – The text to embed. Retur...
9c5eb9c561e7-13
https://python.langchain.com/en/latest/reference/modules/embeddings.html
api_base="https://your-endpoint.openai.azure.com/", api_type="azure", ) text = "This is a test query." query_result = embeddings.embed_query(text) field chunk_size: int = 1000# Maximum number of texts to embed in each batch field max_retries: int = 6# Maximum number of retries to make when generating. field request...
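The `chunk_size` and `max_retries` fields above describe batching with retry. A sketch of how an embedder might combine the two, with hypothetical helper names and a shortened backoff for illustration:

```python
import time
from typing import Callable, List

def embed_in_chunks(texts: List[str],
                    embed_batch: Callable[[List[str]], List[List[float]]],
                    chunk_size: int = 1000,
                    max_retries: int = 6) -> List[List[float]]:
    """Split `texts` into batches of `chunk_size` and retry each batch with
    exponential backoff, mirroring the chunk_size / max_retries fields."""
    results: List[List[float]] = []
    for start in range(0, len(texts), chunk_size):
        batch = texts[start:start + chunk_size]
        for attempt in range(max_retries):
            try:
                results.extend(embed_batch(batch))
                break
            except Exception:
                if attempt == max_retries - 1:
                    raise
                time.sleep(2 ** attempt * 0.01)  # short backoff for the sketch
    return results
```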
9c5eb9c561e7-14
https://python.langchain.com/en/latest/reference/modules/embeddings.html
See: https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html field content_handler: langchain.embeddings.sagemaker_endpoint.EmbeddingsContentHandler [Required]# The content handler class that provides the input and output transform functions to handle formats between the LLM and the endpoint. field credentials...
9c5eb9c561e7-15
https://python.langchain.com/en/latest/reference/modules/embeddings.html
embed_query(text: str) → List[float][source]# Compute query embeddings using a SageMaker inference endpoint. Parameters text – The text to embed. Returns Embeddings for the text. pydantic model langchain.embeddings.SelfHostedEmbeddings[source]# Runs custom embedding models on self-hosted remote hardware. Supported hard...
9c5eb9c561e7-16
https://python.langchain.com/en/latest/reference/modules/embeddings.html
embeddings = SelfHostedHFEmbeddings.from_pipeline( pipeline="models/pipeline.pkl", hardware=gpu, model_reqs=["./", "torch", "transformers"], ) Validators raise_deprecation » all fields set_verbose » verbose field inference_fn: Callable = <function _embed_documents># Inference function to extract the embeddi...
9c5eb9c561e7-17
https://python.langchain.com/en/latest/reference/modules/embeddings.html
field hardware: Any = None# Remote hardware to send the inference function to. field inference_fn: Callable = <function _embed_documents># Inference function to extract the embeddings. field load_fn_kwargs: Optional[dict] = None# Keyword arguments to pass to the model load function. field model_id: str = 'sentence-tra...
9c5eb9c561e7-18
https://python.langchain.com/en/latest/reference/modules/embeddings.html
field model_reqs: List[str] = ['./', 'InstructorEmbedding', 'torch']# Requirements to install on hardware to inference the model. field query_instruction: str = 'Represent the question for retrieving supporting documents: '# Instruction to use for embedding query. embed_documents(texts: List[str]) → List[List[float]][s...
5adc047fdeb4-0
https://python.langchain.com/en/latest/getting_started/getting_started.html
Quickstart Guide Contents Installation Environment Setup Building a Language Model Application: LLMs LLMs: Get predictions from a language model Prompt Templates: Manage prompts for LLMs Chains: Combine LLMs and prompts in multi-step workflows Agents: Dynamically Call Chains Based on User Input Memory: Add S...
5adc047fdeb4-1
https://python.langchain.com/en/latest/getting_started/getting_started.html
LangChain provides many modules that can be used to build language model applications. Modules can be combined to create more complex applications, or be used individually for simple applications. LLMs: Get predictions from a language model# The most basic building block of LangChain is calling an LLM on some input. Le...
5adc047fdeb4-2
https://python.langchain.com/en/latest/getting_started/getting_started.html
from langchain.prompts import PromptTemplate prompt = PromptTemplate( input_variables=["product"], template="What is a good name for a company that makes {product}?", ) Let’s now see how this works! We can call the .format method to format it. print(prompt.format(product="colorful socks")) What is a good name f...
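A prompt template is essentially named slots plus `str.format`. A minimal re-implementation of the behaviour shown above, as a sketch rather than the library's actual class:

```python
from typing import List

class PromptTemplate:
    """Minimal sketch of a prompt template: named slots filled via str.format."""

    def __init__(self, input_variables: List[str], template: str):
        self.input_variables = input_variables
        self.template = template

    def format(self, **kwargs: str) -> str:
        # Fail loudly if a declared variable was not supplied.
        missing = set(self.input_variables) - set(kwargs)
        if missing:
            raise KeyError(f"missing variables: {missing}")
        return self.template.format(**kwargs)
```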
5adc047fdeb4-3
https://python.langchain.com/en/latest/getting_started/getting_started.html
There we go! There’s the first chain - an LLM Chain. This is one of the simpler types of chains, but understanding how it works will set you up well for working with more complex chains. For more details, check out the getting started guide for chains. Agents: Dynamically Call Chains Based on User Input# So far the cha...
5adc047fdeb4-4
https://python.langchain.com/en/latest/getting_started/getting_started.html
from langchain.agents import initialize_agent from langchain.agents import AgentType from langchain.llms import OpenAI # First, let's load the language model we're going to use to control the agent. llm = OpenAI(temperature=0) # Next, let's load some tools to use. Note that the `llm-math` tool uses an LLM, so we need t...
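At its core, an agent loop asks the model to pick a tool and an input, runs the tool, and feeds the observation back until a final answer emerges. A single-step sketch; the `plan`/`tools` shapes here are illustrative, not LangChain's actual agent API:

```python
from typing import Callable, Dict, Tuple

def run_agent(question: str,
              plan: Callable[[str], Tuple[str, str]],
              tools: Dict[str, Callable[[str], str]]) -> str:
    """One step of a ReAct-style loop: the planner picks a tool and its input,
    the tool runs, and its observation is returned. Real agents iterate
    until the model emits a final answer instead of an action."""
    action, action_input = plan(question)
    if action not in tools:
        raise KeyError(f"unknown tool: {action}")
    return tools[action](action_input)
```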
5adc047fdeb4-5
https://python.langchain.com/en/latest/getting_started/getting_started.html
So far, all the chains and agents we’ve gone through have been stateless. But often, you may want a chain or agent to have some concept of “memory” so that it may remember information about its previous interactions. The clearest and simplest example of this is when designing a chatbot - you want it to remember previous ...
5adc047fdeb4-6
https://python.langchain.com/en/latest/getting_started/getting_started.html
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know. Current conversation: Human: Hi there! AI: Hello! How are you today? Human: I'm doing we...
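A buffer memory simply stores every turn and replays the transcript into the next prompt, producing exactly the "Current conversation" block above. A minimal sketch mirroring the idea behind `ConversationBufferMemory`, not its exact API:

```python
class ConversationBufferMemory:
    """Sketch of buffer memory: store every turn, replay them verbatim."""

    def __init__(self):
        self.turns = []

    def save_context(self, human: str, ai: str) -> None:
        """Record one human/AI exchange."""
        self.turns.append((human, ai))

    def load_memory(self) -> str:
        """Render the transcript for insertion into the next prompt."""
        return "\n".join(f"Human: {h}\nAI: {a}" for h, a in self.turns)
```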
5adc047fdeb4-7
https://python.langchain.com/en/latest/getting_started/getting_started.html
You can also pass in multiple messages for OpenAI’s gpt-3.5-turbo and gpt-4 models. messages = [ SystemMessage(content="You are a helpful assistant that translates English to French."), HumanMessage(content="I love programming.") ] chat(messages) # -> AIMessage(content="J'aime programmer.", additional_kwargs={}...
5adc047fdeb4-8
https://python.langchain.com/en/latest/getting_started/getting_started.html
Similar to LLMs, you can make use of templating by using a MessagePromptTemplate. You can build a ChatPromptTemplate from one or more MessagePromptTemplates. You can use ChatPromptTemplate’s format_prompt – this returns a PromptValue, which you can convert to a string or Message object, depending on whether you want to...
5adc047fdeb4-9
https://python.langchain.com/en/latest/getting_started/getting_started.html
system_message_prompt = SystemMessagePromptTemplate.from_template(template) human_template = "{text}" human_message_prompt = HumanMessagePromptTemplate.from_template(human_template) chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt]) chain = LLMChain(llm=chat, prompt=chat_promp...
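`from_messages` plus `format_prompt` amounts to filling each message template and returning the message list. A stand-in sketch using plain dataclasses; because `str.format` ignores extra keyword arguments, the two templates can use different variables:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Message:
    role: str      # "system" or "human"
    content: str

def format_chat_prompt(system_template: str, human_template: str,
                       **kwargs: str) -> List[Message]:
    """Sketch of ChatPromptTemplate.format_prompt: fill both templates and
    return the resulting message list."""
    return [
        Message("system", system_template.format(**kwargs)),
        Message("human", human_template.format(**kwargs)),
    ]
```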
5adc047fdeb4-10
https://python.langchain.com/en/latest/getting_started/getting_started.html
Thought: I need to use a search engine to find Olivia Wilde's boyfriend and a calculator to raise his age to the 0.23 power. Action: { "action": "Search", "action_input": "Olivia Wilde boyfriend" } Observation: Sudeikis and Wilde's relationship ended in November 2020. Wilde was publicly served with court documents ...
5adc047fdeb4-11
https://python.langchain.com/en/latest/getting_started/getting_started.html
prompt = ChatPromptTemplate.from_messages([ SystemMessagePromptTemplate.from_template("The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know...
3d0e4bc9ebee-0
https://python.langchain.com/en/latest/getting_started/concepts.html
Concepts Contents Chain of Thought Action Plan Generation ReAct Self-ask Prompt Chaining Memetic Proxy Self Consistency Inception MemPrompt Concepts# These are concepts and terminology commonly used when developing LLM applications. It contains references to external papers or sources where the concept was fi...
3d0e4bc9ebee-1
https://python.langchain.com/en/latest/getting_started/concepts.html
will result in that type of response. For example, as a conversation between a student and a teacher. Paper Self Consistency# Self Consistency is a decoding strategy that samples a diverse set of reasoning paths and then selects the most consistent answer. It is most effective when combined with Chain-of-thought prompting...
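Self Consistency reduces to: sample several answers and return the mode. A sketch of the majority-vote step:

```python
from collections import Counter
from typing import Callable, List

def self_consistency(sample_answer: Callable[[], str], n: int = 5) -> str:
    """Sample n reasoning paths and return the most common final answer
    (majority vote), per the Self Consistency decoding strategy."""
    answers: List[str] = [sample_answer() for _ in range(n)]
    return Counter(answers).most_common(1)[0][0]
```

In practice each `sample_answer()` call is one chain-of-thought completion at nonzero temperature; only the final answers are compared, not the reasoning text.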
e3a29f8afabb-0
https://python.langchain.com/en/latest/getting_started/tutorials.html
Tutorials Contents DeepLearning.AI course Handbook Tutorials Tutorials# ⛓ icon marks a new addition [last update 2023-05-15] DeepLearning.AI course# ⛓LangChain for LLM Application Development by Harrison Chase presented by Andrew Ng Handbook# LangChain AI Handbook By James Briggs and Francisco Ingham Tutoria...
e3a29f8afabb-1
https://python.langchain.com/en/latest/getting_started/tutorials.html
Connect Google Drive Files To OpenAI YouTube Transcripts + OpenAI Question A 300 Page Book (w/ OpenAI + Pinecone) Workaround OpenAI's Token Limit With Chain Types Build Your Own OpenAI + LangChain Web App in 23 Minutes Working With The New ChatGPT API OpenAI + LangChain Wrote Me 100 Custom Sales Emails Structured Outpu...
e3a29f8afabb-2
https://python.langchain.com/en/latest/getting_started/tutorials.html
⛓ Using LangChain with DuckDuckGO Wikipedia & PythonREPL Tools ⛓ Building Custom Tools and Agents with LangChain (gpt-3.5-turbo) ⛓ LangChain Retrieval QA Over Multiple Files with ChromaDB ⛓ LangChain Retrieval QA with Instructor Embeddings & ChromaDB for PDFs ⛓ LangChain + Retrieval Local LLMs for Retrieval QA - No Ope...
e3a29f8afabb-3
https://python.langchain.com/en/latest/getting_started/tutorials.html
Analyze Custom CSV Data with GPT-4 using Langchain ⛓ Build ChatGPT Chatbots with LangChain Memory: Understanding and Implementing Memory in Conversations