| id | text | source |
|---|---|---|
0e0436a883fe-11 | Take in a list of prompt values and return an LLMResult.
Parameters
prompts (List[langchain.schema.PromptValue]) –
stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
kwargs (Any) –
Return type
langchain.schema.... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-12 | Return type
List[int]
json(*, include=None, exclude=None, by_alias=False, skip_defaults=None, exclude_unset=False, exclude_defaults=False, exclude_none=False, encoder=None, models_as_dict=True, **dumps_kwargs)¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optio... | https://api.python.langchain.com/en/latest/modules/llms.html |
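The json() entries above follow pydantic's pattern: an optional encoder callable acts as the fallback for values that json.dumps cannot serialize natively. A minimal stdlib-only sketch of that fallback idea (the Point class and payload here are illustrative stand-ins, not part of LangChain):

```python
import json
from dataclasses import dataclass

@dataclass
class Point:
    x: int
    y: int

def fallback_encoder(obj):
    # json.dumps calls this only for objects it cannot serialize itself.
    if isinstance(obj, Point):
        return {"x": obj.x, "y": obj.y}
    raise TypeError(f"Cannot serialize {type(obj).__name__}")

payload = {"name": "demo", "location": Point(1, 2)}
print(json.dumps(payload, default=fallback_encoder))
```

Passing the encoder via `default=` mirrors how a custom `encoder` argument is threaded through to json.dumps.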
0e0436a883fe-13 | models_as_dict (bool) –
dumps_kwargs (Any) –
Return type
unicode
predict(text, *, stop=None, **kwargs)¶
Predict text from text.
Parameters
text (str) –
stop (Optional[Sequence[str]]) –
kwargs (Any) –
Return type
str
predict_messages(messages, *, stop=None, **kwargs)¶
Predict message from messages.
Parameters
messa... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-14 | classmethod update_forward_refs(**localns)¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
Parameters
localns (Any) –
Return type
None
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by th... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-15 | class langchain.llms.AlephAlpha(*, cache=None, verbose=None, callbacks=None, callback_manager=None, tags=None, client=None, model='luminous-base', maximum_tokens=64, temperature=0.0, top_k=0, top_p=0.0, presence_penalty=0.0, frequency_penalty=0.0, repetition_penalties_include_prompt=False, use_multiplicative_presence_p... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-16 | completion_bias_exclusion_first_token_only=False, contextual_control_threshold=None, control_log_additive=True, repetition_penalties_include_completion=True, raw_completion=False, aleph_alpha_api_key=None, stop_sequences=None)[source]¶ | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-17 | Bases: langchain.llms.base.LLM
Wrapper around Aleph Alpha large language models.
To use, you should have the aleph_alpha_client python package installed, and the
environment variable ALEPH_ALPHA_API_KEY set with your API key, or pass
it as a named parameter to the constructor.
Parameters are explained more in depth her... | https://api.python.langchain.com/en/latest/modules/llms.html |
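The key-resolution order described above (an explicit constructor argument first, otherwise the ALEPH_ALPHA_API_KEY environment variable) is a common client pattern. A stdlib-only sketch of that precedence; resolve_api_key is a hypothetical helper, not a LangChain function:

```python
import os

def resolve_api_key(explicit_key=None, env_var="ALEPH_ALPHA_API_KEY"):
    # An explicitly passed key wins; otherwise fall back to the environment.
    key = explicit_key or os.environ.get(env_var)
    if key is None:
        raise ValueError(f"Pass the key directly or set {env_var}")
    return key

os.environ["ALEPH_ALPHA_API_KEY"] = "from-env"
print(resolve_api_key())            # from-env
print(resolve_api_key("explicit"))  # explicit
```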
0e0436a883fe-18 | verbose (bool) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
callback_manager (Optional[langchain.callbacks.base.BaseCallbackManager]) –
tags (Optional[List[str]]) –
client (Any) –
model (Optional[str]) –
maximum_tokens (int) –
t... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-19 | n (int) –
logit_bias (Optional[Dict[int, float]]) –
log_probs (Optional[int]) –
tokens (Optional[bool]) –
disable_optimizations (Optional[bool]) –
minimum_tokens (Optional[int]) –
echo (bool) –
use_multiplicative_frequency_penalty (bool) –
sequence_penalty (float) –
sequence_penalty_min_length (int) –
use_mul... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-20 | raw_completion (bool) –
aleph_alpha_api_key (Optional[str]) –
stop_sequences (Optional[List[str]]) –
Return type
None
attribute aleph_alpha_api_key: Optional[str] = None¶
API key for Aleph Alpha API.
attribute best_of: Optional[int] = None¶
returns the one with the "best of" results
(highest log probability per toke... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-21 | True: apply control by adding the log(control_factor) to attention scores.
False: (attention_scores - attention_scores.min(-1)) * control_factor
attribute echo: bool = False¶
Echo the prompt in the completion.
attribute frequency_penalty: float = 0.0¶
Penalizes repeated tokens according to frequency.
attribute log_pr... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-22 | Model name to use.
attribute n: int = 1¶
How many completions to generate for each prompt.
attribute penalty_bias: Optional[str] = None¶
Penalty bias for the completion.
attribute penalty_exceptions: Optional[List[str]] = None¶
List of strings that may be generated without penalty,
regardless of other penalty settings
... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-23 | Flag deciding whether presence penalty or frequency penalty are
updated from the prompt.
attribute stop_sequences: Optional[List[str]] = None¶
Stop sequences to use.
attribute tags: Optional[List[str]] = None¶
Tags to add to the run trace.
attribute temperature: float = 0.0¶
A non-negative float that tunes the degree o... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-24 | attribute verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt, stop=None, callbacks=None, **kwargs)¶
Check Cache and run the LLM on the given prompt and input.
Parameters
prompt (str) –
stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler],... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-25 | tags (Optional[List[str]]) –
kwargs (Any) –
Return type
langchain.schema.LLMResult
async agenerate_prompt(prompts, stop=None, callbacks=None, **kwargs)¶
Take in a list of prompt values and return an LLMResult.
Parameters
prompts (List[langchain.schema.PromptValue]) –
stop (Optional[List[str]]) –
callbacks (Optional... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-26 | Predict message from messages.
Parameters
messages (List[langchain.schema.BaseMessage]) –
stop (Optional[Sequence[str]]) –
kwargs (Any) –
Return type
langchain.schema.BaseMessage
classmethod construct(_fields_set=None, **values)¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated d... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-27 | exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to exclude from new model, as with values this takes precedence over include
update (Optional[DictStrAny]) – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep (bool... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-28 | stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
langchain.schema.LLMResult
generate_prompt(prompts, stop=None, callbacks=None, **kwargs)¶
Take in a lis... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-29 | Return type
int
get_num_tokens_from_messages(messages)¶
Get the number of tokens in the message.
Parameters
messages (List[langchain.schema.BaseMessage]) –
Return type
int
get_token_ids(text)¶
Get the token IDs present in the text.
Parameters
text (str) –
Return type
List[int]
json(*, include=None, exclude=None, by_alias... | https://api.python.langchain.com/en/latest/modules/llms.html |
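Per the signatures above, get_num_tokens is typically just the length of the list returned by get_token_ids. A stdlib-only sketch of that relationship using a naive whitespace tokenizer (real implementations use a proper subword tokenizer, so these functions are stand-ins, not LangChain's):

```python
def get_token_ids(text: str) -> list:
    # Naive stand-in: assign one "token id" per distinct whitespace-separated word.
    vocab = {}
    return [vocab.setdefault(word, len(vocab)) for word in text.split()]

def get_num_tokens(text: str) -> int:
    # Token count is just the length of the id list.
    return len(get_token_ids(text))

print(get_token_ids("the cat sat on the mat"))  # [0, 1, 2, 3, 0, 4]
print(get_num_tokens("the cat sat on the mat"))  # 6
```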
0e0436a883fe-30 | exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) β
by_alias (bool) β
skip_defaults (Optional[bool]) β
exclude_unset (bool) β
exclude_defaults (bool) β
exclude_none (bool) β
encoder (Optional[Callable[[Any], Any]]) β
models_as_dict (bool) β
dumps_kwargs (Any) β
Return type
unicode
predict(text, *,... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-31 | kwargs (Any) β
Return type
langchain.schema.BaseMessage
save(file_path)¶
Save the LLM.
Parameters
file_path (Union[pathlib.Path, str]) – Path to file to save the LLM to.
Return type
None
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
classmethod update_forward_refs(**localns)¶
Try to update Forwar... | https://api.python.langchain.com/en/latest/modules/llms.html |
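save() persists the model's constructor parameters to a .yaml or .json file so an equivalent object can be rebuilt later. A rough stdlib-only sketch of the JSON variant of that pattern; save_params and load_params are hypothetical stand-ins for LangChain's actual serializer:

```python
import json
import tempfile
from pathlib import Path

def save_params(params: dict, file_path) -> None:
    # Write the constructor parameters so the object can be rebuilt later.
    Path(file_path).write_text(json.dumps(params, indent=2))

def load_params(file_path) -> dict:
    return json.loads(Path(file_path).read_text())

with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "llm.json"
    params = {"model": "luminous-base", "temperature": 0.0}
    save_params(params, path)
    assert load_params(path) == params
```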
0e0436a883fe-32 | property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {"openai_api_key": "OPENAI_API_KEY"}
property lc_serializable: bool¶
Return whether or not the class is serializable.
class langchain.llms.AmazonAPIGateway(*, cache=None, verbose=None, callbacks=None, callback_manager=Non... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-33 | tags (Optional[List[str]]) –
api_url (str) –
model_kwargs (Optional[Dict]) –
content_handler (langchain.llms.amazon_api_gateway.ContentHandlerAmazonAPIGateway) –
Return type
None
attribute api_url: str [Required]¶
API Gateway URL
attribute content_handler: langchain.llms.amazon_api_gateway.ContentHandlerAmazonAPIGa... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-34 | Whether to print out response text.
__call__(prompt, stop=None, callbacks=None, **kwargs)¶
Check Cache and run the LLM on the given prompt and input.
Parameters
prompt (str) –
stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallba... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-35 | kwargs (Any) –
Return type
langchain.schema.LLMResult
async agenerate_prompt(prompts, stop=None, callbacks=None, **kwargs)¶
Take in a list of prompt values and return an LLMResult.
Parameters
prompts (List[langchain.schema.PromptValue]) –
stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callback... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-36 | Predict message from messages.
Parameters
messages (List[langchain.schema.BaseMessage]) –
stop (Optional[Sequence[str]]) –
kwargs (Any) –
Return type
langchain.schema.BaseMessage
classmethod construct(_fields_set=None, **values)¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated d... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-37 | exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to exclude from new model, as with values this takes precedence over include
update (Optional[DictStrAny]) – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep (bool... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-38 | stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
langchain.schema.LLMResult
generate_prompt(prompts, stop=None, callbacks=None, **kwargs)¶
Take in a lis... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-39 | Return type
int
get_num_tokens_from_messages(messages)¶
Get the number of tokens in the message.
Parameters
messages (List[langchain.schema.BaseMessage]) –
Return type
int
get_token_ids(text)¶
Get the token IDs present in the text.
Parameters
text (str) –
Return type
List[int]
json(*, include=None, exclude=None, by_alias... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-40 | exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
by_alias (bool) –
skip_defaults (Optional[bool]) –
exclude_unset (bool) –
exclude_defaults (bool) –
exclude_none (bool) –
encoder (Optional[Callable[[Any], Any]]) –
models_as_dict (bool) –
dumps_kwargs (Any) –
Return type
unicode
predict(text, *,... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-41 | kwargs (Any) –
Return type
langchain.schema.BaseMessage
save(file_path)¶
Save the LLM.
Parameters
file_path (Union[pathlib.Path, str]) – Path to file to save the LLM to.
Return type
None
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
classmethod update_forward_refs(**localns)¶
Try to update Forwar... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-42 | property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {"openai_api_key": "OPENAI_API_KEY"}
property lc_serializable: bool¶
Return whether or not the class is serializable.
class langchain.llms.Anthropic(*, client=None, model='claude-v1', max_tokens_to_sample=256, temperature... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-43 | Wrapper around Anthropic's large language models.
To use, you should have the anthropic python package installed, and the
environment variable ANTHROPIC_API_KEY set with your API key, or pass
it as a named parameter to the constructor.
Example
import anthropic
from langchain.llms import Anthropic
model = Anthropic(mode... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-44 | response = model(prompt)
Parameters
client (Any) –
model (str) –
max_tokens_to_sample (int) –
temperature (Optional[float]) –
top_k (Optional[int]) –
top_p (Optional[float]) –
streaming (bool) –
default_request_timeout (Optional[Union[float, Tuple[float, float]]]) –
anthropic_api_url (Optional[str]) –
anthropi... | https://api.python.langchain.com/en/latest/modules/llms.html |
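Anthropic's pre-Messages completion API, which this wrapper targets, expects prompts framed as alternating Human/Assistant turns (the same framing appears verbatim in the stream() docstring later in this page). A small sketch of that formatting; to_anthropic_prompt is an illustrative helper, not part of the library:

```python
def to_anthropic_prompt(user_text: str) -> str:
    # The claude-v1-era completion API requires the Human/Assistant framing,
    # ending with "Assistant:" so the model knows to continue from there.
    return f"\n\nHuman: {user_text}\n\nAssistant:"

prompt = to_anthropic_prompt("What are the biggest risks facing humanity?")
print(repr(prompt))
```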
0e0436a883fe-45 | tags (Optional[List[str]]) β
Return type
None
attribute default_request_timeout: Optional[Union[float, Tuple[float, float]]] = Noneο
Timeout for requests to Anthropic Completion API. Default is 600 seconds.
attribute max_tokens_to_sample: int = 256ο
Denotes the number of tokens to predict per generation.
attribute mod... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-46 | Total probability mass of tokens to consider at each step.
attribute verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt, stop=None, callbacks=None, **kwargs)¶
Check Cache and run the LLM on the given prompt and input.
Parameters
prompt (str) –
stop (Optional[List[str]]) –
callbacks (Optiona... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-47 | tags (Optional[List[str]]) –
kwargs (Any) –
Return type
langchain.schema.LLMResult
async agenerate_prompt(prompts, stop=None, callbacks=None, **kwargs)¶
Take in a list of prompt values and return an LLMResult.
Parameters
prompts (List[langchain.schema.PromptValue]) –
stop (Optional[List[str]]) –
callbacks (Optional... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-48 | Predict message from messages.
Parameters
messages (List[langchain.schema.BaseMessage]) –
stop (Optional[Sequence[str]]) –
kwargs (Any) –
Return type
langchain.schema.BaseMessage
classmethod construct(_fields_set=None, **values)¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated d... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-49 | exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to exclude from new model, as with values this takes precedence over include
update (Optional[DictStrAny]) – values to change/add in the new model. Note: the data is not validated before creating
the new model: you should trust this data
deep (bool... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-50 | stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
langchain.schema.LLMResult
generate_prompt(prompts, stop=None, callbacks=None, **kwargs)¶
Take in a lis... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-51 | Return type
int
get_num_tokens_from_messages(messages)¶
Get the number of tokens in the message.
Parameters
messages (List[langchain.schema.BaseMessage]) –
Return type
int
get_token_ids(text)¶
Get the token IDs present in the text.
Parameters
text (str) –
Return type
List[int]
json(*, include=None, exclude=None, by_alias... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-52 | exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
by_alias (bool) –
skip_defaults (Optional[bool]) –
exclude_unset (bool) –
exclude_defaults (bool) –
exclude_none (bool) –
encoder (Optional[Callable[[Any], Any]]) –
models_as_dict (bool) –
dumps_kwargs (Any) –
Return type
unicode
predict(text, *,... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-53 | kwargs (Any) –
Return type
langchain.schema.BaseMessage
save(file_path)¶
Save the LLM.
Parameters
file_path (Union[pathlib.Path, str]) – Path to file to save the LLM to.
Return type
None
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
stream(prompt, stop=None)[source]¶
Call Anthropic completion_str... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-54 | prompt = f"\n\nHuman: {prompt}\n\nAssistant:"
generator = anthropic.stream(prompt)
for token in generator:
yield token
classmethod update_forward_refs(**localns)¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
Parameters
localns (Any) –
Return type
None
property lc_attributes: Dict¶... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-55 | property lc_serializable: bool¶
Return whether or not the class is serializable.
class langchain.llms.Anyscale(*, cache=None, verbose=None, callbacks=None, callback_manager=None, tags=None, model_kwargs=None, anyscale_service_url=None, anyscale_service_route=None, anyscale_service_token=None)[source]¶
Bases: langchain.... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-56 | prompt_list = []
@ray.remote
def send_query(llm, prompt):
resp = llm(prompt)
return resp
futures = [send_query.remote(anyscale, prompt) for prompt in prompt_list]
results = ray.get(futures)
Parameters
cache (Optional[bool]) –
verbose (bool) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackH... | https://api.python.langchain.com/en/latest/modules/llms.html |
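The Ray snippet above fans one query per prompt out to remote workers and gathers the results. The same fan-out/gather shape can be sketched with only the standard library's concurrent.futures; fake_llm here is a stand-in for a real network-bound llm(prompt) call, not the Anyscale client:

```python
from concurrent.futures import ThreadPoolExecutor

def fake_llm(prompt: str) -> str:
    # Stand-in for a network-bound llm(prompt) call.
    return prompt.upper()

prompt_list = ["tell me a joke", "tell me a fact"]
with ThreadPoolExecutor() as pool:
    # Submit one task per prompt, then gather results in input order.
    results = list(pool.map(fake_llm, prompt_list))
print(results)
```

Threads suit I/O-bound LLM calls within one process; Ray's advantage is scaling the same pattern across machines.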
0e0436a883fe-57 | attribute tags: Optional[List[str]] = None¶
Tags to add to the run trace.
attribute verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt, stop=None, callbacks=None, **kwargs)¶
Check Cache and run the LLM on the given prompt and input.
Parameters
prompt (str) –
stop (Optional[List[str]]) –
cal... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-58 | stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
langchain.schema.LLMResult
async agenerate_prompt(prompts, stop=None, callbacks=None, **kwargs)¶
Take i... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-59 | Parameters
text (str) –
stop (Optional[Sequence[str]]) –
kwargs (Any) –
Return type
str
async apredict_messages(messages, *, stop=None, **kwargs)¶
Predict message from messages.
Parameters
messages (List[langchain.schema.BaseMessage]) –
stop (Optional[Sequence[str]]) –
kwargs (Any) –
Return type
langchain.schema.... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-60 | Return type
Model
copy(*, include=None, exclude=None, update=None, deep=False)¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to include in new model
exclude (Optional[Union[AbstractSetIntStr, MappingI... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-61 | kwargs (Any) –
Return type
Dict
generate(prompts, stop=None, callbacks=None, *, tags=None, **kwargs)¶
Run the LLM on the given prompt and input.
Parameters
prompts (List[str]) –
stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCal... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-62 | kwargs (Any) –
Return type
langchain.schema.LLMResult
get_num_tokens(text)¶
Get the number of tokens present in the text.
Parameters
text (str) –
Return type
int
get_num_tokens_from_messages(messages)¶
Get the number of tokens in the message.
Parameters
messages (List[langchain.schema.BaseMessage]) –
Return type
int... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-63 | encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
Parameters
include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) –
by_alias (bool) –
skip_defaults (Optional[bool]) –
exclude_unset (bo... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-64 | Return type
str
predict_messages(messages, *, stop=None, **kwargs)¶
Predict message from messages.
Parameters
messages (List[langchain.schema.BaseMessage]) –
stop (Optional[Sequence[str]]) –
kwargs (Any) –
Return type
langchain.schema.BaseMessage
save(file_path)¶
Save the LLM.
Parameters
file_path (Union[pathlib.Pat... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-65 | serialized kwargs. These attributes must be accepted by the
constructor.
property lc_namespace: List[str]¶
Return the namespace of the langchain object.
eg. ["langchain", "llms", "openai"]
property lc_secrets: Dict[str, str]¶
Return a map of constructor argument names to secret ids.
eg. {"openai_api_key": "OPENAI_API_K... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-66 | Allow you to use an Aviary.
Aviary is a backend for hosted models. You can
find out more about aviary at
http://github.com/ray-project/aviary
To get a list of the models supported on an
aviary, follow the instructions on the web site to
install the aviary CLI and then use:
aviary models
AVIARY_URL and AVIARY_TOKEN envi... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-67 | verbose (bool) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
callback_manager (Optional[langchain.callbacks.base.BaseCallbackManager]) –
tags (Optional[List[str]]) –
model (str) –
aviary_url (Optional[str]) –
aviary_token (Optiona... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-68 | stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
kwargs (Any) –
Return type
str
async agenerate(prompts, stop=None, callbacks=None, *, tags=None, **kwargs)¶
Run the LLM on the given prompt and input.
Parameter... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-69 | Parameters
prompts (List[langchain.schema.PromptValue]) –
stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
kwargs (Any) –
Return type
langchain.schema.LLMResult
async apredict(text, *, stop=None, **kwargs)¶
P... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-70 | Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = "allow" was set since it adds all passed values
Parameters
_fields_set (Optional[SetStr]) –
values (Any) –
Return type
Model
copy... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-71 | the new model: you should trust this data
deep (bool) – set to True to make a deep copy of the model
self (Model) –
Returns
new model instance
Return type
Model
dict(**kwargs)¶
Return a dictionary of the LLM.
Parameters
kwargs (Any) –
Return type
Dict
generate(prompts, stop=None, callbacks=None, *, tags=None, **kwarg... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-72 | Take in a list of prompt values and return an LLMResult.
Parameters
prompts (List[langchain.schema.PromptValue]) –
stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
kwargs (Any) –
Return type
langchain.schema.... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-73 | Return type
List[int]
json(*, include=None, exclude=None, by_alias=False, skip_defaults=None, exclude_unset=False, exclude_defaults=False, exclude_none=False, encoder=None, models_as_dict=True, **dumps_kwargs)¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optio... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-74 | models_as_dict (bool) –
dumps_kwargs (Any) –
Return type
unicode
predict(text, *, stop=None, **kwargs)¶
Predict text from text.
Parameters
text (str) –
stop (Optional[Sequence[str]]) –
kwargs (Any) –
Return type
str
predict_messages(messages, *, stop=None, **kwargs)¶
Predict message from messages.
Parameters
messa... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-75 | classmethod update_forward_refs(**localns)¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
Parameters
localns (Any) –
Return type
None
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by th... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-76 | class langchain.llms.AzureMLOnlineEndpoint(*, cache=None, verbose=None, callbacks=None, callback_manager=None, tags=None, endpoint_url='', endpoint_api_key='', deployment_name='', http_client=None, content_formatter=None, model_kwargs=None)[source]¶
Bases: langchain.llms.base.LLM, pydantic.main.BaseModel
Wrapper around... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-77 | callback_manager (Optional[langchain.callbacks.base.BaseCallbackManager]) –
tags (Optional[List[str]]) –
endpoint_url (str) –
endpoint_api_key (str) –
deployment_name (str) –
http_client (Any) –
content_formatter (Any) –
model_kwargs (Optional[dict]) –
Return type
None
attribute content_formatter: Any = None¶
T... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-78 | attribute endpoint_url: str = ''¶
URL of pre-existing Endpoint. Should be passed to constructor or specified as
env var AZUREML_ENDPOINT_URL.
attribute model_kwargs: Optional[dict] = None¶
Keyword arguments to pass to the model.
attribute tags: Optional[List[str]] = None¶
Tags to add to the run trace.
attribute verbos... | https://api.python.langchain.com/en/latest/modules/llms.html |
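A content_formatter's job is to turn a prompt plus model_kwargs into the request body the AzureML endpoint expects, and to pull the generated text back out of the response bytes. A guessed minimal shape of that two-way transform; the class and method names below are illustrative assumptions, not the library's actual ContentFormatter API:

```python
import json

class EchoContentFormatter:
    """Illustrative formatter: JSON request body in, JSON response body out."""

    def format_request_payload(self, prompt: str, model_kwargs: dict) -> bytes:
        # Merge the prompt with any model parameters into one JSON body.
        return json.dumps({"inputs": [prompt], **model_kwargs}).encode("utf-8")

    def format_response_payload(self, output: bytes) -> str:
        # Pull the first generated string back out of the response body.
        return json.loads(output.decode("utf-8"))["outputs"][0]

fmt = EchoContentFormatter()
body = fmt.format_request_payload("hello", {"temperature": 0.2})
print(fmt.format_response_payload(b'{"outputs": ["hi there"]}'))
```

Keeping serialization in a formatter object lets one wrapper class talk to endpoints with differing request/response schemas.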
0e0436a883fe-79 | Return type
str
async agenerate(prompts, stop=None, callbacks=None, *, tags=None, **kwargs)¶
Run the LLM on the given prompt and input.
Parameters
prompts (List[str]) –
stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManag... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-80 | kwargs (Any) –
Return type
langchain.schema.LLMResult
async apredict(text, *, stop=None, **kwargs)¶
Predict text from text.
Parameters
text (str) –
stop (Optional[Sequence[str]]) –
kwargs (Any) –
Return type
str
async apredict_messages(messages, *, stop=None, **kwargs)¶
Predict message from messages.
Parameters
mes... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-81 | Parameters
_fields_set (Optional[SetStr]) –
values (Any) –
Return type
Model
copy(*, include=None, exclude=None, update=None, deep=False)¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to include in ... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-82 | Model
dict(**kwargs)¶
Return a dictionary of the LLM.
Parameters
kwargs (Any) –
Return type
Dict
generate(prompts, stop=None, callbacks=None, *, tags=None, **kwargs)¶
Run the LLM on the given prompt and input.
Parameters
prompts (List[str]) –
stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.call... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-83 | stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
kwargs (Any) –
Return type
langchain.schema.LLMResult
get_num_tokens(text)¶
Get the number of tokens present in the text.
Parameters
text (str) –
Return type
i... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-84 | Return type
List[int]
json(*, include=None, exclude=None, by_alias=False, skip_defaults=None, exclude_unset=False, exclude_defaults=False, exclude_none=False, encoder=None, models_as_dict=True, **dumps_kwargs)¶
Generate a JSON representation of the model, include and exclude arguments as per dict().
encoder is an optio... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-85 | models_as_dict (bool) –
dumps_kwargs (Any) –
Return type
unicode
predict(text, *, stop=None, **kwargs)¶
Predict text from text.
Parameters
text (str) –
stop (Optional[Sequence[str]]) –
kwargs (Any) –
Return type
str
predict_messages(messages, *, stop=None, **kwargs)¶
Predict message from messages.
Parameters
messa... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-86 | classmethod update_forward_refs(**localns)¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
Parameters
localns (Any) –
Return type
None
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs. These attributes must be accepted by th... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-87 | class langchain.llms.AzureOpenAI(*, cache=None, verbose=None, callbacks=None, callback_manager=None, tags=None, client=None, model='text-davinci-003', temperature=0.7, max_tokens=256, top_p=1, frequency_penalty=0, presence_penalty=0, n=1, best_of=1, model_kwargs=None, openai_api_key=None, openai_api_base=None, openai_o... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-88 | environment variable OPENAI_API_KEY set with your API key.
Any parameters that are valid to be passed to the openai.create call can be passed
in, even if not explicitly saved on this class.
Example
from langchain.llms import AzureOpenAI
openai = AzureOpenAI(model_name="text-davinci-003")
Parameters
cache (Optional[bool... | https://api.python.langchain.com/en/latest/modules/llms.html |
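The pass-through behaviour described above (any parameter valid for the openai.create call is accepted even when not declared as a field) usually boils down to merging declared field values with a catch-all model_kwargs dict at call time. A sketch of that merge with hypothetical names; build_invocation_params is not a real LangChain function:

```python
def build_invocation_params(declared: dict, model_kwargs: dict, **call_kwargs) -> dict:
    # Later sources win: per-call kwargs override model_kwargs,
    # which override the declared field defaults.
    return {**declared, **model_kwargs, **call_kwargs}

declared = {"model": "text-davinci-003", "temperature": 0.7, "max_tokens": 256}
params = build_invocation_params(declared, {"logprobs": 5}, temperature=0.0)
print(params)
```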
0e0436a883fe-89 | presence_penalty (float) β
n (int) β
best_of (int) β
model_kwargs (Dict[str, Any]) β
openai_api_key (Optional[str]) β
openai_api_base (Optional[str]) β
openai_organization (Optional[str]) β
openai_proxy (Optional[str]) β
batch_size (int) β
request_timeout (Optional[Union[float, Tuple[float, float]]]) β
logit_... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-90 | openai_api_type (str) β
openai_api_version (str) β
Return type
None
attribute allowed_special: Union[Literal['all'], AbstractSet[str]] = {}¶
Set of special tokens that are allowed。
attribute batch_size: int = 20¶
Batch size to use when passing multiple documents to generate.
attribute best_of: int = 1¶
Generates best... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-91 | Adjust the probability of specific tokens being generated.
attribute max_retries: int = 6¶
Maximum number of retries to make when generating.
attribute max_tokens: int = 256¶
The maximum number of tokens to generate in the completion.
-1 returns as many tokens as possible given the prompt and
the model's maximal context... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-92 | Timeout for requests to OpenAI completion API. Default is 600 seconds.
attribute streaming: bool = False¶
Whether to stream the results or not.
attribute tags: Optional[List[str]] = None¶
Tags to add to the run trace.
attribute temperature: float = 0.7¶
What sampling temperature to use.
attribute tiktoken_model_name: O... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-92 | API but with different models. In those cases, in order to avoid erroring
when tiktoken is called, you can specify a model name to use here.
attribute top_p: float = 1¶
Total probability mass of tokens to consider at each step.
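The top_p attribute controls nucleus sampling. As an illustrative sketch only (not the server-side implementation), keeping the smallest set of tokens whose cumulative probability mass reaches top_p can be written as:

```python
# Illustrative sketch of nucleus (top_p) sampling -- not the actual
# server-side implementation behind this attribute.
def nucleus_filter(probs, top_p):
    """Keep the smallest set of tokens whose cumulative mass >= top_p."""
    kept = {}
    cumulative = 0.0
    for token, p in sorted(probs.items(), key=lambda kv: kv[1], reverse=True):
        kept[token] = p
        cumulative += p
        if cumulative >= top_p:
            break
    # Renormalize the surviving mass so it sums to 1.
    total = sum(kept.values())
    return {t: p / total for t, p in kept.items()}

filtered = nucleus_filter({"a": 0.5, "b": 0.3, "c": 0.15, "d": 0.05}, top_p=0.8)
```

With top_p=0.8 only the two most likely tokens survive; top_p=1 keeps the full distribution.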
attribute verbose: bool [Optional]¶
Whether to print out response text.
__call__(prompt, sto... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-94 | Parameters
prompts (List[str]) –
stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
tags (Optional[List[str]]) –
kwargs (Any) –
Return type
langchain.schema.LLMResult
async agenerate_prompt(prompts, stop=None,... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-95 | Predict text from text.
Parameters
text (str) –
stop (Optional[Sequence[str]]) –
kwargs (Any) –
Return type
str
async apredict_messages(messages, *, stop=None, **kwargs)¶
Predict message from messages.
Parameters
messages (List[langchain.schema.BaseMessage]) –
stop (Optional[Sequence[str]]) –
kwargs (Any) –
Retur... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-96 | Return type
Model
copy(*, include=None, exclude=None, update=None, deep=False)¶
Duplicate a model, optionally choose which fields to include, exclude and change.
Parameters
include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to include in new model
exclude (Optional[Union[AbstractSetIntStr, MappingI... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-97 | Parameters
choices (Any) –
prompts (List[str]) –
token_usage (Dict[str, int]) –
Return type
langchain.schema.LLMResult
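The create_llm_result helper described above regroups the API's flat list of completion choices into one sub-list per prompt. A rough sketch of that regrouping, assuming n completions per prompt (the function name here is illustrative, not the private helper itself):

```python
# Illustrative sketch: regroup a flat list of completion choices into
# one sub-list per prompt, n choices per prompt, as create_llm_result does.
def group_choices(choices, prompts, n):
    grouped = []
    for i, _prompt in enumerate(prompts):
        grouped.append(choices[i * n : (i + 1) * n])
    return grouped

result = group_choices(["a1", "a2", "b1", "b2"], ["prompt A", "prompt B"], n=2)
```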
dict(**kwargs)¶
Return a dictionary of the LLM.
Parameters
kwargs (Any) –
Return type
Dict
generate(prompts, stop=None, callbacks=None, *, tags=None, **kwargs)¶
Run the LLM on the given prompt and ... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-98 | Take in a list of prompt values and return an LLMResult.
Parameters
prompts (List[langchain.schema.PromptValue]) –
stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
kwargs (Any) –
Return type
langchain.schema.... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-99 | params (Dict[str, Any]) –
prompts (List[str]) –
stop (Optional[List[str]]) –
Return type
List[List[str]]
get_token_ids(text)¶
Get the token IDs using the tiktoken package.
Parameters
text (str) –
Return type
List[int]
json(*, include=None, exclude=None, by_alias=False, skip_defaults=None, exclude_unset=False, exclu... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-100 | by_alias (bool) –
skip_defaults (Optional[bool]) –
exclude_unset (bool) –
exclude_defaults (bool) –
exclude_none (bool) –
encoder (Optional[Callable[[Any], Any]]) –
models_as_dict (bool) –
dumps_kwargs (Any) –
Return type
unicode
max_tokens_for_prompt(prompt)¶
Calculate the maximum number of tokens possible to ... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-101 | Parameters
modelname (str) – The modelname we want to know the context size for.
Returns
The maximum context size
Return type
int
Example
max_tokens = openai.modelname_to_contextsize("text-davinci-003")
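max_tokens_for_prompt is essentially the model's context window minus the prompt's token count. A sketch of that arithmetic, using a hypothetical whitespace token counter in place of the tiktoken-based get_num_tokens, and the 4097-token window of text-davinci-003:

```python
# Sketch of the max_tokens_for_prompt arithmetic. count_tokens is a
# hypothetical stand-in for the tiktoken-based token counter; the real
# method delegates to modelname_to_contextsize for the window size.
CONTEXT_SIZES = {"text-davinci-003": 4097}

def count_tokens(text):
    # Placeholder tokenizer: real code would use tiktoken.
    return len(text.split())

def max_tokens_for_prompt(prompt, modelname="text-davinci-003"):
    return CONTEXT_SIZES[modelname] - count_tokens(prompt)

remaining = max_tokens_for_prompt("Tell me a joke")
```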
predict(text, *, stop=None, **kwargs)¶
Predict text from text.
Parameters
text (str) –
stop (Optional[Sequence[str]]... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-102 | Return type
Dict[str, Any]
save(file_path)¶
Save the LLM.
Parameters
file_path (Union[pathlib.Path, str]) – Path to file to save the LLM to.
Return type
None
Example:
.. code-block:: python
llm.save(file_path="path/llm.yaml")
stream(prompt, stop=None)¶
Call OpenAI with streaming flag and return the resulting generator.... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-103 | for token in generator:
yield token
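The generator returned by stream yields one token at a time; consuming it and reassembling the full completion can be sketched as follows, where fake_stream is a stand-in for the real API stream:

```python
# Sketch of consuming a streaming token generator; fake_stream stands
# in for the generator returned by stream(prompt).
def fake_stream():
    for token in ["Hello", ",", " world", "!"]:
        yield token

pieces = []
for token in fake_stream():
    pieces.append(token)   # e.g. print(token, end="") for live output
full_text = "".join(pieces)
```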
classmethod update_forward_refs(**localns)¶
Try to update ForwardRefs on fields based on this Model, globalns and localns.
Parameters
localns (Any) –
Return type
None
property lc_attributes: Dict¶
Return a list of attribute names that should be included in the
serialized kwargs.... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-104 | property max_context_size: int¶
Get max context size for this model.
class langchain.llms.Banana(*, cache=None, verbose=None, callbacks=None, callback_manager=None, tags=None, model_key='', model_kwargs=None, banana_api_key=None)[source]¶
Bases: langchain.llms.base.LLM
Wrapper around Banana large language models.
To us... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-105 | callback_manager (Optional[langchain.callbacks.base.BaseCallbackManager]) –
tags (Optional[List[str]]) –
model_key (str) –
model_kwargs (Dict[str, Any]) –
banana_api_key (Optional[str]) –
Return type
None
attribute model_key: str = ''¶
model endpoint to use
attribute model_kwargs: Dict[str, Any] [Optional]¶
Holds ... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-106 | stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
kwargs (Any) –
Return type
str
async agenerate(prompts, stop=None, callbacks=None, *, tags=None, **kwargs)¶
Run the LLM on the given prompt and input.
Parameter... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-107 | Parameters
prompts (List[langchain.schema.PromptValue]) –
stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
kwargs (Any) –
Return type
langchain.schema.LLMResult
async apredict(text, *, stop=None, **kwargs)¶
P... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-108 | Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
Default values are respected, but no other validation is performed.
Behaves as if Config.extra = 'allow' was set since it adds all passed values
Parameters
_fields_set (Optional[SetStr]) –
values (Any) –
Return type
Model
copy... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-109 | the new model: you should trust this data
deep (bool) β set to True to make a deep copy of the model
self (Model) –
Returns
new model instance
Return type
Model
dict(**kwargs)¶
Return a dictionary of the LLM.
Parameters
kwargs (Any) –
Return type
Dict
generate(prompts, stop=None, callbacks=None, *, tags=None, **kwarg... | https://api.python.langchain.com/en/latest/modules/llms.html |
0e0436a883fe-110 | Take in a list of prompt values and return an LLMResult.
Parameters
prompts (List[langchain.schema.PromptValue]) –
stop (Optional[List[str]]) –
callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) –
kwargs (Any) –
Return type
langchain.schema.... | https://api.python.langchain.com/en/latest/modules/llms.html |