Take in a list of prompt values and return an LLMResult. Parameters prompts (List[langchain.schema.PromptValue]) – stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – kwargs (Any) – Return type langchain.schema....
https://api.python.langchain.com/en/latest/modules/llms.html
Return type List[int] json(*, include=None, exclude=None, by_alias=False, skip_defaults=None, exclude_unset=False, exclude_defaults=False, exclude_none=False, encoder=None, models_as_dict=True, **dumps_kwargs) Generate a JSON representation of the model, include and exclude arguments as per dict(). encoder is an optio...
models_as_dict (bool) – dumps_kwargs (Any) – Return type unicode predict(text, *, stop=None, **kwargs) Predict text from text. Parameters text (str) – stop (Optional[Sequence[str]]) – kwargs (Any) – Return type str predict_messages(messages, *, stop=None, **kwargs) Predict message from messages. Parameters messa...
classmethod update_forward_refs(**localns) Try to update ForwardRefs on fields based on this Model, globalns and localns. Parameters localns (Any) – Return type None property lc_attributes: Dict Return a list of attribute names that should be included in the serialized kwargs. These attributes must be accepted by th...
class langchain.llms.AlephAlpha(*, cache=None, verbose=None, callbacks=None, callback_manager=None, tags=None, client=None, model='luminous-base', maximum_tokens=64, temperature=0.0, top_k=0, top_p=0.0, presence_penalty=0.0, frequency_penalty=0.0, repetition_penalties_include_prompt=False, use_multiplicative_presence_p...
completion_bias_exclusion_first_token_only=False, contextual_control_threshold=None, control_log_additive=True, repetition_penalties_include_completion=True, raw_completion=False, aleph_alpha_api_key=None, stop_sequences=None)[source]
Bases: langchain.llms.base.LLM Wrapper around Aleph Alpha large language models. To use, you should have the aleph_alpha_client python package installed, and the environment variable ALEPH_ALPHA_API_KEY set with your API key, or pass it as a named parameter to the constructor. Parameters are explained more in depth her...
verbose (bool) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – callback_manager (Optional[langchain.callbacks.base.BaseCallbackManager]) – tags (Optional[List[str]]) – client (Any) – model (Optional[str]) – maximum_tokens (int) – t...
n (int) – logit_bias (Optional[Dict[int, float]]) – log_probs (Optional[int]) – tokens (Optional[bool]) – disable_optimizations (Optional[bool]) – minimum_tokens (Optional[int]) – echo (bool) – use_multiplicative_frequency_penalty (bool) – sequence_penalty (float) – sequence_penalty_min_length (int) – use_mul...
raw_completion (bool) – aleph_alpha_api_key (Optional[str]) – stop_sequences (Optional[List[str]]) – Return type None attribute aleph_alpha_api_key: Optional[str] = None API key for Aleph Alpha API. attribute best_of: Optional[int] = None returns the one with the "best of" results (highest log probability per toke...
True: apply control by adding the log(control_factor) to attention scores. False: (attention_scores - attention_scores.min(-1)) * control_factor attribute echo: bool = False Echo the prompt in the completion. attribute frequency_penalty: float = 0.0 Penalizes repeated tokens according to frequency. attribute log_pr...
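The two `control_log_additive` modes above can be sketched in plain Python. This is an illustrative stand-in only: the actual computation happens server-side in the Aleph Alpha API, and `apply_control` is a hypothetical helper name, not part of any library.

```python
import math

def apply_control(attention_scores, control_factor, log_additive):
    """Illustrative sketch of the two control_log_additive modes described above."""
    if log_additive:
        # True: add log(control_factor) to each attention score
        return [s + math.log(control_factor) for s in attention_scores]
    # False: shift scores so the minimum is zero, then scale multiplicatively
    lowest = min(attention_scores)
    return [(s - lowest) * control_factor for s in attention_scores]
```

With `control_factor = 1.0` the additive mode is a no-op (log(1) = 0), while the multiplicative mode still re-bases the scores to start at zero.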
Model name to use. attribute n: int = 1 How many completions to generate for each prompt. attribute penalty_bias: Optional[str] = None Penalty bias for the completion. attribute penalty_exceptions: Optional[List[str]] = None List of strings that may be generated without penalty, regardless of other penalty settings ...
Flag deciding whether presence penalty or frequency penalty are updated from the prompt. attribute stop_sequences: Optional[List[str]] = None Stop sequences to use. attribute tags: Optional[List[str]] = None Tags to add to the run trace. attribute temperature: float = 0.0 A non-negative float that tunes the degree o...
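Conceptually, `stop_sequences` cuts the completion at the first occurrence of any stop string. A minimal sketch of that behavior (illustrative only; `truncate_at_stop` is a hypothetical name, not a LangChain function):

```python
def truncate_at_stop(text, stop_sequences=None):
    """Cut text at the earliest occurrence of any stop sequence."""
    if not stop_sequences:
        return text
    cut = len(text)  # default: keep everything
    for seq in stop_sequences:
        idx = text.find(seq)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut]
```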
attribute verbose: bool [Optional] Whether to print out response text. __call__(prompt, stop=None, callbacks=None, **kwargs) Check Cache and run the LLM on the given prompt and input. Parameters prompt (str) – stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler],...
tags (Optional[List[str]]) – kwargs (Any) – Return type langchain.schema.LLMResult async agenerate_prompt(prompts, stop=None, callbacks=None, **kwargs) Take in a list of prompt values and return an LLMResult. Parameters prompts (List[langchain.schema.PromptValue]) – stop (Optional[List[str]]) – callbacks (Optional...
Predict message from messages. Parameters messages (List[langchain.schema.BaseMessage]) – stop (Optional[Sequence[str]]) – kwargs (Any) – Return type langchain.schema.BaseMessage classmethod construct(_fields_set=None, **values) Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated d...
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to exclude from new model, as with values this takes precedence over include update (Optional[DictStrAny]) – values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data deep (bool...
stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – tags (Optional[List[str]]) – kwargs (Any) – Return type langchain.schema.LLMResult generate_prompt(prompts, stop=None, callbacks=None, **kwargs) Take in a lis...
Return type int get_num_tokens_from_messages(messages) Get the number of tokens in the messages. Parameters messages (List[langchain.schema.BaseMessage]) – Return type int get_token_ids(text) Get the tokens present in the text. Parameters text (str) – Return type List[int] json(*, include=None, exclude=None, by_alias...
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – by_alias (bool) – skip_defaults (Optional[bool]) – exclude_unset (bool) – exclude_defaults (bool) – exclude_none (bool) – encoder (Optional[Callable[[Any], Any]]) – models_as_dict (bool) – dumps_kwargs (Any) – Return type unicode predict(text, *,...
kwargs (Any) – Return type langchain.schema.BaseMessage save(file_path) Save the LLM. Parameters file_path (Union[pathlib.Path, str]) – Path to file to save the LLM to. Return type None Example: .. code-block:: python llm.save(file_path="path/llm.yaml") classmethod update_forward_refs(**localns) Try to update Forwar...
property lc_secrets: Dict[str, str] Return a map of constructor argument names to secret ids. eg. {"openai_api_key": "OPENAI_API_KEY"} property lc_serializable: bool Return whether or not the class is serializable. class langchain.llms.AmazonAPIGateway(*, cache=None, verbose=None, callbacks=None, callback_manager=Non...
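The lc_secrets map lets a deserializer recover secret values from the environment instead of persisting them. A sketch of that lookup, assuming the {"openai_api_key": "OPENAI_API_KEY"}-style mapping shown above (`resolve_secrets` is a hypothetical helper, not part of LangChain's public API):

```python
import os

def resolve_secrets(lc_secrets):
    """Map constructor argument names to values read from the named env vars."""
    resolved = {}
    for arg_name, env_var in lc_secrets.items():
        value = os.environ.get(env_var)
        if value is not None:
            resolved[arg_name] = value  # only include secrets that are actually set
    return resolved
```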
tags (Optional[List[str]]) – api_url (str) – model_kwargs (Optional[Dict]) – content_handler (langchain.llms.amazon_api_gateway.ContentHandlerAmazonAPIGateway) – Return type None attribute api_url: str [Required] API Gateway URL attribute content_handler: langchain.llms.amazon_api_gateway.ContentHandlerAmazonAPIGa...
Whether to print out response text. __call__(prompt, stop=None, callbacks=None, **kwargs) Check Cache and run the LLM on the given prompt and input. Parameters prompt (str) – stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallba...
kwargs (Any) – Return type langchain.schema.LLMResult async agenerate_prompt(prompts, stop=None, callbacks=None, **kwargs) Take in a list of prompt values and return an LLMResult. Parameters prompts (List[langchain.schema.PromptValue]) – stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callback...
Predict message from messages. Parameters messages (List[langchain.schema.BaseMessage]) – stop (Optional[Sequence[str]]) – kwargs (Any) – Return type langchain.schema.BaseMessage classmethod construct(_fields_set=None, **values) Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated d...
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to exclude from new model, as with values this takes precedence over include update (Optional[DictStrAny]) – values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data deep (bool...
stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – tags (Optional[List[str]]) – kwargs (Any) – Return type langchain.schema.LLMResult generate_prompt(prompts, stop=None, callbacks=None, **kwargs) Take in a lis...
Return type int get_num_tokens_from_messages(messages) Get the number of tokens in the messages. Parameters messages (List[langchain.schema.BaseMessage]) – Return type int get_token_ids(text) Get the tokens present in the text. Parameters text (str) – Return type List[int] json(*, include=None, exclude=None, by_alias...
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – by_alias (bool) – skip_defaults (Optional[bool]) – exclude_unset (bool) – exclude_defaults (bool) – exclude_none (bool) – encoder (Optional[Callable[[Any], Any]]) – models_as_dict (bool) – dumps_kwargs (Any) – Return type unicode predict(text, *,...
kwargs (Any) – Return type langchain.schema.BaseMessage save(file_path) Save the LLM. Parameters file_path (Union[pathlib.Path, str]) – Path to file to save the LLM to. Return type None Example: .. code-block:: python llm.save(file_path="path/llm.yaml") classmethod update_forward_refs(**localns) Try to update Forwar...
property lc_secrets: Dict[str, str] Return a map of constructor argument names to secret ids. eg. {"openai_api_key": "OPENAI_API_KEY"} property lc_serializable: bool Return whether or not the class is serializable. class langchain.llms.Anthropic(*, client=None, model='claude-v1', max_tokens_to_sample=256, temperature...
Wrapper around Anthropic's large language models. To use, you should have the anthropic python package installed, and the environment variable ANTHROPIC_API_KEY set with your API key, or pass it as a named parameter to the constructor. Example import anthropic from langchain.llms import Anthropic model = Anthropic(mode...
response = model(prompt) Parameters client (Any) – model (str) – max_tokens_to_sample (int) – temperature (Optional[float]) – top_k (Optional[int]) – top_p (Optional[float]) – streaming (bool) – default_request_timeout (Optional[Union[float, Tuple[float, float]]]) – anthropic_api_url (Optional[str]) – anthropi...
tags (Optional[List[str]]) – Return type None attribute default_request_timeout: Optional[Union[float, Tuple[float, float]]] = None Timeout for requests to Anthropic Completion API. Default is 600 seconds. attribute max_tokens_to_sample: int = 256 Denotes the number of tokens to predict per generation. attribute mod...
Total probability mass of tokens to consider at each step. attribute verbose: bool [Optional] Whether to print out response text. __call__(prompt, stop=None, callbacks=None, **kwargs) Check Cache and run the LLM on the given prompt and input. Parameters prompt (str) – stop (Optional[List[str]]) – callbacks (Optiona...
tags (Optional[List[str]]) – kwargs (Any) – Return type langchain.schema.LLMResult async agenerate_prompt(prompts, stop=None, callbacks=None, **kwargs) Take in a list of prompt values and return an LLMResult. Parameters prompts (List[langchain.schema.PromptValue]) – stop (Optional[List[str]]) – callbacks (Optional...
Predict message from messages. Parameters messages (List[langchain.schema.BaseMessage]) – stop (Optional[Sequence[str]]) – kwargs (Any) – Return type langchain.schema.BaseMessage classmethod construct(_fields_set=None, **values) Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated d...
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to exclude from new model, as with values this takes precedence over include update (Optional[DictStrAny]) – values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data deep (bool...
stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – tags (Optional[List[str]]) – kwargs (Any) – Return type langchain.schema.LLMResult generate_prompt(prompts, stop=None, callbacks=None, **kwargs) Take in a lis...
Return type int get_num_tokens_from_messages(messages) Get the number of tokens in the messages. Parameters messages (List[langchain.schema.BaseMessage]) – Return type int get_token_ids(text) Get the tokens present in the text. Parameters text (str) – Return type List[int] json(*, include=None, exclude=None, by_alias...
exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – by_alias (bool) – skip_defaults (Optional[bool]) – exclude_unset (bool) – exclude_defaults (bool) – exclude_none (bool) – encoder (Optional[Callable[[Any], Any]]) – models_as_dict (bool) – dumps_kwargs (Any) – Return type unicode predict(text, *,...
kwargs (Any) – Return type langchain.schema.BaseMessage save(file_path) Save the LLM. Parameters file_path (Union[pathlib.Path, str]) – Path to file to save the LLM to. Return type None Example: .. code-block:: python llm.save(file_path="path/llm.yaml") stream(prompt, stop=None)[source] Call Anthropic completion_str...
prompt = f"\n\nHuman: {prompt}\n\nAssistant:"
generator = anthropic.stream(prompt)
for token in generator:
    yield token

classmethod update_forward_refs(**localns) Try to update ForwardRefs on fields based on this Model, globalns and localns. Parameters localns (Any) – Return type None property lc_attributes: Dict...
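The prompt framing used in the stream example above can be isolated into a tiny helper. A sketch, assuming only the "\n\nHuman: ... \n\nAssistant:" shape shown in the example (`format_anthropic_prompt` is a hypothetical name):

```python
def format_anthropic_prompt(user_text):
    """Wrap raw user text in the Human/Assistant frame used by the completion API."""
    return f"\n\nHuman: {user_text}\n\nAssistant:"
```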
property lc_serializable: bool Return whether or not the class is serializable. class langchain.llms.Anyscale(*, cache=None, verbose=None, callbacks=None, callback_manager=None, tags=None, model_kwargs=None, anyscale_service_url=None, anyscale_service_route=None, anyscale_service_token=None)[source] Bases: langchain....
prompt_list = []

@ray.remote
def send_query(llm, prompt):
    resp = llm(prompt)
    return resp

futures = [send_query.remote(anyscale, prompt) for prompt in prompt_list]
results = ray.get(futures)

Parameters cache (Optional[bool]) – verbose (bool) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackH...
attribute tags: Optional[List[str]] = None Tags to add to the run trace. attribute verbose: bool [Optional] Whether to print out response text. __call__(prompt, stop=None, callbacks=None, **kwargs) Check Cache and run the LLM on the given prompt and input. Parameters prompt (str) – stop (Optional[List[str]]) – cal...
stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – tags (Optional[List[str]]) – kwargs (Any) – Return type langchain.schema.LLMResult async agenerate_prompt(prompts, stop=None, callbacks=None, **kwargs) Take i...
Parameters text (str) – stop (Optional[Sequence[str]]) – kwargs (Any) – Return type str async apredict_messages(messages, *, stop=None, **kwargs) Predict message from messages. Parameters messages (List[langchain.schema.BaseMessage]) – stop (Optional[Sequence[str]]) – kwargs (Any) – Return type langchain.schema....
Return type Model copy(*, include=None, exclude=None, update=None, deep=False) Duplicate a model, optionally choose which fields to include, exclude and change. Parameters include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to include in new model exclude (Optional[Union[AbstractSetIntStr, MappingI...
kwargs (Any) – Return type Dict generate(prompts, stop=None, callbacks=None, *, tags=None, **kwargs) Run the LLM on the given prompt and input. Parameters prompts (List[str]) – stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCal...
kwargs (Any) – Return type langchain.schema.LLMResult get_num_tokens(text) Get the number of tokens present in the text. Parameters text (str) – Return type int get_num_tokens_from_messages(messages) Get the number of tokens in the messages. Parameters messages (List[langchain.schema.BaseMessage]) – Return type int...
encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps(). Parameters include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – exclude (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – by_alias (bool) – skip_defaults (Optional[bool]) – exclude_unset (bo...
Return type str predict_messages(messages, *, stop=None, **kwargs) Predict message from messages. Parameters messages (List[langchain.schema.BaseMessage]) – stop (Optional[Sequence[str]]) – kwargs (Any) – Return type langchain.schema.BaseMessage save(file_path) Save the LLM. Parameters file_path (Union[pathlib.Pat...
serialized kwargs. These attributes must be accepted by the constructor. property lc_namespace: List[str] Return the namespace of the langchain object. eg. ["langchain", "llms", "openai"] property lc_secrets: Dict[str, str] Return a map of constructor argument names to secret ids. eg. {"openai_api_key": "OPENAI_API_K...
Allows you to use an Aviary. Aviary is a backend for hosted models. You can find out more about Aviary at http://github.com/ray-project/aviary To get a list of the models supported on an aviary, follow the instructions on the website to install the aviary CLI and then use: aviary models AVIARY_URL and AVIARY_TOKEN envi...
verbose (bool) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – callback_manager (Optional[langchain.callbacks.base.BaseCallbackManager]) – tags (Optional[List[str]]) – model (str) – aviary_url (Optional[str]) – aviary_token (Optiona...
stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – kwargs (Any) – Return type str async agenerate(prompts, stop=None, callbacks=None, *, tags=None, **kwargs) Run the LLM on the given prompt and input. Parameter...
Parameters prompts (List[langchain.schema.PromptValue]) – stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – kwargs (Any) – Return type langchain.schema.LLMResult async apredict(text, *, stop=None, **kwargs) P...
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set since it adds all passed values Parameters _fields_set (Optional[SetStr]) – values (Any) – Return type Model copy...
the new model: you should trust this data deep (bool) – set to True to make a deep copy of the model self (Model) – Returns new model instance Return type Model dict(**kwargs) Return a dictionary of the LLM. Parameters kwargs (Any) – Return type Dict generate(prompts, stop=None, callbacks=None, *, tags=None, **kwarg...
Take in a list of prompt values and return an LLMResult. Parameters prompts (List[langchain.schema.PromptValue]) – stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – kwargs (Any) – Return type langchain.schema....
Return type List[int] json(*, include=None, exclude=None, by_alias=False, skip_defaults=None, exclude_unset=False, exclude_defaults=False, exclude_none=False, encoder=None, models_as_dict=True, **dumps_kwargs) Generate a JSON representation of the model, include and exclude arguments as per dict(). encoder is an optio...
models_as_dict (bool) – dumps_kwargs (Any) – Return type unicode predict(text, *, stop=None, **kwargs) Predict text from text. Parameters text (str) – stop (Optional[Sequence[str]]) – kwargs (Any) – Return type str predict_messages(messages, *, stop=None, **kwargs) Predict message from messages. Parameters messa...
classmethod update_forward_refs(**localns) Try to update ForwardRefs on fields based on this Model, globalns and localns. Parameters localns (Any) – Return type None property lc_attributes: Dict Return a list of attribute names that should be included in the serialized kwargs. These attributes must be accepted by th...
class langchain.llms.AzureMLOnlineEndpoint(*, cache=None, verbose=None, callbacks=None, callback_manager=None, tags=None, endpoint_url='', endpoint_api_key='', deployment_name='', http_client=None, content_formatter=None, model_kwargs=None)[source] Bases: langchain.llms.base.LLM, pydantic.main.BaseModel Wrapper around...
callback_manager (Optional[langchain.callbacks.base.BaseCallbackManager]) – tags (Optional[List[str]]) – endpoint_url (str) – endpoint_api_key (str) – deployment_name (str) – http_client (Any) – content_formatter (Any) – model_kwargs (Optional[dict]) – Return type None attribute content_formatter: Any = None T...
attribute endpoint_url: str = '' URL of pre-existing Endpoint. Should be passed to constructor or specified as env var AZUREML_ENDPOINT_URL. attribute model_kwargs: Optional[dict] = None Keyword arguments to pass to the model. attribute tags: Optional[List[str]] = None Tags to add to the run trace. attribute verbos...
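The "pass to the constructor or read from an env var" pattern used by endpoint_url / AZUREML_ENDPOINT_URL can be sketched as below. This is an illustrative stand-in, not LangChain's actual implementation, and `get_from_param_or_env` is used here as a hypothetical helper name:

```python
import os

def get_from_param_or_env(value, env_var):
    """Return the explicit value if given, else fall back to the named env var."""
    if value:
        return value
    env_value = os.environ.get(env_var, "")
    if not env_value:
        raise ValueError(f"Pass a value or set the {env_var} environment variable.")
    return env_value
```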
Return type str async agenerate(prompts, stop=None, callbacks=None, *, tags=None, **kwargs) Run the LLM on the given prompt and input. Parameters prompts (List[str]) – stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManag...
kwargs (Any) – Return type langchain.schema.LLMResult async apredict(text, *, stop=None, **kwargs) Predict text from text. Parameters text (str) – stop (Optional[Sequence[str]]) – kwargs (Any) – Return type str async apredict_messages(messages, *, stop=None, **kwargs) Predict message from messages. Parameters mes...
Parameters _fields_set (Optional[SetStr]) – values (Any) – Return type Model copy(*, include=None, exclude=None, update=None, deep=False) Duplicate a model, optionally choose which fields to include, exclude and change. Parameters include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to include in ...
Model dict(**kwargs) Return a dictionary of the LLM. Parameters kwargs (Any) – Return type Dict generate(prompts, stop=None, callbacks=None, *, tags=None, **kwargs) Run the LLM on the given prompt and input. Parameters prompts (List[str]) – stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.call...
stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – kwargs (Any) – Return type langchain.schema.LLMResult get_num_tokens(text) Get the number of tokens present in the text. Parameters text (str) – Return type i...
Return type List[int] json(*, include=None, exclude=None, by_alias=False, skip_defaults=None, exclude_unset=False, exclude_defaults=False, exclude_none=False, encoder=None, models_as_dict=True, **dumps_kwargs) Generate a JSON representation of the model, include and exclude arguments as per dict(). encoder is an optio...
models_as_dict (bool) – dumps_kwargs (Any) – Return type unicode predict(text, *, stop=None, **kwargs) Predict text from text. Parameters text (str) – stop (Optional[Sequence[str]]) – kwargs (Any) – Return type str predict_messages(messages, *, stop=None, **kwargs) Predict message from messages. Parameters messa...
classmethod update_forward_refs(**localns) Try to update ForwardRefs on fields based on this Model, globalns and localns. Parameters localns (Any) – Return type None property lc_attributes: Dict Return a list of attribute names that should be included in the serialized kwargs. These attributes must be accepted by th...
class langchain.llms.AzureOpenAI(*, cache=None, verbose=None, callbacks=None, callback_manager=None, tags=None, client=None, model='text-davinci-003', temperature=0.7, max_tokens=256, top_p=1, frequency_penalty=0, presence_penalty=0, n=1, best_of=1, model_kwargs=None, openai_api_key=None, openai_api_base=None, openai_o...
environment variable OPENAI_API_KEY set with your API key. Any parameters that are valid to be passed to the openai.create call can be passed in, even if not explicitly saved on this class. Example

from langchain.llms import AzureOpenAI
openai = AzureOpenAI(model_name="text-davinci-003")

Parameters cache (Optional[bool...
presence_penalty (float) – n (int) – best_of (int) – model_kwargs (Dict[str, Any]) – openai_api_key (Optional[str]) – openai_api_base (Optional[str]) – openai_organization (Optional[str]) – openai_proxy (Optional[str]) – batch_size (int) – request_timeout (Optional[Union[float, Tuple[float, float]]]) – logit_...
openai_api_type (str) – openai_api_version (str) – Return type None attribute allowed_special: Union[Literal['all'], AbstractSet[str]] = {} Set of special tokens that are allowed. attribute batch_size: int = 20 Batch size to use when passing multiple documents to generate. attribute best_of: int = 1 Generates best...
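The batch_size attribute above implies the wrapper splits a prompt list into groups of at most 20 before issuing completion calls. A sketch of that split (illustrative only; `batch_prompts` is a hypothetical name):

```python
def batch_prompts(prompts, batch_size=20):
    """Split prompts into consecutive batches of at most batch_size."""
    return [prompts[i:i + batch_size] for i in range(0, len(prompts), batch_size)]
```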
0e0436a883fe-91
Adjust the probability of specific tokens being generated. attribute max_retries: int = 6 Maximum number of retries to make when generating. attribute max_tokens: int = 256 The maximum number of tokens to generate in the completion. -1 returns as many tokens as possible given the prompt and the models maximal context...
Timeout for requests to OpenAI completion API. Default is 600 seconds. attribute streaming: bool = False Whether to stream the results or not. attribute tags: Optional[List[str]] = None Tags to add to the run trace. attribute temperature: float = 0.7 What sampling temperature to use. attribute tiktoken_model_name: O...
API but with different models. In those cases, in order to avoid erroring when tiktoken is called, you can specify a model name to use here. attribute top_p: float = 1 Total probability mass of tokens to consider at each step. attribute verbose: bool [Optional] Whether to print out response text. __call__(prompt, sto...
Parameters prompts (List[str]) – stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – tags (Optional[List[str]]) – kwargs (Any) – Return type langchain.schema.LLMResult async agenerate_prompt(prompts, stop=None,...
Predict text from text. Parameters text (str) – stop (Optional[Sequence[str]]) – kwargs (Any) – Return type str async apredict_messages(messages, *, stop=None, **kwargs) Predict message from messages. Parameters messages (List[langchain.schema.BaseMessage]) – stop (Optional[Sequence[str]]) – kwargs (Any) – Retur...
Return type Model copy(*, include=None, exclude=None, update=None, deep=False) Duplicate a model, optionally choose which fields to include, exclude and change. Parameters include (Optional[Union[AbstractSetIntStr, MappingIntStrAny]]) – fields to include in new model exclude (Optional[Union[AbstractSetIntStr, MappingI...
Parameters choices (Any) – prompts (List[str]) – token_usage (Dict[str, int]) – Return type langchain.schema.LLMResult dict(**kwargs) Return a dictionary of the LLM. Parameters kwargs (Any) – Return type Dict generate(prompts, stop=None, callbacks=None, *, tags=None, **kwargs) Run the LLM on the given prompt and ...
Take in a list of prompt values and return an LLMResult. Parameters prompts (List[langchain.schema.PromptValue]) – stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – kwargs (Any) – Return type langchain.schema....
params (Dict[str, Any]) – prompts (List[str]) – stop (Optional[List[str]]) – Return type List[List[str]] get_token_ids(text) Get the token IDs using the tiktoken package. Parameters text (str) – Return type List[int] json(*, include=None, exclude=None, by_alias=False, skip_defaults=None, exclude_unset=False, exclu...
by_alias (bool) – skip_defaults (Optional[bool]) – exclude_unset (bool) – exclude_defaults (bool) – exclude_none (bool) – encoder (Optional[Callable[[Any], Any]]) – models_as_dict (bool) – dumps_kwargs (Any) – Return type unicode max_tokens_for_prompt(prompt) Calculate the maximum number of tokens possible to ...
Parameters modelname (str) – The modelname we want to know the context size for. Returns The maximum context size Return type int Example max_tokens = openai.modelname_to_contextsize("text-davinci-003") predict(text, *, stop=None, **kwargs) Predict text from text. Parameters text (str) – stop (Optional[Sequence[str]]...
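The context-size helpers above (`modelname_to_contextsize` and `max_tokens_for_prompt`) can be sketched in plain Python. The table and tokenizer below are hypothetical stand-ins, not the library's real lookup: the actual context sizes come from the OpenAI model specs, and real token counting would use tiktoken's `get_token_ids`.

```python
# Sketch of the documented helpers, assuming a hypothetical local
# context-window table and a whitespace "tokenizer" as stand-ins.

# Hypothetical table; the real values live inside the library.
_CONTEXT_SIZES = {
    "text-davinci-003": 4097,
    "text-curie-001": 2049,
}

def modelname_to_contextsize(modelname: str) -> int:
    """Return the maximum context size for a model name."""
    try:
        return _CONTEXT_SIZES[modelname]
    except KeyError:
        raise ValueError(f"Unknown model: {modelname!r}")

def count_tokens(text: str) -> int:
    """Stub tokenizer: real code would use tiktoken via get_token_ids."""
    return len(text.split())

def max_tokens_for_prompt(modelname: str, prompt: str) -> int:
    """Context window minus the tokens already consumed by the prompt."""
    return modelname_to_contextsize(modelname) - count_tokens(prompt)

print(max_tokens_for_prompt("text-davinci-003", "Tell me a joke"))  # -> 4093
```

This mirrors the documented usage `max_tokens = openai.modelname_to_contextsize("text-davinci-003")`, where the result bounds how large a completion can be requested for a given prompt.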
Return type Dict[str, Any] save(file_path) Save the LLM. Parameters file_path (Union[pathlib.Path, str]) – Path to file to save the LLM to. Return type None Example: .. code-block:: python llm.save(file_path="path/llm.yaml") stream(prompt, stop=None) Call OpenAI with streaming flag and return the resulting generator....
for token in generator: yield token classmethod update_forward_refs(**localns) Try to update ForwardRefs on fields based on this Model, globalns and localns. Parameters localns (Any) – Return type None property lc_attributes: Dict Return a list of attribute names that should be included in the serialized kwargs....
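The streaming idiom documented above ("for token in generator: yield token") can be sketched without a live API call. `fake_completion_stream` below is a hypothetical stand-in for the chunked responses the OpenAI API emits when the streaming flag is set; real code would iterate over the API response instead.

```python
from typing import Iterator

def fake_completion_stream() -> Iterator[str]:
    """Hypothetical stand-in for OpenAI's streamed completion chunks."""
    for token in ["Hello", ",", " world", "!"]:
        yield token

def stream(prompt: str) -> Iterator[str]:
    """The documented pattern: forward each token as it arrives."""
    # Real code: the generator returned by the API with stream=True.
    generator = fake_completion_stream()
    for token in generator:
        yield token

result = "".join(stream("Say hello"))
print(result)  # -> Hello, world!
```

Because `stream` is itself a generator, callers can print or process tokens incrementally rather than waiting for the full completion.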
property max_context_size: int Get max context size for this model. class langchain.llms.Banana(*, cache=None, verbose=None, callbacks=None, callback_manager=None, tags=None, model_key='', model_kwargs=None, banana_api_key=None)[source] Bases: langchain.llms.base.LLM Wrapper around Banana large language models. To us...
callback_manager (Optional[langchain.callbacks.base.BaseCallbackManager]) – tags (Optional[List[str]]) – model_key (str) – model_kwargs (Dict[str, Any]) – banana_api_key (Optional[str]) – Return type None attribute model_key: str = '' model endpoint to use attribute model_kwargs: Dict[str, Any] [Optional] Holds ...
stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – kwargs (Any) – Return type str async agenerate(prompts, stop=None, callbacks=None, *, tags=None, **kwargs) Run the LLM on the given prompt and input. Parameter...
Parameters prompts (List[langchain.schema.PromptValue]) – stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – kwargs (Any) – Return type langchain.schema.LLMResult async apredict(text, *, stop=None, **kwargs) P...
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set since it adds all passed values Parameters _fields_set (Optional[SetStr]) – values (Any) – Return type Model copy...
the new model: you should trust this data deep (bool) – set to True to make a deep copy of the model self (Model) – Returns new model instance Return type Model dict(**kwargs) Return a dictionary of the LLM. Parameters kwargs (Any) – Return type Dict generate(prompts, stop=None, callbacks=None, *, tags=None, **kwarg...
Take in a list of prompt values and return an LLMResult. Parameters prompts (List[langchain.schema.PromptValue]) – stop (Optional[List[str]]) – callbacks (Optional[Union[List[langchain.callbacks.base.BaseCallbackHandler], langchain.callbacks.base.BaseCallbackManager]]) – kwargs (Any) – Return type langchain.schema....