# Table of Contents

- LCToolFlow

## LCToolFlow Objects

`class LCToolFlow(AtomicFlow)`
A flow that runs a tool via LangChain. For example, the tool could execute a query with the DuckDuckGo search engine.
**Configuration Parameters:**

- `name` (str): The name of the flow. Default: "search"
- `description` (str): A description of the flow, used to generate its help message. Default: "useful when you need to look for the answer online, especially for recent events."
- `keep_raw_response` (bool): If True, the raw response of the tool is kept. Default: False
- `clear_flow_namespace_on_run_end` (bool): If True, the flow namespace is cleared at the end of the run. Default: False
- `backend` (Dict[str, Any]): The configuration of the backend. Default: `langchain.tools.DuckDuckGoSearchRun`
- Other parameters are inherited from the default configuration of AtomicFlow (see AtomicFlow).
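For concreteness, the documented defaults can be collected into a plain configuration dictionary like the sketch below. This is an illustration only: the nested `_target_` key used to point at the LangChain tool class follows the hydra-style instantiation convention and is an assumption, not something stated above.

```python
# Sketch of a configuration mirroring the documented defaults (not an official file).
default_config = {
    "name": "search",
    "description": (
        "useful when you need to look for the answer online, "
        "especially for recent events."
    ),
    "keep_raw_response": False,
    "clear_flow_namespace_on_run_end": False,
    # Assumption: the backend entry names the LangChain tool class to instantiate.
    "backend": {"_target_": "langchain.tools.DuckDuckGoSearchRun"},
}
```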
**Input Interface:**

- `query` (str): The query to run the tool on.

**Output Interface:**

- `observation` (str): The observation returned by the tool.
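Concretely, the data carried by the input and output messages would look like this (illustrative values only):

```python
# Data carried by the input message.
input_data = {"query": "Who won the 2023 Nobel Prize in Physics?"}

# Data carried by the output message after the tool has run.
output_data = {"observation": "...raw search results returned by the tool..."}
```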
**Arguments:**

- `backend` (`BaseTool`): The backend of the flow. It is the tool that the flow runs (e.g., the DuckDuckGo search engine).
- `**kwargs`: Additional arguments to pass to the flow. See `aiflows.base_flows.AtomicFlow` for more details.
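For context, the `backend` argument is a ready-made LangChain tool instance. A minimal sketch, assuming the classic `langchain.tools` import path and that the `duckduckgo-search` package is installed:

```python
from langchain.tools import DuckDuckGoSearchRun

# A BaseTool instance of this kind is what gets passed to the flow as `backend`.
search_tool = DuckDuckGoSearchRun()

# BaseTool.run sends the query to DuckDuckGo and returns the raw results as a string.
print(search_tool.run("aiflows library"))
```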
#### instantiate_from_config

`@classmethod`
`def instantiate_from_config(cls, config: Dict[str, Any]) -> LCToolFlow`

This method instantiates the flow from a configuration file.
**Arguments:**

- `config` (Dict[str, Any]): The configuration of the flow.

**Returns:**

- `LCToolFlow`: The instantiated flow.
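Using the classmethod is then a one-liner. The sketch below assumes a configuration dictionary like the `default_config` sketched under Configuration Parameters and that `LCToolFlow` has already been imported from its flow module; a real configuration may carry additional AtomicFlow defaults.

```python
# Hedged sketch: construct the flow from a plain config dict.
search_flow = LCToolFlow.instantiate_from_config(default_config)
print(type(search_flow).__name__)  # "LCToolFlow"
```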
#### run

`def run(input_message: FlowMessage)`

This method runs the flow: it runs the backend on the input data.
**Arguments:**

- `input_message` (`FlowMessage`): The input message of the flow.
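Functionally, `run` maps the `query` field of the input message to an `observation` produced by the backend tool and sends it back as the reply. The helper below is only a behavioral sketch of that mapping, assuming the classic LangChain `BaseTool` interface; it is not the flow's actual implementation, which also handles message packaging and replying.

```python
from typing import Any, Dict

from langchain.tools import BaseTool


def run_equivalent(backend: BaseTool, input_data: Dict[str, Any]) -> Dict[str, str]:
    # The input message carries a `query`; the backend tool is invoked on it and
    # its raw string result is returned under the `observation` key of the reply.
    observation = backend.run(input_data["query"])
    return {"observation": observation}
```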