Structure of Coder

     goal, memory_files (dict)
             |
             v
    +-------------------+
    |  MemoryReading    |  Reads in the content of the memory files
    |    Flow           |
    +-------------------+
             |
             | (memory_content)
             |
             v
    +-------------------+
    |    PlanWriter     |  Writes a step-by-step plan to achieve the goal
    +-------------------+
             |
             | (plan)
             |
             v
    +-------------------+
    |   CtrlExMemFlow   |  Illustrated below. Carries out the plan in a controller-executor fashion,
    |                   |  with memory management mechanisms.
    +-------------------+
             |
    (summary, result)
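
The diagram above can be read as a three-stage pipeline. The sketch below is a conceptual illustration in plain Python, not the aiflows API: the later stages are passed in as callables, and the shape of memory_files is an assumption.

    def read_memory(memory_files: dict) -> dict:
        """MemoryReading step: load every memory file into a dict of strings."""
        memory_content = {}
        for name, path in memory_files.items():
            with open(path, "r", encoding="utf-8") as f:
                memory_content[name] = f.read()
        return memory_content

    def coder_pipeline(goal: str, memory_files: dict, write_plan, ctrl_ex_mem):
        """Top-level Coder pipeline: read memory, write a plan, then carry it out."""
        memory_content = read_memory(memory_files)     # MemoryReading Flow
        plan = write_plan(goal, memory_content)        # PlanWriter
        summary, result = ctrl_ex_mem(plan, memory_files, memory_content, goal)  # CtrlExMemFlow
        return summary, result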

Here is the structure of the CtrlExMemFlow:

    plan, memory_files, memory_content, goal
                |
                v
        +---------------+
        |  Controller   | --------<<<<-----------+
        +---------------+                        |
                |                                |
                | (command, command args)        |
                |                                |
                v                                |
        +------------------+                     |
        |   Executor       |  Each branch is an  |
        | (Tree Structure) |  executor           |
        +------------------+                     |
                |                                ^
                | (execution results)            ^
                |                                ^
                v                                ^
        +---------------+                        ^
        | MemWriteFlow  |  Updates memory files  ^
        +---------------+                        ^
                |                                ^
                | (summary)                      |
                |                                |
                v                                |
        +---------------+                        |
        | MemReadFlow   |  Reads updated memory  |
        +---------------+                        |
                |                                |
                | (updated memory content)       |
                |                                |
                +-> goes back to the Controller>-+
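
As a rough mental model (not the aiflows implementation), the controller-executor-memory loop in the diagram could look like the following; the "finish" command name and the state keys are hypothetical.

    def ctrl_ex_mem_loop(controller, executors, write_memory, read_memory, state):
        """One conceptual loop: controller -> executor -> memory write -> memory read -> controller."""
        while True:
            command, command_args = controller(state)
            if command == "finish":                                # hypothetical terminal command
                return state.get("summary", ""), command_args.get("answer", "")
            execution_results = executors[command](command_args)   # one executor branch runs
            summary = write_memory(execution_results)              # MemWriteFlow updates memory files
            state["memory_content"] = read_memory()                # MemReadFlow reloads updated memory
            state["summary"] = summary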

Structure of the Executors:

                 +-------------------+
                 |   Branching       |
                 |    Executor       |
                 +-------------------+
        /          |       |       |         \
       /           |       |       |          \
      /            |       |       |           \
     /             |       |       |            \
Extend_library  ask_user re_plan update_plan   run_code
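
A dispatch table is the simplest way to picture the branching executor: the controller's command selects one branch. The branch names below mirror the diagram; this is a hedged sketch with stub branches, not the module's actual code.

    def branching_executor(command: str, command_args: dict, branches: dict) -> dict:
        """Route the controller's command to the matching executor branch."""
        if command not in branches:
            raise ValueError(f"Unknown command: {command!r}")
        return branches[command](command_args)

    # Toy branch registry mirroring the diagram above (stub implementations only).
    branches = {
        "extend_library": lambda args: {"result": "function written and saved to library.py"},
        "ask_user": lambda args: {"result": input(args.get("question", "Your input: "))},
        "re_plan": lambda args: {"result": "plan re-drafted"},
        "update_plan": lambda args: {"result": "plan updated"},
        "run_code": lambda args: {"result": "code executed"},
    }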

Memory files of Coder:

  • plan_coder.txt
  • logs_coder.txt
  • library.py
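
A memory_files dict might map each memory role to the files listed above; the key names follow the roles mentioned in the configuration parameters (plan, logs, code_library) and are an assumption.

    # Hypothetical mapping from memory role to file path.
    memory_files = {
        "plan": "plan_coder.txt",
        "logs": "logs_coder.txt",
        "code_library": "library.py",
    }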

About the branches:

  • ExtendLibrary: Writes and tests code functions interactively, then saves the finished function to the code library.
  • ask_user: Asks the user for information, confirmation, etc.
  • re_plan: When something goes wrong, re-drafts the plan.
  • update_plan: When the controller realizes that one or more steps of the plan are done, generates a new plan that marks those steps as done.
  • run_code: Runs the code written by the Controller. It first opens a temp file with the code for user confirmation and editing, then passes the code to the InterpreterFlow (see the sketch after this list).
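
A minimal sketch of the temp-file confirmation step in run_code, assuming a plain subprocess in place of the InterpreterFlow:

    import subprocess
    import sys
    import tempfile

    def run_code_with_confirmation(code: str) -> str:
        """Write the code to a temp file, let the user review or edit it, then run it."""
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as tmp:
            tmp.write(code)
            path = tmp.name
        input(f"Review or edit {path}, then press Enter to run it... ")
        completed = subprocess.run([sys.executable, path], capture_output=True, text=True)
        return completed.stdout + completed.stderr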

CtrlExMem_CoderFlow

CtrlExMem_CoderFlow Objects

class CtrlExMem_CoderFlow(CtrlExMemFlow)

This class inherits from the CtrlExMemFlow class from AbstractBossFlowModule. See: https://huggingface.co/aiflows/AbstractBossFlowModule/blob/main/CtrlExMemFlow.py

Input Interface:

  • plan
  • logs
  • code_library: the signatures and docstrings of the functions in the code library.
  • memory_files
  • goal

Output Interface

  • result (str): The result of the flow; it will be returned to the caller.
  • summary (str): The summary of the flow; it will be logged into the logs of the caller flow.

Configuration Parameters:

detect_finish_or_continue

@CircularFlow.output_msg_payload_processor
def detect_finish_or_continue(output_payload: Dict[str, Any],
                              src_flow) -> Dict[str, Any]

This method is called when the flow receives a message from a subflow. It checks the message and decides whether the flow should continue or finish.

Arguments:

  • output_payload: the output payload of the subflow
  • src_flow: the source flow of the message

Returns:

Dict[str, Any]: the output payload of the flow
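
For illustration, a payload processor along these lines might detect a terminal command and signal the end of the loop; the "finish" command name and the "EARLY_EXIT" key are assumptions, not the module's confirmed contract.

    from typing import Any, Dict

    def detect_finish_or_continue(output_payload: Dict[str, Any], src_flow) -> Dict[str, Any]:
        """Hedged sketch: finish when the controller issues a terminal command, else continue."""
        if output_payload.get("command") == "finish":      # hypothetical terminal command name
            command_args = output_payload.get("command_args", {})
            return {
                "EARLY_EXIT": True,                        # assumed early-exit sentinel key
                "result": command_args.get("answer", ""),
                "summary": "Coder: goal finished.",
            }
        return output_payload                              # any other command: keep looping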

run_coder

Planner_CoderFlow

Planner_CoderFlow Objects

class Planner_CoderFlow(PlanWriterFlow)

Planner of the coder flow. It inherits from PlanWriterFlow (see: https://huggingface.co/aiflows/PlanWriterFlowModule) and is responsible for generating the plan for the coder flow.

Input Interface

  • goal
  • memory_files
  • code_library

Output Interface

  • plan
  • summary
  • status

Configuration Parameters:

  • See the configuration parameters of PlanWriterFlowModule (https://huggingface.co/aiflows/PlanWriterFlowModule) for detailed info.
  • input_interface: the input interface of the flow, default: ["goal", "memory_files", "code_library"]
  • output_interface: the output interface of the flow, default: ["plan", "summary", "status"]
  • subflows_config: configuration of the subflows.
  • topology: topology of the subflows.

Controller_CoderFlow

Controller_CoderFlow Objects

class Controller_CoderFlow(ChatAtomicFlow)

This flow is used to control the coder flow. Refer to: https://huggingface.co/aiflows/JarvisFlowModule/blob/main/Controller_JarvisFlow.py

Input Interface Non Initialized:

  • goal
  • plan
  • code_library
  • logs
  • memory_files

Input Interface Initialized:

  • goal
  • plan
  • code_library
  • logs
  • memory_files
  • result

Output Interface:

  • command
  • command_args

Configuration Parameters:

  • Input Interface Non Initialized: Input interface before the conversation is initialized.
  • Input Interface Initialized: Input interface after the conversation is initialized.
  • Output Interface: Output interface.
  • backend: The backend of the LLM.
  • commands: A list of available commands for the controller to call.
  • system_message_prompt_template: The template of the system message prompt.
  • init_human_message_prompt_template: The template of the initial human message prompt.
  • human_message_prompt_template: The template of the human message prompt.
  • previous_messages: The sliding window of previous messages.
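
As a rough, hypothetical illustration of how these parameters fit together (the authoritative defaults live in the module's own config files):

    # Hypothetical configuration sketch; key names follow the parameter list above,
    # and the values are placeholders, not the module's actual defaults.
    controller_config = {
        "input_interface_non_initialized": ["goal", "plan", "code_library", "logs", "memory_files"],
        "input_interface_initialized": ["goal", "plan", "code_library", "logs", "memory_files", "result"],
        "output_interface": ["command", "command_args"],
        "backend": {"model_name": "gpt-4"},          # assumed backend spec
        "commands": {},                              # filled with the available commands
        "previous_messages": {"last_k": 10},         # assumed sliding-window size
    }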

__init__

def __init__(commands: List[Command], **kwargs)

Initialize the flow.

Arguments:

  • commands (List[Command]): A list of available commands for the controller to call.
  • kwargs (Dict[str, Any]): Refer to the configuration parameters.

instantiate_from_config

@classmethod
def instantiate_from_config(cls, config)

Instantiate the flow from the configuration.

Arguments:

  • config (Dict[str, Any]): The configuration.

Returns:

Controller_CoderFlow: The instantiated flow.

run

def run(input_data: Dict[str, Any]) -> Dict[str, Any]

Run the flow.

Arguments:

  • input_data (Dict[str, Any]): The input data.

Returns:

Dict[str, Any]: The output data.

UpdatePlanAtomicFlow

UpdatePlanAtomicFlow Objects

class UpdatePlanAtomicFlow(AtomicFlow)

This flow updates the plan file with the updated plan. The controller should pass the updated plan to this flow. This design (the controller reflects on the existing plan, then updates it) is intended to make the controller more aware of the plan it has. However, one drawback is that the process is then not deterministic.

Input Interface

  • updated_plan: the updated plan in string format

Output Interface

  • result: the result of the update plan operation

Configuration Parameters:

  • input_interface: the input interface of the flow, default: ["updated_plan"]
  • output_interface: the output interface of the flow, default: ["result"]

run

def run(input_data: Dict[str, Any])

The run function of the flow.

Arguments:

  • input_data (Dict[str, Any]): the input data to the flow

Returns:

Dict[str, Any]: the output data of the flow
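
Conceptually, this run step might simply overwrite the plan memory file with the updated plan. The sketch below is hedged: the plan file path is passed in explicitly here, whereas in practice it would come from the flow's memory_files configuration.

    from typing import Any, Dict

    def update_plan_file(input_data: Dict[str, Any], plan_file_path: str) -> Dict[str, Any]:
        """Sketch: overwrite the plan file with the controller's updated plan."""
        with open(plan_file_path, "w", encoding="utf-8") as f:
            f.write(input_data["updated_plan"])
        return {"result": "Plan updated successfully."}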

CoderFlow

CoderFlow Objects

class CoderFlow(AbstractBossFlow)

Coder flow is one executor branch of the Jarvis flow. At a higher level, it is a flow that writes and runs code given a goal. In the Jarvis flow, the Coder flow is invoked by the controller: it receives the goal generated by the controller, then writes and runs code in an interactive fashion.

The Coder flow has a structure similar to that of the Jarvis flow (it inherits from AbstractBossFlow).

Input Interface (expected input)

  • goal (str): The goal from the caller (source flow, i.e. JarvisFlow)

Output Interface (expected output)

  • result (str): The result of the flow; it will be returned to the caller (i.e. JarvisFlow).
  • summary (str): The summary of the flow; it will be logged into the logs of the caller flow (i.e. JarvisFlow).
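
For example, the exchange between JarvisFlow and Coder might look like the following (illustrative values only):

    # Hypothetical input from the caller (JarvisFlow).
    input_data = {"goal": "Write a function that parses a CSV file and plots the column averages."}

    # Hypothetical output returned to the caller.
    output_data = {
        "result": "Wrote parse_csv_and_plot() to library.py and ran it on the given file.",
        "summary": "Coder: extended the code library and executed the requested code.",
    }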

Configuration Parameters: (Also see super class: AbstractBossFlow)

  • memory_files (dict): The memory files of the flow. The memory files are the files that the flow reads and writes. Typically it should contain plan, logs, and code_library.

Typical workflow of Coder:

  0. JarvisFlow calls Coder with a goal.
  1. MemoryReading reads plans, logs and code library.
  2. Planner makes plan based on goal.
  3. Extend library with the goal given by the controller.
  4. Run the code given by the controller (which possibly calls the newly written function).
  5. Finish and give an answer.

run

def run(input_data: Dict[str, Any]) -> Dict[str, Any]

The run function of the Coder flow.

Arguments:

  • input_data (Dict[str, Any]): The input data of the flow.

Returns:

Dict[str, Any]: The output data of the flow.

__init__
