---
title: Langflow TypeScript client
slug: /typescript-client
---

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';

The Langflow TypeScript client allows your TypeScript applications to programmatically interact with the Langflow API.

For the client code repository, see [langflow-client-ts](https://github.com/datastax/langflow-client-ts/).

For the npm package, see [@datastax/langflow-client](https://www.npmjs.com/package/@datastax/langflow-client).

## Install the Langflow TypeScript package

To install the Langflow TypeScript client package, use one of the following commands:

<Tabs>
<TabItem value="npm" label="npm" default>

```bash
npm install @datastax/langflow-client
```

</TabItem>
<TabItem value="yarn" label="yarn">

```bash
yarn add @datastax/langflow-client
```

</TabItem>
<TabItem value="pnpm" label="pnpm">

```bash
pnpm add @datastax/langflow-client
```

</TabItem>
</Tabs>

## Initialize the Langflow TypeScript client

1. Import the client into your code.

    ```tsx
    import { LangflowClient } from "@datastax/langflow-client";
    ```

2. Initialize a `LangflowClient` object to interact with your server:

    ```tsx
    const baseUrl = "BASE_URL";
    const apiKey = "API_KEY";
    const client = new LangflowClient({ baseUrl, apiKey });
    ```

    Replace `BASE_URL` and `API_KEY` with values from your deployment.
    The default Langflow base URL is `http://localhost:7860`.
    To create an API key, see [API keys and authentication](/api-keys-and-authentication).
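
    For example, rather than hard-coding credentials, you can read them from environment variables. The following is a minimal sketch; the variable names `LANGFLOW_BASE_URL` and `LANGFLOW_API_KEY` are assumptions for illustration, not settings the client reads automatically:

    ```tsx
    import { LangflowClient } from "@datastax/langflow-client";

    // Hypothetical variable names; export them in your shell before running
    const client = new LangflowClient({
      baseUrl: process.env.LANGFLOW_BASE_URL ?? "http://localhost:7860",
      apiKey: process.env.LANGFLOW_API_KEY,
    });
    ```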

## Connect to your server and get responses

1. With your Langflow client initialized, test the connection by calling your Langflow server.

    The following example runs a flow (`runFlow`) by sending the flow ID and a chat input string:

    ```tsx
    import { LangflowClient } from "@datastax/langflow-client";

    const baseUrl = "http://localhost:7860";
    const client = new LangflowClient({ baseUrl });

    async function runFlow() {
        const flowId = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf";
        const flow = client.flow(flowId);
        const input = "Is anyone there?";

        const response = await flow.run(input);
        console.log(response);
    }

    runFlow().catch(console.error);
    ```

    Replace the following:

    * `baseUrl`: The URL of your Langflow server.
    * `flowId`: The ID of the flow you want to run.
    * `input`: The chat input message you want to send to trigger the flow.
    This is only valid for flows with a **Chat Input** component.

2. Review the result to confirm that the client connected to your Langflow server.

    The following example shows the response from a well-formed `runFlow` request that reached the Langflow server and successfully started the flow:

    ```
    FlowResponse {
      sessionId: 'aa5a238b-02c0-4f03-bc5c-cc3a83335cdf',
      outputs: [ { inputs: [Object], outputs: [Array] } ]
    }
    ```

    In this case, the response includes a [`sessionId`](/session-id), which is a unique identifier for the client-server session, and an `outputs` array that contains information about the flow run.

3. Optional: To inspect the full response object from the server, change `console.log` to stringify the returned object as JSON:

    ```tsx
    console.log(JSON.stringify(response, null, 2));
    ```

    The exact structure of the returned `inputs` and `outputs` objects depends on the components and configuration of your flow.
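
    For example, for flows that end in a **Chat Output** component, the reply text is often nested several levels deep. The following is a sketch of one common path, not a guaranteed shape; stringify your own response first to confirm it, or use the convenience method in the next step:

    ```tsx
    // One common nesting for flows that end in a Chat Output component;
    // stringify your own response first to confirm the exact path
    const text = response.outputs[0].outputs[0].results.message.text;
    console.log(text);
    ```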

4. Optional: If you want the response to include only the chat message from the **Chat Output** component, change `console.log` to call the `chatOutputText` convenience method:

    ```tsx
    console.log(response.chatOutputText());
    ```
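
    To report both values at once, you can combine `chatOutputText` with the `sessionId` shown on the `FlowResponse` above. A minimal sketch:

    ```tsx
    // Print the chat reply alongside the session that produced it
    console.log(`[${response.sessionId}] ${response.chatOutputText()}`);
    ```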

## Use advanced TypeScript client features

The TypeScript client can do more than just connect to your server and run a flow.

This example builds on the previous example with additional features for interacting with Langflow:

1. Pass [tweaks](/concepts-publish#input-schema) as an object with the request.
Tweaks are programmatic run-time overrides for component settings.

    This example changes the LLM used by a language model component in a flow:

    ```tsx
    const tweaks = { model_name: "gpt-4o-mini" };
    ```
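
    A top-level tweak like this applies to any component with a matching setting. To target a single component, you can instead key the tweak on that component's ID. A sketch; the ID `OpenAIModel-RtlZm` is a placeholder for an ID copied from your own flow:

    ```tsx
    // Scope the override to one component (placeholder component ID)
    const tweaks = { "OpenAIModel-RtlZm": { model_name: "gpt-4o-mini" } };
    ```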

2. Pass a [session ID](/session-id) with the request to separate the conversation from other flow runs, and to be able to continue this conversation by calling the same session ID in the future:

    ```tsx
    const session_id = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf";
    ```
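
    For example, reusing the same `session_id` across calls keeps the runs in one conversation. This sketch assumes `run` accepts the same options object as the `stream` call shown in the next step:

    ```tsx
    // Both runs share one conversation because they share a session ID
    const first = await client.flow(flowId).run("My name is Ada.", { session_id });
    const second = await client.flow(flowId).run("What is my name?", { session_id });
    ```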

3. Instead of calling `run` on the Flow object, call `stream` with the same arguments to get a streaming response:

    ```tsx
    const response = await client.flow(flowId).stream(input);

    for await (const event of response) {
      console.log(event);
    }
    ```

    The response is a [`ReadableStream`](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream) of objects.
    For more information on streaming Langflow responses, see the [`/run` endpoint](/api-flows-run#run-flow).
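
    Because each streamed object is tagged with an `event` type, you can filter for `token` events to assemble just the model's reply. The following is a sketch based on the event shape shown in the streaming result later in this section:

    ```tsx
    // Collect only the streamed token chunks into a single message string
    let message = "";
    for await (const event of response) {
      if (event.event === "token") {
        message += event.data.chunk;
      }
    }
    console.log(message);
    ```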

4. Run the modified TypeScript application to execute the flow with `tweaks`, `session_id`, and streaming:

    ```tsx
    import { LangflowClient } from "@datastax/langflow-client";

    const baseUrl = "http://localhost:7860";
    const client = new LangflowClient({ baseUrl });

    async function runFlow() {
        const flowId = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf";
        const input = "Is anyone there?";
        const tweaks = { model_name: "gpt-4o-mini" };
        const session_id = "test-session";

        const response = await client.flow(flowId).stream(input, {
            session_id,
            tweaks,
        });

        for await (const event of response) {
            console.log(event);
        }
    }

    runFlow().catch(console.error);
    ```

    Replace the following:

    * `baseUrl`: The URL of your Langflow server.
    * `flowId`: The ID of the flow you want to run.
    * `input`: The chat input message you want to send to trigger the flow, assuming the flow has a **Chat Input** component.
    * `tweaks`: Any tweak modifiers to apply to the flow run.
    This example changes the LLM used by a component in the flow.
    * `session_id`: Pass a custom session ID.
    If omitted or empty, the flow ID is the default session ID.

    <details>
    <summary>Result</summary>

    With streaming enabled, the response includes the flow metadata and timestamped events for flow activity.
    For example:

    ```text
    {
      event: 'add_message',
      data: {
        timestamp: '2025-05-23 15:52:48 UTC',
        sender: 'User',
        sender_name: 'User',
        session_id: 'test-session',
        text: 'Is anyone there?',
        files: [],
        error: false,
        edit: false,
        properties: {
          text_color: '',
          background_color: '',
          edited: false,
          source: [Object],
          icon: '',
          allow_markdown: false,
          positive_feedback: null,
          state: 'complete',
          targets: []
        },
        category: 'message',
        content_blocks: [],
        id: '7f096715-3f2d-4d84-88d6-5e2f76bf3fbe',
        flow_id: 'aa5a238b-02c0-4f03-bc5c-cc3a83335cdf',
        duration: null
      }
    }
    {
      event: 'token',
      data: {
        chunk: 'Absolutely',
        id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5',
        timestamp: '2025-05-23 15:52:48 UTC'
      }
    }
    {
      event: 'token',
      data: {
        chunk: ',',
        id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5',
        timestamp: '2025-05-23 15:52:48 UTC'
      }
    }
    {
      event: 'token',
      data: {
        chunk: " I'm",
        id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5',
        timestamp: '2025-05-23 15:52:48 UTC'
      }
    }
    {
      event: 'token',
      data: {
        chunk: ' here',
        id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5',
        timestamp: '2025-05-23 15:52:48 UTC'
      }
    }

    // this response is abbreviated

    {
      event: 'end',
      data: { result: { session_id: 'test-session', outputs: [Array] } }
    }
    ```

    </details>

## Retrieve Langflow logs with the TypeScript client

To retrieve [Langflow logs](/logging), you must enable log retrieval on your Langflow server by including the following values in your Langflow `.env` file:

```text
LANGFLOW_ENABLE_LOG_RETRIEVAL=True
LANGFLOW_LOG_RETRIEVER_BUFFER_SIZE=10000
LANGFLOW_LOG_LEVEL=DEBUG
```

The following example script starts streaming logs in the background, and then runs a flow so you can monitor the flow run:

```tsx
import { LangflowClient } from "@datastax/langflow-client";

const baseUrl = "http://localhost:7863";
const flowId = "86f0bf45-0544-4e88-b0b1-8e622da7a7f0";

async function runFlow(client: LangflowClient) {
    const input = "Is anyone there?";
    const response = await client.flow(flowId).run(input);
    console.log('Flow response:', response);
}

async function streamLogs(client: LangflowClient) {
    console.log('Starting log stream...');
    for await (const log of await client.logs.stream()) {
        console.log('Log:', log);
    }
}

async function main() {
    const client = new LangflowClient({ baseUrl });

    // Start streaming logs in the background; don't await here, or the
    // never-ending log stream would block the flow run below
    streamLogs(client).catch(console.error);

    // Run the flow
    await runFlow(client);
}

main().catch(console.error);
```

Replace the following:

* `baseUrl`: The URL of your Langflow server.
* `flowId`: The ID of the flow you want to run.
* `input`: The chat input message you want to send to trigger the flow, assuming the flow has a **Chat Input** component.

Logs begin streaming indefinitely, and the flow runs once.

<details>
<summary>Result</summary>

The following example result is truncated for readability, but you can follow the messages to see how the flow instantiates its components, configures its model, and processes the outputs.

The `FlowResponse` object, at the end of the stream, is returned to the client with the flow result in the `outputs` array.

```text
Starting log stream...
Log: Log {
  timestamp: 2025-05-30T11:49:16.006Z,
  message: '2025-05-30T07:49:16.006127-0400 DEBUG Instantiating ChatInput of type component\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.029Z,
  message: '2025-05-30T07:49:16.029957-0400 DEBUG Instantiating Prompt of type component\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.049Z,
  message: '2025-05-30T07:49:16.049520-0400 DEBUG Instantiating ChatOutput of type component\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.069Z,
  message: '2025-05-30T07:49:16.069359-0400 DEBUG Instantiating OpenAIModel of type component\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.086Z,
  message: "2025-05-30T07:49:16.086426-0400 DEBUG Running layer 0 with 2 tasks, ['ChatInput-xjucM', 'Prompt-I3pxU']\n"
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.101Z,
  message: '2025-05-30T07:49:16.101766-0400 DEBUG Building Chat Input\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.113Z,
  message: '2025-05-30T07:49:16.113343-0400 DEBUG Building Prompt\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.131Z,
  message: '2025-05-30T07:49:16.131423-0400 DEBUG Logged vertex build: 6bd9fe9c-5eea-4f05-a96d-f6de9dc77e3c\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.143Z,
  message: '2025-05-30T07:49:16.143295-0400 DEBUG Logged vertex build: 39c68ec9-3859-4fff-9b14-80b3271f8fbf\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.188Z,
  message: "2025-05-30T07:49:16.188730-0400 DEBUG Running layer 1 with 1 tasks, ['OpenAIModel-RtlZm']\n"
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.201Z,
  message: '2025-05-30T07:49:16.201946-0400 DEBUG Building OpenAI\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.216Z,
  message: '2025-05-30T07:49:16.216622-0400 INFO Model name: gpt-4.1-mini\n'
}
Flow response: FlowResponse {
  sessionId: '86f0bf45-0544-4e88-b0b1-8e622da7a7f0',
  outputs: [ { inputs: [Object], outputs: [Array] } ]
}
Log: Log {
  timestamp: 2025-05-30T11:49:18.094Z,
  message: `2025-05-30T07:49:18.094364-0400 DEBUG Vertex OpenAIModel-RtlZm, result: <langflow.graph.utils.UnbuiltResult object at 0x364d24dd0>, object: {'text_output': "Hey there! I'm here and ready to help you build something awesome with AI. What are you thinking about creating today?"}\n`
}
```

</details>
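
Because the log stream never ends on its own, the script keeps running after the flow completes. If you only need logs for a single run, one option is to exit once the flow response arrives and any trailing logs have printed. A minimal sketch, assuming a short fixed delay is acceptable:

```tsx
// In main(), after the flow completes:
await runFlow(client);

// Give trailing log events a moment to print, then exit
setTimeout(() => process.exit(0), 2000);
```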

For more information, see [Logs endpoints](/api-logs).