💻 Create a Web Interface for your LLM in Python

Community Article · Published January 23, 2024

In this tutorial, we will create a simple chatbot web interface and deploy it using Taipy, an open-source Python library.

Render of the app

Here we will use the HuggingFace Inference API with the google/flan-t5-xxl model. This tutorial can easily be adapted to other LLMs.

Step 1: Install Requirements

Create a requirements.txt file with the following content. requests is listed explicitly because we use it to call the HuggingFace API:

taipy==3.0.0
requests

Install the requirements using pip in a terminal:

pip install -r requirements.txt

Step 2: Imports

Create a main.py file with the following imports:

import requests
from taipy.gui import Gui, State, notify

Step 3: Initialize variables

Initialize the following variables in the main.py file:

context = "The following is a conversation with an AI assistant. The assistant is helpful, creative, clever, and very friendly.\n\nHuman: Hello, who are you?\nAI: I am an AI created by Google. How can I help you today? "
conversation = {
    "Conversation": ["Who are you?", "Hi! I am FLAN-T5 XXL. How can I help you today?"]
}
current_user_message = ""
  • context is the initial context for the conversation; the LLM will use it to understand what behaviour is expected of it.
  • conversation is a dictionary that stores the conversation history to be displayed.
  • current_user_message is the message the user is currently typing.

Step 4: Create a function to generate responses

This step is the one that needs to be adapted if you want to use a different LLM; a sketch for another backend is shown at the end of this step.

Initialize the HuggingFace variables with your Access Token. You can find your Access Token in your HuggingFace account settings, under Settings > Access Tokens.

API_URL = "https://api-inference.huggingface.co/models/google/flan-t5-xxl"
headers = {"Authorization": "Bearer [YOUR ACCESS TOKEN]"}

Create a function that takes the user's message as a string prompt and returns the LLM's response as a string.

def query(payload):
    # Send the payload to the HuggingFace Inference API and return the parsed JSON response
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

def request(state: State, prompt: str) -> str:
    """
    Send a prompt to the HuggingFace API and return the response.

    Args:
        - state: The current state of the app.
        - prompt: The prompt to send to the API.

    Returns:
        The response from the API.
    """
    
    output = query(
        {
            "inputs": prompt,
        }
    )
    print(output)  # Log the raw API response, useful for debugging
    return output[0]["generated_text"]
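As a rough illustration of how this step could be adapted, here is a minimal sketch of request() that uses the OpenAI chat completions API instead of HuggingFace. The openai package, the gpt-3.5-turbo model name, and the OPENAI_API_KEY environment variable are assumptions, not part of this tutorial; the rest of the app stays unchanged.

# Hypothetical variant of request() for a different backend (OpenAI), shown only as an example.
# Assumes `pip install openai` and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def request(state: State, prompt: str) -> str:
    # Send the prompt as a single user message and return the generated text
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return completion.choices[0].message.content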

Step 5: Create a function to add the new messages to the conversation

Create a function that gets triggered when the user sends a message. This function adds the user's message to the context, sends it to the API, gets the response, adds the response to the context and to the displayed conversation, and finally clears the input field.

def send_message(state: State) -> None:
    """
    Send the user's message to the API and update the conversation.

    Args:
        - state: The current state of the app.
    """
    # Add the user's message to the context
    state.context += f"Human: \n {state.current_user_message}\n\n AI:"
    # Send the user's message to the API and get the response
    answer = request(state, state.context).replace("\n", "")
    # Add the response to the context for future messages
    state.context += answer
    # Update the conversation
    conv = state.conversation._dict.copy()
    conv["Conversation"] += [state.current_user_message, answer]
    state.conversation = conv
    # Clear the input field
    state.current_user_message = ""

Step 6: Create the User Interface

In Taipy, one way to define pages is to use Markdown strings. Here we use a table to display the conversation dictionary and an input so that the user can type their message. When the user presses enter, the send_message function is triggered.

page = """
<|{conversation}|table|show_all|width=100%|>
<|{current_user_message}|input|label=Write your message here...|on_action=send_message|class_name=fullwidth|>
"""

Step 7: Run the application

Finally, add the following to the end of main.py and run the application with python main.py:

if __name__ == "__main__":
    Gui(page).run(dark_mode=True, title="Taipy Chat")

And here is the result:

Render of the app

Step 8: Styling

The app's style is Taipy's default stylekit. We are going to make some changes so that it looks more like a chat app.

First, create styles in a main.css file to display user and AI messages differently. Taipy should automatically pick up a CSS file that has the same name as the Python script:

.gpt_message td {
    margin-left: 30px;
    margin-bottom: 20px;
    margin-top: 20px;
    position: relative;
    display: inline-block;
    padding: 20px;
    background-color: #ff462b;
    border-radius: 20px;
    max-width: 80%;
    box-shadow: 0 4px 8px 0 rgba(0, 0, 0, 0.2), 0 6px 20px 0 rgba(0, 0, 0, 0.19);
    font-size: large;
}

.user_message td {
    margin-right: 30px;
    margin-bottom: 20px;
    margin-top: 20px;
    position: relative;
    display: inline-block;
    padding: 20px;
    background-color: #140a1e;
    border-radius: 20px;
    max-width: 80%;
    float: right;
    box-shadow: 0 4px 8px 0 rgba(0, 0, 0, 0.2), 0 6px 20px 0 rgba(0, 0, 0, 0.19);
    font-size: large;
}

We now need to tell Taipy to apply these styles to the rows in the table. We'll first create a function that will return the correct class name for each row:

def style_conv(state: State, idx: int, row: int) -> str:
    """
    Apply a style to the conversation table depending on the message's author.

    Args:
        - state: The current state of the app.
        - idx: The index of the message in the table.
        - row: The row of the message in the table.

    Returns:
        The style to apply to the message.
    """
    if idx is None:
        return None
    elif idx % 2 == 0:
        return "user_message"
    else:
        return "gpt_message"

We then apply this function to the table by adding the style property. Since the conversation list alternates between user and AI messages, starting with the user, even indices correspond to the user's messages:

<|{conversation}|table|show_all|style=style_conv|>

And voilà:

The styled application

Step 9: More features

I have added notifications, a sidebar with a button to clear the conversation, and a history of previous conversations. I won't go into the details of how to do this here, but you can find the full code in the GitHub repository.
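As a rough sketch of what two of these extras can look like, here is a clear-conversation callback that uses the notify function imported in Step 2, together with a button that triggers it. The reset_chat name and the exact markup are my own choices and may differ from the repository's implementation.

def reset_chat(state: State) -> None:
    # Hypothetical helper (not necessarily the repository's exact code): restore the
    # initial context and conversation, then notify the user.
    state.context = context
    state.conversation = {
        "Conversation": ["Who are you?", "Hi! I am FLAN-T5 XXL. How can I help you today?"]
    }
    notify(state, "info", "Conversation cleared")

The button can then be added to the page definition, and a notification such as notify(state, "info", "Sending message...") inside send_message gives the user feedback while the API call runs:

<|New Conversation|button|on_action=reset_chat|>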

Step 10: Deploying the app to Taipy Cloud

We are now going to deploy the app to Taipy Cloud so that it is accessible to anyone with the link.

First, we need to store the API key in an environment variable. Replace the line that defines headers in Step 4 with:

import os
headers = {"Authorization": f"Bearer {os.environ['HUGGINGFACE_API_KEY']}"}

Now, instead of having our API key in the code, the app will look for it in the environment variables.
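If you also run the app locally with this change, the environment variable has to be set before launching main.py. An optional guard like the following, which is my own addition and not part of the tutorial, makes the failure explicit when the variable is missing:

import os

# Optional guard (assumption, not in the original tutorial): fail early with a clear
# message if HUGGINGFACE_API_KEY has not been set in the environment.
api_key = os.environ.get("HUGGINGFACE_API_KEY")
if not api_key:
    raise RuntimeError("Set the HUGGINGFACE_API_KEY environment variable before running the app")

headers = {"Authorization": f"Bearer {api_key}"}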

We can now deploy the app to Taipy Cloud:

  1. Connect to Taipy Cloud and sign in
  2. Click on "Add Machine" and fill in the fields
  3. Select the created machine and click on "Add app"
  4. Zip the main.py, main.css, and requirements.txt files, upload the zip to the "App files" field, and fill in the other fields.
  5. In the "Environment Variables" tab, create a new environment variable called HUGGINGFACE_API_KEY and paste your API key as its value, as shown in the image below.
  6. Press "Deploy app"

Environment Variables Tab

After a while, your app should be running and will be accessible from the displayed link!

Taipy Cloud Interface

The final application