
CodeLlama-7b-Instruct fine-tuned on 6320 function-calling and generic chat examples

Fine-tuned with LoRA on a small fraction of the glaive-function-calling-v2 dataset and on a formatted (and slightly cleaned) version of sharegpt-hyperfiltered-3k.

Prompt example:

[INST] <<SYS>>
<function>Available functions:
<function>{
    "name": "generate_password",
    "description": "Generate a random password with specified criteria",
    "parameters": {
        "type": "object",
        "properties": {
            "length": {
                "type": "integer",
                "description": "The length of the password"
            },
            "include_numbers": {
                "type": "boolean",
                "description": "Include numbers in the password"
            },
            "include_special_characters": {
                "type": "boolean",
                "description": "Include special characters in the password"
            }
        },
        "required": [
            "length"
        ]
    }
}
<</SYS>>

I need a new password. Can you generate one for me? [/INST]

The model then generates (note the leading space):

 Of course! How long would you like your password to be? And would you like it to include numbers and special characters?
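
For reference, here is a minimal sketch of how a prompt in this format could be assembled and run with the transformers library. This is only an illustration, not code shipped with the model: the generation settings are assumptions, and the function schema is trimmed to the required "length" field to keep the snippet short.

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "rizerphe/CodeLlama-function-calling-6320-7b-Instruct-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" needs the accelerate package; adjust to your setup.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Trimmed version of the generate_password schema shown above.
function_schema = '{"name": "generate_password", "description": "Generate a random password with specified criteria", "parameters": {"type": "object", "properties": {"length": {"type": "integer", "description": "The length of the password"}}, "required": ["length"]}}'

prompt = (
    "[INST] <<SYS>>\n"
    "<function>Available functions:\n"
    f"<function>{function_schema}\n"
    "<</SYS>>\n\n"
    "I need a new password. Can you generate one for me? [/INST]"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))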

If you extend the prompt to be:

[INST] <<SYS>>
<function>Available functions:
<function>{
    "name": "generate_password",
    "description": "Generate a random password with specified criteria",
    "parameters": {
        "type": "object",
        "properties": {
            "length": {
                "type": "integer",
                "description": "The length of the password"
            },
            "include_numbers": {
                "type": "boolean",
                "description": "Include numbers in the password"
            },
            "include_special_characters": {
                "type": "boolean",
                "description": "Include special characters in the password"
            }
        },
        "required": [
            "length"
        ]
    }
}
<</SYS>>

I need a new password. Can you generate one for me? [/INST] Of course! How long would you like your password to be? And would you like it to include numbers and special characters?</s><s>[INST] I'd like it to be 12 characters long. [/INST]

The model then generates (this time without a leading space):

<function>generate_password
{
    "length": 12
}
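
A quick sketch of how this output could be parsed, purely as an illustration (the helper below is not part of the model card): the function name follows the <function> tag on the first line, and the arguments are the JSON object on the following lines.

import json

def parse_function_call(generation: str):
    """Return (name, arguments) if the generation is a function call, else None."""
    text = generation.strip()
    if not text.startswith("<function>"):
        return None  # plain chat reply, no function call
    name, _, rest = text[len("<function>"):].partition("\n")
    return name.strip(), json.loads(rest)

print(parse_function_call('<function>generate_password\n{\n    "length": 12\n}'))
# ('generate_password', {'length': 12})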

It can also answer questions based on a prompt without any functions:

[INST] In one sentence, what is a large language model? [/INST] A large language model is a type of artificial intelligence model that is trained on vast amounts of text data to generate human-like language and understand natural language input. [/INST] That's correct! Large language models are trained on large datasets of text to generate human-like language and understand natural language input. They are often used in applications such as chatbots, language translation, and text summarization.

A quick manual test shows that the model can still follow a system prompt supplied alongside the function definitions, including in multi-turn conversations. None of this was tested comprehensively, though, so your results may vary.
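
If you prefer to script the prompt format rather than build the strings by hand, a helper along these lines reproduces the layout used in the examples above. The function name and structure are just an illustration, not something shipped with the model; it also covers the no-function case (pass an empty list).

def build_prompt(functions, turns, system_prompt=""):
    """functions: list of JSON schema strings.
    turns: list of (user, assistant) pairs; leave the final assistant reply as
    None so the model is asked to produce it."""
    sys_block = system_prompt
    if functions:
        if sys_block:
            sys_block += "\n"
        sys_block += "<function>Available functions:\n" + "\n".join(
            f"<function>{schema}" for schema in functions
        )
    prompt = ""
    for i, (user, assistant) in enumerate(turns):
        if i == 0 and sys_block:
            prompt += f"[INST] <<SYS>>\n{sys_block}\n<</SYS>>\n\n{user} [/INST]"
        else:
            prompt += f"[INST] {user} [/INST]"
        if assistant is not None:
            # Completed assistant turns get a leading space and are closed with </s><s>.
            prompt += f" {assistant}</s><s>"
    return prompt

# Trimmed version of the generate_password schema shown above.
schema = '{"name": "generate_password", "parameters": {"type": "object", "properties": {"length": {"type": "integer"}}, "required": ["length"]}}'
print(build_prompt(
    [schema],
    [
        ("I need a new password. Can you generate one for me?",
         "Of course! How long would you like your password to be? "
         "And would you like it to include numbers and special characters?"),
        ("I'd like it to be 12 characters long.", None),
    ],
))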
