gorilla-openfunctions-v2-q4f32_1-MLC

This is the gorilla-openfunctions-v2 model in MLC format with q4f32_1 quantization. The model can be used with the MLC-LLM and WebLLM projects.

Example Usage

Here are some examples of using this model in MLC LLM. Before running the examples, please install MLC LLM by following the installation documentation.
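The snippet below is a minimal sketch of running the model through the MLC LLM Python API (MLCEngine). It assumes the weights are pulled directly from this repository (`HF://smalinin/gorilla-openfunctions-v2_q4f32_1-MLC`); if you compile or download the model locally, point `model` at that path instead.

```python
from mlc_llm import MLCEngine

# Assumed HF repo path for these q4f32_1 weights; adjust if you use a local build.
model = "HF://smalinin/gorilla-openfunctions-v2_q4f32_1-MLC"
engine = MLCEngine(model)

# Stream a chat completion through the OpenAI-style API exposed by MLCEngine.
for response in engine.chat.completions.create(
    messages=[{"role": "user", "content": "What's the weather like in Boston?"}],
    model=model,
    stream=True,
):
    for choice in response.choices:
        print(choice.delta.content, end="", flush=True)
print()

engine.terminate()
```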

Gorilla OpenFunctions v2

💡 SoTA among open-source models. On par with GPT-4.

🚀 Check out the Berkeley Function Calling Leaderboard
📣 Read more in our OpenFunctions v2 release blog and Berkeley Function Calling Leaderboard blog
🟢 Check out Quantized GGUF models in gorilla-llm/gorilla-openfunctions-v2-gguf

Introduction

Gorilla OpenFunctions extends Large Language Model (LLM) chat completion to formulate executable API calls given natural-language instructions and API context. With OpenFunctions v2, we now support:

  1. Multiple functions - choose between functions
  2. Parallel functions - call the same function N times with different parameter values (illustrated in the sketch after this list)
  3. Multiple & parallel - both of the above in a single chat completion call (one generation)
  4. Relevance detection - when chatting, chat; when asked for a function, return a function
  5. Python - supports string, number, boolean, list, tuple, dict parameter datatypes and Any for those not natively supported.
  6. JAVA - support for byte, short, int, float, double, long, boolean, char, Array, ArrayList, Set, HashMap, Hashtable, Queue, Stack, and Any datatypes.
  7. JavaScript - support for String, Number, Bigint, Boolean, dict (object), Array, Date, and Any datatypes.
  8. REST - native REST support
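To make the function-calling interface concrete, here is an illustrative sketch of how function specifications and a user query might be combined into a single prompt, and the kind of executable calls the model returns for a parallel query. The specification format, prompt template, and `get_current_weather` function are assumptions for illustration; consult the Gorilla OpenFunctions documentation for the exact template the model was trained with.

```python
import json

# Hypothetical function specification in a JSON-Schema-like style; the exact
# format and prompt template expected by the model may differ, so treat this
# as an illustration rather than the canonical interface.
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather for a location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City and state, e.g. Boston, MA",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }
]

# Combine the user query and the function specifications into one prompt.
query = "What is the weather like in Boston and in San Francisco?"
prompt = f"{query}\nFunctions: {json.dumps(functions)}"
print(prompt)

# For a parallel-function query like the one above, the model is expected to
# emit executable calls such as:
#   get_current_weather(location="Boston, MA")
#   get_current_weather(location="San Francisco, CA")
```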

Performance

| Model | Overall Accuracy* |
| --- | --- |
| GPT-4-0125-Preview | 85.12% |
| Gorilla-openfunctions-v2 | 83.67% |
| GPT-3.5-turbo | 82.23% |
| Mistral-medium | 79.70% |
| Nexusflow Raven-v2 | 55.72% |
| GPT-4-0613 | 54.16% |

*: Overall Accuracy is defined in the Berkeley Function Calling Leaderboard blog, read more details there if you are interested!

Models Available

| Model | Functionality |
| --- | --- |
| gorilla-openfunctions-v2 | Multiple, parallel, multiple & parallel, relevance detection, Python + JAVA + JS + REST |
| gorilla-openfunctions-v1 | Parallel functions, and can choose between functions |
| gorilla-openfunctions-v0 | Given a function and user intent, returns properly formatted JSON with the right arguments |

All of our models are hosted on our Hugging Face UC Berkeley gorilla-llm org: gorilla-openfunctions-v2, gorilla-openfunctions-v1, and gorilla-openfunctions-v0.

Training

Gorilla OpenFunctions v2 is a 7B-parameter model built on top of the Deepseek Coder LLM. Check out the OpenFunctions v2 blog to learn more about the data composition and some insights into the training process.

Documentation

For more information on the MLC LLM project, please visit our documentation and GitHub repo.

License

Gorilla OpenFunctions v2 is distributed under the Apache 2.0 license. This software incorporates elements from the Deepseek model. Consequently, the licensing of Gorilla OpenFunctions v2 adheres to the Apache 2.0 license, with additional terms as outlined in Appendix A of the Deepseek license.

Contributing

Gorilla is an open source effort from UC Berkeley and we welcome contributors. Please email us your comments, criticism, and questions. More information about the project can be found at https://gorilla.cs.berkeley.edu/
