---
license: apache-2.0
base_model: google-bert/bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: tool-bert
  results: []
---

# tool-bert

This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased). It is trained on a custom-made dataset of sample user instructions, each classified to one of a number of possible local-assistant function-calling endpoints. Given an input query, tool-bert predicts which tool should be used to augment a downstream LLM's generated output. More information on these tools will follow, but example tools are "play music", "check the weather", "get the news", "take a photo", or no tool at all.

In short, this model is meant to give very small LLMs (i.e. 8B parameters and below) a way to use function calling.

All limitations and biases are inherited from the parent model.

### Framework versions

- Transformers 4.41.1
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.19.1
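
### Example routing sketch

A minimal sketch of how a downstream assistant might act on tool-bert's prediction. The label strings and handler descriptions below are illustrative assumptions, not the model's actual label set; in practice the label would come from running tool-bert as a standard `text-classification` pipeline in the Transformers library.

```python
# Illustrative sketch: route a predicted tool label to a local action.
# NOTE: the label names here are assumptions for demonstration only;
# the real labels are defined by the custom dataset described above.

TOOL_HANDLERS = {
    "play_music": "start the music player",
    "check_weather": "query the weather service",
    "get_news": "fetch the latest headlines",
    "take_photo": "trigger the camera",
}

def route(label: str) -> str:
    """Map a predicted tool label to an action, falling back to no tool."""
    return TOOL_HANDLERS.get(label, "answer directly with the LLM")
```

The fallback branch matters: when tool-bert predicts "no tool", the query is passed straight to the downstream LLM without a function call.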