---
language: en
tags:
- llama
- llama-3.2
- function-calling
- instruction-tuning
- conversational
license: llama3.2
---

# NeuralTau Functions 3B v1

NeuralTau Functions 3B v1 is the inaugural model in the NeuralTau series, designed to deliver specialized, expert AI capabilities. This pilot model explores the potential for creating AI teammates and autonomous AI-driven businesses.

## Key Features

- **Specialization and Expertise**  
  NeuralTau Functions 3B v1 is fine-tuned for function calling and instruction following, targeting niche and specialized tasks that call for domain expertise.

- **Purpose-Driven Development**  
  This version serves as a pilot to evaluate performance and usability, laying the groundwork for future iterations aimed at building AI teammates and autonomous AI systems.

- **Usability**  
  Designed for developers seeking to integrate specialized AI solutions, the model supports applications requiring autonomous functionality or expert-level knowledge.

## Future Vision

This model represents the first step in the NeuralTau journey. Future iterations will build upon insights gained from v1 to create more refined, efficient, and specialized AI models, continually enhancing performance and usability.

## Model Variants Available
- 16-bit full model
- GGUF Q4_K_M quantized version (recommended for most use cases)
- GGUF Q8_0 quantized version (higher quality, larger size)
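
For the GGUF variants, a runtime such as `llama-cpp-python` can load the quantized file directly. The sketch below is a minimal example; the local filename is a placeholder, so substitute the actual quantized artifact published for this model.

```python
# Minimal sketch, assuming llama-cpp-python is installed and the Q4_K_M GGUF
# file has been downloaded locally (the filename below is a placeholder).
from llama_cpp import Llama

llm = Llama(
    model_path="neuraltau-functions-3b-v1.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Your instruction or question here"}],
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
```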

## Training Details
- Base Model: unsloth/Llama-3.2-3B-Instruct
- Training Dataset: [0xroyce/NeuralTau-With-Functions-chat](https://huggingface.co/datasets/0xroyce/NeuralTau-With-Functions-chat)
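
The training data can be inspected with the Hugging Face `datasets` library. This is a minimal sketch; the `train` split name is an assumption.

```python
# Minimal sketch for browsing the training data; assumes a "train" split exists.
from datasets import load_dataset

ds = load_dataset("0xroyce/NeuralTau-With-Functions-chat", split="train")
print(ds[0])  # show one chat-formatted training example
```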

## Usage

The model uses the Llama 3 chat template inherited from its Llama-3.2-3B-Instruct base. You can interact with it using a standard messages list:

```python
messages = [
    {"role": "user", "content": "Your instruction or question here"},
]
```
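
For end-to-end generation, a minimal sketch with `transformers` is shown below; the repository id is an assumption, so replace it with the actual model id for this release.

```python
# Minimal sketch, assuming the model is published under a repo id like the one
# below (placeholder) and that the tokenizer ships a Llama 3 chat template.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "0xroyce/NeuralTau-Functions-3B-v1"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "user", "content": "Your instruction or question here"},
]

# Build the prompt from the chat template and generate a reply.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```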


Function calling example:
```
>>> how do i do a function for weather? use <tool_call> </tool_call>
<tool_call>
{"arguments": {"location": "Los Angeles", "time_period": "current"}, "name": "get_weather_data"}
</tool_call>
```
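
When the model emits a `<tool_call>` block like the one above, the JSON payload can be extracted and dispatched to your own function. A minimal parsing sketch follows; the `get_weather_data` handler itself is hypothetical and left to the caller.

```python
import json
import re

# Example model output containing a <tool_call> block.
output = (
    '<tool_call>\n'
    '{"arguments": {"location": "Los Angeles", "time_period": "current"}, '
    '"name": "get_weather_data"}\n'
    '</tool_call>'
)

# Extract the JSON between the <tool_call> tags and parse it.
match = re.search(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", output, re.DOTALL)
if match:
    call = json.loads(match.group(1))
    print(call["name"])       # get_weather_data
    print(call["arguments"])  # {'location': 'Los Angeles', 'time_period': 'current'}
```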

## Model Capabilities
- Understanding and following complex instructions
- Providing detailed explanations and analysis
- Breaking down complex topics into understandable components
- Function-like operations and systematic problem-solving
- Maintaining context in multi-turn conversations
- Generating clear and structured responses

## License
This model is a derivative of Llama-3.2-3B-Instruct and is subject to the Llama 3.2 Community License.