
# Talk-Tuah-1

Talk-Tuah-1 is an 80-million-parameter GPT trained on the entirety of Hailey Welch's inspirational podcast 'Talk Tuah'. This SOTA frontier model was trained on 13 hours of 'Talk Tuah' audio, on a single A100 for one hour. The rationale: the discourse in 'Talk Tuah' is the most enlightened media any human has created, so the model should outperform every other LLM on every benchmark. With sufficient training and additional compute, Talk-Tuah-1 can outperform OpenAI's and Anthropic's flagship models, o3 and Sonnet. The architecture was adapted from Andrej Karpathy's nanoGPT.
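
The README does not publish the model's hyperparameters, so the sketch below is only illustrative: a nanoGPT-style `GPTConfig` with assumed values (`n_layer=10`, `n_head=10`, `n_embd=640`, GPT-2 vocabulary) chosen purely because they land near the stated ~80M parameter count.

```python
# Hypothetical nanoGPT-style config. These hyperparameters are NOT from the
# actual model; they are assumptions picked to approximate ~80M parameters.
from dataclasses import dataclass

@dataclass
class GPTConfig:
    block_size: int = 1024   # context length
    vocab_size: int = 50257  # GPT-2 BPE vocabulary
    n_layer: int = 10        # transformer blocks
    n_head: int = 10         # attention heads
    n_embd: int = 640        # embedding / hidden dimension

def approx_params(cfg: GPTConfig) -> int:
    # Rough GPT parameter count: ~12 * n_layer * n_embd^2 for the attention
    # and MLP blocks, plus the token embedding matrix.
    return 12 * cfg.n_layer * cfg.n_embd**2 + cfg.vocab_size * cfg.n_embd

print(f"~{approx_params(GPTConfig()) / 1e6:.0f}M parameters")  # ~81M
```

With these assumed dimensions the rough count comes to about 81M, close to the advertised 80M; any other (n_layer, n_embd) pair satisfying the same formula would fit equally well.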