7B Base Models Collection
All current base models have 32k/128k context lengths and ship in FP16.
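As a rough illustration of how one of these FP16 base models might be loaded with the Hugging Face transformers library (the repository id below is a placeholder, not an actual model in this collection):

```python
# Minimal sketch: loading one of these base models in FP16 with transformers.
# The repository id is a placeholder, not a real model path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-org/your-7b-base-model"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # models in this collection ship as FP16
    device_map="auto",          # place weights on available GPU(s); needs accelerate
)
```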
This model will focus on learning Wikipedia-style knowledge, along with text corpora of books, historical data, religious texts, and similar material. Future iterations will be similarly minded LLMs.
It is also a very high-scoring model.
This model will also become more of a thinking model, i.e., reasoning through chains of thought and explicit steps, perhaps even imagining a task before performing it.
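As a loose sketch of what chain-of-thought style prompting against one of these base models could look like (assuming the model and tokenizer from the earlier snippet; the example question and prompt wording are made up):

```python
# Minimal sketch of chain-of-thought style prompting against a base model.
# Assumes `model` and `tokenizer` were loaded as in the snippet above.
prompt = (
    "Question: A library has 120 books and lends out 45. How many remain?\n"
    "Let's think step by step before giving the final answer.\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,   # room for intermediate reasoning steps
    do_sample=False,      # greedy decoding for a deterministic trace
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```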