---
license: apache-2.0
---

A self-trained microscopic OLMo model, with around 2B parameters.

The tokenizer is the one from [allenai/OLMo-1B-hf](https://huggingface.co/allenai/OLMo-1B-hf).

The model is being trained on around 400B tokens; this checkpoint is from step 27.6k.

Evaluation is in progress.
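A minimal usage sketch with the `transformers` library, assuming the checkpoint is loadable via `AutoModelForCausalLM` like other OLMo-hf models. The `model_id` below is a hypothetical placeholder (this repository's actual id is not stated here); the tokenizer id is the one noted above.

```python
# Sketch: load this checkpoint and generate text (assumptions noted above).
model_id = "DrNicefellow/..."  # hypothetical placeholder -- use this repo's actual id
tokenizer_id = "allenai/OLMo-1B-hf"  # tokenizer source, per the note above

if __name__ == "__main__":
    # Imports kept here so the ids above can be inspected without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer("The capital of France is", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```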

## License

This model is available under the Apache 2.0 License.

## Discord Server

Join our Discord server here.

## Feeling Generous? 😊

Eager to buy me a cup of $2 coffee or iced tea? 🍵☕ Sure, here is the link: https://ko-fi.com/drnicefellow. Please add a note saying which one you'd like me to drink.