Titus von Koeller

AI & ML interests

NN Quantization, Generative AI, LLMs, alignment, algorithms for social justice, ethical humanism, mitigating gender bias, audio compression, AGI

Titus-von-Koeller's activity

posted an update about 2 months ago
🔥 Level up your model training w/ GaLore + Transformers for SOTA results on consumer-grade hardware!

โฌ‡๏ธ 82.5% less optimizer state memory footprint without performance degradation by expressing the gradient weight matrix as low rank.

👩🏿‍💻 Install via pip install "transformers>=4.39.0" galore-torch (quote the version spec so your shell doesn't treat > as a redirect). #ProudlyGpuPoor

The integration of GaLore into the training of large language models (LLMs) marks a significant advancement in the field of deep learning, particularly in terms of memory efficiency and the democratization of AI research. By allowing for the training of billion-parameter models on consumer-grade hardware, reducing memory footprint in optimizer states, and leveraging advanced projection matrix techniques, GaLore opens new horizons for researchers and practitioners with limited access to high-end computational resources.
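
To make the mechanism concrete, here is a rough NumPy sketch of the low-rank gradient projection idea behind GaLore. This is a conceptual illustration only, not the galore-torch API; the matrix sizes and rank are arbitrary:

```python
import numpy as np

# Conceptual sketch of GaLore-style gradient low-rank projection
# (illustration only -- not the galore-torch API; sizes are arbitrary).
rng = np.random.default_rng(0)
m, n, r = 1024, 1024, 64

G = rng.standard_normal((m, n))       # full weight-gradient matrix

# Projector built from the top-r left singular vectors of the gradient.
U, _, _ = np.linalg.svd(G, full_matrices=False)
P = U[:, :r]                          # m x r projection matrix

G_low = P.T @ G                       # r x n low-rank gradient
# ...optimizer state (e.g. Adam moments) lives in this r x n space...
update = P @ G_low                    # project the step back to m x n

# Optimizer state shrinks from m*n entries to r*n (plus the m*r projector).
saving = 1 - (r * n + m * r) / (m * n)
print(f"optimizer-state reduction: {saving:.1%}")  # -> 87.5% at these sizes
```

The exact savings depend on the layer shapes and the chosen rank; the 82.5% figure above comes from the configuration benchmarked in the blog post.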

🔬 Find out more about GaLore and investigate lots of juicy technical details: https://huggingface.co/blog/galore

🤗 Huge thanks to everyone involved ❤️:

• authors: @jiaweizhao @Kyriection @beidic Zhangyang Wang @animakumar @tydsh
• community contributors: @hiyouga @mdouglas and others!
• @ybelkada for taking such swift action in composing and coordinating the necessary PRs to get this live at ⚡ speed!

๐Ÿ—๏ธ๐Ÿ“ˆ Super rewarding to see how @timdettmers work with optimizers is being built upon to achieve even greater heights!

🚧 There is also ongoing work to integrate GaLore into bitsandbytes and optimize memory efficiency even further 💪. We'll keep you posted!
posted an update 2 months ago
We just released bitsandbytes==0.43.0 📦, with these significant new additions:

‣ 🛫 FSDP+QLoRA support (alpha release)
◦ now anyone with 2 powerful gaming GPUs can fine-tune 70B-param models at home!
◦ in collab with Jeremy Howard + team @ answer.ai
◦ answer.ai blog post: https://www.answer.ai/posts/2024-03-06-fsdp-qlora.html
◦ example repo: https://github.com/AnswerDotAI/fsdp_qlora/

‣ 🌈⊞ Official Windows support
◦ now a simple pip install "bitsandbytes>=0.43.0" away

‣ 📄 Huge docs update:
◦ https://huggingface.co/docs/bitsandbytes/main
◦ Be sure to check out the optimizers and the API docs
◦ ... even more upcoming ...
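
For readers new to the library: a core building block in bitsandbytes is blockwise absmax quantization, which stores tensors in 8 bits with one scale per small block. Here is a rough NumPy sketch of that idea (an illustration only, not the library's actual internals or API):

```python
import numpy as np

# Rough sketch of blockwise absmax int8 quantization, the idea behind
# bitsandbytes-style quantized storage (not the library's actual internals).
def quantize_blockwise(x, block_size=64):
    blocks = x.reshape(-1, block_size)
    absmax = np.abs(blocks).max(axis=1, keepdims=True)  # one scale per block
    q = np.round(blocks / absmax * 127).astype(np.int8)
    return q, absmax

def dequantize_blockwise(q, absmax):
    return (q.astype(np.float32) / 127) * absmax

x = np.random.default_rng(0).standard_normal(4096).astype(np.float32)
q, absmax = quantize_blockwise(x)
x_hat = dequantize_blockwise(q, absmax).reshape(-1)

# int8 storage + one fp32 scale per 64-value block, instead of fp32 everywhere.
print("max abs error:", np.abs(x - x_hat).max())
```

Per-block scales keep the quantization error small even when a few outlier values would otherwise dominate a single global scale.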

Under the hood, we have many other improvements, thanks to extensive maintenance activity, community contributions from super active + knowledgeable volunteers ✨ 🚀, and the official sponsorship by Hugging Face that makes all this possible 🤗 ❤️ 🌍

We would greatly appreciate further community contributions, be it help with refactorings, exterminating flaky tests, writing doc-strings, tutorials, or new features. Don't be shy, just contact us and we'll see where this leads us:
https://github.com/TimDettmers/bitsandbytes/discussions

Have a great weekend everyone!
posted an update 3 months ago
Exciting news for bitsandbytes! We're thrilled to announce the release of the initial version of our new documentation: https://huggingface.co/docs/bitsandbytes/main/en/index

Please let us know what you think: your feedback is essential to us, and we would greatly appreciate any insights on how we can further enhance the docs. Even better, we'd be happy to merge your contributions filling in some blanks: doc-strings especially are still a big topic, and there are several placeholders that would be super helpful to have filled in. Please post your feedback here: https://github.com/TimDettmers/bitsandbytes/discussions/1090

Since taking over maintenance together with Younes Belkada, and since Hugging Face graciously agreed to support the library, we've already made enormous strides, and community contributions have sprung back to life: it's so motivating to have so many knowledgeable contributors who often invest extensive free time and bring their unique ideas to the table.

A notable example is our ongoing effort to enable cross-platform support, including Intel, Apple Silicon, AMD, and Windows. Simultaneously, we're working diligently to streamline community contributions to BNB, making the process more accessible for everyone. A heartfelt thank you to all who have contributed thus far!

With Hugging Face committed to supporting bitsandbytes going forward, we're sure to promptly respond to and integrate additional community contributions.

Looking forward to growing bitsandbytes further as part of the FOSS community: pushing forward the state of the art in the democratization of AI!