Mayank Mishra

mayank-mishra

AI & ML interests

Large Language Models, Distributed Training and Inference

mayank-mishra's activity

posted an update about 1 month ago
Current LLMs are very susceptible to generating toxic, harmful, and even dangerous content. They can also produce outputs with gender or racial biases.

The Biden-Harris Executive Order (https://www.federalregister.gov/documents/2023/11/01/2023-24283/safe-secure-and-trustworthy-development-and-use-of-artificial-intelligence) sets forth guidelines on what is considered a safe AI system.

Following these guidelines, we present the world's first open-source multilingual language model red-teamed against the Biden-Harris Executive Order: Aurora-M.

The model is trained on 5 languages: English, Hindi, Japanese, Vietnamese and Finnish.

Blog: https://huggingface.co/blog/mayank-mishra/aurora
Paper coming out soon.

Base model: aurora-m/aurora-m-base (not safety-tuned)
Instruct model: aurora-m/aurora-m-instruct (not safety-tuned)
Red-teamed model: aurora-m/aurora-m-biden-harris-redteamed (safety-tuned according to the Executive Order above)
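
For reference, here is a minimal sketch of loading the red-teamed checkpoint with HuggingFace transformers. It assumes the repo exposes a standard causal LM; the prompt and generation settings are purely illustrative:

```python
# Minimal sketch: load the red-teamed checkpoint as a standard causal LM.
# The prompt and generation settings are illustrative, not recommendations.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aurora-m/aurora-m-biden-harris-redteamed"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

inputs = tokenizer("Translate to Hindi: good morning", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```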
replied to their post 2 months ago

Yeah, it's just that people haven't been using this for finetuning, where it can give considerable memory savings. I guess the issue is the core design of HF transformers.

I am planning to release the code for this sometime soon :)

posted an update 2 months ago
I have just published my first blog post.

While FlashAttention has been readily integrated into HuggingFace transformers, there are much larger gains to be had (at least theoretically) when finetuning models on batches whose examples have variable sequence lengths.

For a deeper dive, please read my blog post at https://huggingface.co/blog/mayank-mishra/padding-free-transformer.
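
As a rough sketch of the padding-free idea (not the blog's exact code): pack variable-length sequences back-to-back into one tensor and pass cumulative sequence lengths to FlashAttention's varlen kernel, so no compute or memory is spent on pad tokens. The shapes and lengths below are made up for illustration:

```python
# Sketch of padding-free attention using flash-attn's varlen interface
# (flash-attn >= 2.x). Sequences are packed back-to-back instead of padded,
# and cu_seqlens marks where each example starts, so pad tokens never exist.
import itertools
import torch
from flash_attn import flash_attn_varlen_func

seq_lens = [5, 12, 3]  # per-example lengths in the batch (illustrative)
total, n_heads, head_dim = sum(seq_lens), 8, 64

# Packed (total_tokens, n_heads, head_dim) tensors replace the usual
# padded (batch, max_len, n_heads, head_dim) layout.
q = torch.randn(total, n_heads, head_dim, dtype=torch.float16, device="cuda")
k, v = torch.randn_like(q), torch.randn_like(q)

# Cumulative sequence lengths: [0, 5, 17, 20] for the lengths above.
cu_seqlens = torch.tensor([0] + list(itertools.accumulate(seq_lens)),
                          dtype=torch.int32, device="cuda")

out = flash_attn_varlen_func(
    q, k, v,
    cu_seqlens_q=cu_seqlens, cu_seqlens_k=cu_seqlens,
    max_seqlen_q=max(seq_lens), max_seqlen_k=max(seq_lens),
    causal=True,  # causal mask is applied per example, not across the pack
)
```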