Ali-C137
posted an update Feb 8
Hi friends, I'm happy to share a tool I built about a week ago: the "LLM Training Cost Calculator", now available on Hugging Face Spaces! This interactive Gradio app provides an easy-to-use interface for estimating the training costs of large language models (LLMs).

(I was asked to provide a report on the cost of fine-tuning each model, so I took the lazy route and built a tool for it; Prof can later choose whatever config he likes 😆)

πŸ” But Why this is important?
As LLMs continue to grow in size and complexity, understanding the computational and financial requirements is crucial for planning and managing AI projects. I believe this tool simplifies this process, giving you insights into potential expenses based on the number of parameters and tokens in your dataset.

🌟 Features:
- Input the number of parameters (in billions) and tokens (in trillions).
- Adjust for GPU utilization rates and overhead costs.
- Get an instant estimate of your training costs.
- Choose your GPU (A100 80GB PCIe, A100 80GB SXM, V100, H100 SXM, H100 PCIe).
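To give a feel for how such an estimate can be derived (this is a sketch of the general approach, not necessarily the app's exact formula), here is a minimal Python version based on the common C ≈ 6·N·D FLOPs rule of thumb. The GPU peak throughput, hourly price, utilization, and overhead defaults below are illustrative assumptions you would adjust in the app:

```python
def estimate_training_cost(
    params_billion: float,            # model size N, in billions of parameters
    tokens_trillion: float,           # dataset size D, in trillions of tokens
    gpu_peak_tflops: float = 312.0,   # assumed: A100 80GB dense BF16 peak
    gpu_hourly_usd: float = 1.8,      # assumed cloud price per GPU-hour
    utilization: float = 0.4,         # realistic fraction of peak FLOPs achieved
    overhead: float = 1.10,           # +10% for checkpoints, restarts, eval, etc.
) -> tuple[float, float]:
    """Return (gpu_hours, cost_usd) using the C ~ 6*N*D training-FLOPs heuristic."""
    total_flops = 6.0 * params_billion * 1e9 * tokens_trillion * 1e12
    effective_flops_per_sec = gpu_peak_tflops * 1e12 * utilization
    gpu_hours = total_flops / effective_flops_per_sec / 3600.0
    cost_usd = gpu_hours * gpu_hourly_usd * overhead
    return gpu_hours, cost_usd

# Example: a 7B-parameter model trained on 1T tokens
hours, cost = estimate_training_cost(7, 1.0)
print(f"~{hours:,.0f} GPU-hours, ~${cost:,.0f}")
```

Higher utilization or a faster GPU (e.g. H100) scales the estimate down linearly, which is exactly the trade-off the calculator lets you explore.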

πŸ“ˆ Coming Soon:
Plans are in place to expand the calculator's capabilities to include fine-tuning costs for models using LoRA or QLoRA. You'll be able to input a model ID from the Hugging Face Hub, select your fine-tuning strategy, and specify quantization details if using QLoRA.
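As a rough preview of why LoRA changes the picture, the adapters only add two low-rank matrices per targeted weight, so the trainable-parameter count collapses. A hedged back-of-the-envelope sketch (the function name, default of 4 targeted projections per layer, and the example dimensions are my own assumptions, not the planned feature's implementation):

```python
def lora_trainable_params(hidden_size: int, num_layers: int, rank: int,
                          targets_per_layer: int = 4) -> int:
    """Rough count of LoRA adapter parameters: each targeted d x d weight
    gets A (d x r) and B (r x d), i.e. 2 * d * r trainable parameters."""
    per_target = 2 * hidden_size * rank
    return per_target * targets_per_layer * num_layers

# Example: a 7B-class model (hidden 4096, 32 layers) with rank 16
n = lora_trainable_params(4096, 32, 16)
print(f"{n / 1e6:.1f}M trainable parameters")  # ~16.8M vs ~7B frozen
```

Only these adapters (and their optimizer states) are updated, which is what makes LoRA and QLoRA fine-tuning so much cheaper in memory, even though forward/backward passes still flow through the frozen base weights.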

I believe this tool will be a valuable asset to the AI community, helping to plan and allocate resources more effectively πŸ€—.

Should you have any suggestions or feedback, please don't hesitate to contribute your thoughts in the comments below. Together, we can refine and enhance this resource for all.

πŸ”— Try it here : Ali-C137/LLM-Training-Cost-Calculator

PS: All thanks to Gradio, Hugging Face, and the community ofc 🔥 😉