Apply for community grant: Personal project (gpu and storage)

#1
by fb700 - opened
This is an intensively fine-tuned chatglm-6b model, built by AI user 帛凡 (fb700) through reinforced training on nearly 1 million data samples.

Through this training, its Chinese summarization ability exceeds every version of GPT-3.5, and its health-consultation quality is outstanding among models of the same parameter scale. Performance improves greatly over the original model, with major breakthroughs in the academic domain and especially in context length: the maximum supported length goes far beyond 4K, 8K, or 16K, approaching unlimited.

The training also gave us a deeper understanding of the model: LLMs keep evolving, and good methods and data can unlock more of a model's potential. This successful run shows that a smaller model like the 6B can, in a specific domain, reach the capability of models with hundreds of billions of parameters, which may make it the model of choice for individuals and small-to-medium enterprises.

Model link: https://huggingface.co/fb700/chatglm-fitness-RLHF
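For anyone who wants to try the model, here is a minimal sketch of loading it with the `transformers` library. This assumes `transformers` is installed and a CUDA GPU is available; the `chat()` helper follows the standard ChatGLM API that this model family ships via its custom remote code.

```python
REPO_ID = "fb700/chatglm-fitness-RLHF"  # repo id from the link above


def load_model(repo_id: str = REPO_ID):
    """Load the tokenizer and model; imports are deferred so the sketch
    can be read (and imported) without transformers installed."""
    from transformers import AutoModel, AutoTokenizer

    # ChatGLM models ship custom modeling code, so trust_remote_code is required.
    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    # half() + cuda() keeps the 6B model within a single consumer GPU's memory.
    model = AutoModel.from_pretrained(repo_id, trust_remote_code=True).half().cuda()
    return tokenizer, model.eval()


if __name__ == "__main__":
    tokenizer, model = load_model()
    # ChatGLM exposes a chat() helper that returns (reply, updated history).
    response, history = model.chat(tokenizer, "你好", history=[])
    print(response)
```

On CPU-only machines, replacing `.half().cuda()` with `.float()` is the usual fallback, at the cost of much slower generation.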


Hi @fb700 , we have assigned a GPU to this Space. Note that GPU grants are provided temporarily and might be removed after some time if usage is very low.

To learn more about GPUs in Spaces, please check out https://huggingface.co/docs/hub/spaces-gpus
