Fix RuntimeError from in-place operation when using transformers for training (d050322, verified; kingsley01, committed on Sep 18)
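The commit's diff isn't shown here. As a general illustration (not the actual InternLM fix), the usual cure for this class of RuntimeError in PyTorch training code is to replace an in-place tensor op with an out-of-place one, so autograd's saved values are not clobbered:

```python
import torch

def scale_logits(logits: torch.Tensor, temperature: float) -> torch.Tensor:
    # An in-place version (`logits /= temperature`) on a leaf tensor that
    # requires grad raises:
    #   RuntimeError: a leaf Variable that requires grad is being used
    #   in an in-place operation.
    # Returning a fresh tensor keeps the computation graph intact.
    return logits / temperature

x = torch.ones(3, requires_grad=True)
y = scale_logits(x, 2.0)
y.sum().backward()  # succeeds: x itself was never modified in place
```

Here `scale_logits` and `temperature` are hypothetical names for the sketch; the point is only the out-of-place `/` versus in-place `/=` distinction.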
feat: add eos_token_id to generation_config.json (needed for vLLM inference) (#12) (989a689, verified; czczup, wxsm, committed on Aug 22)
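A plausible shape of that change, with an illustrative token id only (the real id depends on this model's tokenizer): inference engines such as vLLM read `eos_token_id` from `generation_config.json` to know when to stop generating.

```json
{
  "eos_token_id": 2
}
```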
Fix "InternLM2ForCausalLM does not support Flash Attention 2.0 yet" (#3) (743a544, verified; czczup, kosung, committed on Jul 7)