Change use_cache to True, which significantly speeds up inference (#2) · ca45eff · ehartford, TheBloke · committed on May 5, 2023
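The use_cache flag this commit flips lives in the repo's config.json and enables the key/value cache during generation. A minimal sketch of what that setting amounts to when loading the model with the standard transformers API (the explicit config override is only for illustration, since the committed config.json already sets it):

```python
# Sketch: load the model with the key/value cache enabled so generation
# reuses past attention states instead of recomputing them each step.
from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained("ehartford/WizardLM-7B-Uncensored")
config.use_cache = True  # the setting this commit changes in config.json

model = AutoModelForCausalLM.from_pretrained(
    "ehartford/WizardLM-7B-Uncensored", config=config
)
```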
Upload fast tokenizer, which is now the recommended default for transformers (#1) · 278bf3c · ehartford, TheBloke · committed on May 5, 2023
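A minimal sketch of loading the fast (Rust-backed) tokenizer this commit uploads; use_fast=True is already the transformers default and is shown explicitly only for clarity:

```python
# Sketch: load the fast tokenizer backed by the uploaded tokenizer.json.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "ehartford/WizardLM-7B-Uncensored", use_fast=True
)
print(tokenizer.is_fast)  # True when the fast tokenizer is loaded
```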
Merge branch 'main' of https://huggingface.co/ehartford/WizardLM-7B-Uncensored into main · f871962 · Ubuntu · committed on May 4, 2023