The model is not responding.

#13 by PhelixZhen

The model is not responding. I loaded it locally with text-generation-webui, but when I try to chat with it in the chat tab, it produces no output. Here are the settings I used when loading it:

The AutoGPTQ params are: {'model_basename': 'model', 'device': 'cuda:0', 'use_triton': True, 'inject_fused_attention': True, 'inject_fused_mlp': True, 'use_safetensors': True, 'trust_remote_code': False, 'max_memory': {0: '23700MiB', 1: '20200MiB', 'cpu': '51400MiB'}, 'quantize_config': None, 'use_cuda_fp16': True, 'disable_exllama': False}
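
As a sanity check, it can help to load the same quantized model directly with AutoGPTQ outside the webui and ask for a short completion. The sketch below is only an assumption of how that might look: the model directory path is a placeholder, and the arguments simply mirror the settings listed above.

```python
# Minimal sketch: load the GPTQ model directly with AutoGPTQ, bypassing
# text-generation-webui, to see whether generation works at all.
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

model_dir = "/path/to/model-dir"  # placeholder: local directory holding model.safetensors

tokenizer = AutoTokenizer.from_pretrained(model_dir, use_fast=True)
model = AutoGPTQForCausalLM.from_quantized(
    model_dir,
    model_basename="model",   # same basename as in the webui settings above
    device="cuda:0",
    use_safetensors=True,
    use_triton=True,          # mirrors use_triton=True from the webui params
    trust_remote_code=False,
)

prompt = "Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

If this prints a sensible completion, the weights and quantization are fine and the issue is more likely in the webui's chat template or generation settings; if it also produces nothing, the problem is in the model load itself.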
