colossal-llama-2-7b-base-gguf / configuration.json
{
  "framework": "pytorch",
  "task": "text-generation",
  "model": {
    "type": "llama2"
  },
  "pipeline": {
    "type": "colossal-llama-2-7b-base-text-generation-pipe"
  }
}
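The file above is a ModelScope-style model configuration: it declares the training framework, the task the model serves, the model family, and the pipeline type used to run inference. A minimal sketch of how such a file can be parsed and sanity-checked with the Python standard library (the field names come from the config itself; the check logic is illustrative, not part of any official loader):

```python
import json

# The configuration shown above, embedded verbatim for illustration.
config_text = """
{
  "framework": "pytorch",
  "task": "text-generation",
  "model": {
    "type": "llama2"
  },
  "pipeline": {
    "type": "colossal-llama-2-7b-base-text-generation-pipe"
  }
}
"""

config = json.loads(config_text)

# Basic sanity checks on the fields a loader would typically consult.
assert config["framework"] == "pytorch"
assert config["task"] == "text-generation"
assert config["model"]["type"] == "llama2"

print(config["pipeline"]["type"])
```

In a real ModelScope deployment this file sits in the model repository root and the framework reads it to decide which pipeline class to instantiate for the declared task; the snippet only demonstrates the structure of the JSON itself.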