{
    "framework": "pytorch",
    "task": "text-generation",
    "model": {
        "type": "Atom-13B"
    },
    "pipeline": {
        "type": "Atom-7B-pipe"
    },
    "allow_remote": true
}