---
library_name: transformers
tags: []
---

# Model Card for Model ID

This model can be used to get the symptoms of a particular disease.

## Model Details

### Model Description

DistilGPT2 (short for Distilled-GPT2) is an English-language model pre-trained with the supervision of the smallest version of Generative Pre-trained Transformer 2 (GPT-2). Like GPT-2, DistilGPT2 can be used to generate text.
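
Below is a minimal usage sketch with the `transformers` library, assuming the model is a DistilGPT2-style causal language model published on the Hugging Face Hub; the model ID and the prompt format are hypothetical placeholders, not values taken from this card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical model ID placeholder; replace with the actual Hub repo name.
model_id = "your-username/disease-symptoms-distilgpt2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Assumed prompt format: ask for the symptoms of a disease and let the model continue.
prompt = "Symptoms of influenza:"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```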