littleworth committed on
Commit 7d20076
1 Parent(s): d0c762f

Update README.md

Files changed (1)
  1. README.md +3 -0
README.md CHANGED
@@ -23,6 +23,9 @@ This model card describes the distilled version of ProtGPT2, referred to as `pro
  ### Performance
  The distilled model, `protgpt2-distilled-tiny`, exhibits a significant improvement in inference speed—up to 6 times faster than the pretrained version—while maintaining comparable perplexities.
 
+ ![Running time](https://images.mobilism.org/?di=Y7IS2NH7)
+
+
  ### Use Cases
  The distilled version of ProtGPT2 is particularly useful in scenarios where efficiency and speed are crucial without significant compromise on performance. Here are three use cases:
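
The speedup claimed in the context lines above can be sanity-checked with a quick timing script. Below is a minimal sketch, assuming `littleworth/protgpt2-distilled-tiny` and `nferruz/ProtGPT2` are the Hub ids of the distilled and original checkpoints, and borrowing ProtGPT2's commonly cited sampling setting (`top_k=950`); adjust ids and generation parameters to your setup.

```python
# Hedged timing sketch: compares generation speed of the distilled model
# against the original ProtGPT2 using the standard transformers API.
import time

from transformers import AutoModelForCausalLM, AutoTokenizer


def time_generation(repo_id: str, n_runs: int = 5) -> float:
    """Return mean seconds per generated sequence for a given checkpoint."""
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)
    model.eval()

    # ProtGPT2-style prompt; sequences start from the end-of-text token.
    inputs = tokenizer("<|endoftext|>", return_tensors="pt")
    start = time.perf_counter()
    for _ in range(n_runs):
        model.generate(**inputs, max_length=100, do_sample=True, top_k=950)
    return (time.perf_counter() - start) / n_runs


# Assumed Hub ids for the distilled and original checkpoints.
distilled = time_generation("littleworth/protgpt2-distilled-tiny")
original = time_generation("nferruz/ProtGPT2")
print(f"distilled: {distilled:.2f}s/seq, original: {original:.2f}s/seq")
```

Wall-clock ratios will vary with hardware, batch size, and generation length, so treat the result as a rough check of the reported speedup rather than a precise benchmark.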