buruzaemon committed
Commit: 71332ad
Parent(s): 2266291
Update README.md
README.md CHANGED
@@ -34,7 +34,7 @@ It achieves the following results on the evaluation set:
 
 ## Model description
 
-This is an initial example of knowledge-distillation where the student loss is all cross-entropy loss \\(L_{CE}\\) of the ground-truth labels and none of the distillation loss \\(L_{KD}\\).
+This is an initial example of knowledge-distillation where the student loss is all cross-entropy loss \\(L_{CE}\\) of the ground-truth labels and none of the knowledge-distillation loss \\(L_{KD}\\).
 
 ## Intended uses & limitations
 
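
The sentence above describes a student trained on hard labels only: in the usual knowledge-distillation objective the student minimizes a weighted sum \\(L = \alpha L_{CE} + (1 - \alpha) L_{KD}\\), and the setup here corresponds to \\(\alpha = 1\\). Below is a minimal sketch of that weighting, assuming a PyTorch setup with a KL-divergence soft-label term; the function name `distillation_loss` and the `alpha`/`temperature` parameters are illustrative and not taken from this model's training code.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      alpha=1.0, temperature=2.0):
    """Weighted sum of hard-label cross-entropy and soft-label KD loss.

    With alpha=1.0 (as described in the model card), only the
    cross-entropy term L_CE against the ground-truth labels is used
    and the distillation term L_KD contributes nothing.
    """
    # Hard-label loss L_CE against the ground-truth labels
    ce_loss = F.cross_entropy(student_logits, labels)

    # Soft-label loss L_KD: KL divergence between temperature-scaled
    # teacher and student distributions, scaled by T^2
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    return alpha * ce_loss + (1.0 - alpha) * kd_loss
```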