prasadsachin committed
Commit 808ebc9
1 Parent(s): 2c27dae
Update README.md
README.md CHANGED
@@ -1,5 +1,10 @@
 ---
 library_name: keras-hub
+license: apache-2.0
+language:
+- en
+tags:
+- text-classification
 ---
 ## Model Overview
 DistilBert is a set of language models published by HuggingFace. They are efficient, distilled versions of BERT, and are intended for classification and embedding of text, not for text generation. See the model card below for benchmarks, data sources, and intended use cases.
@@ -142,4 +147,4 @@ classifier = keras_hub.models.DistilBertClassifier.from_preset(
 preprocessor=None,
 )
 classifier.fit(x=features, y=labels, batch_size=2)
-```
+```
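The second hunk shows only the tail of the README's fine-tuning snippet. For reference, a minimal sketch of the full pattern it belongs to (fine-tuning `DistilBertClassifier` on pre-tokenized data with `preprocessor=None`), following the KerasHub docs; the `distil_bert_base_en_uncased` preset, `num_classes=4`, and the dummy `features`/`labels` are assumptions, not part of this commit:

```python
import numpy as np
import keras_hub

# With preprocessor=None the classifier expects pre-tokenized inputs:
# token IDs plus a padding mask. Shapes and values here are dummy
# placeholders, not taken from the commit.
features = {
    "token_ids": np.ones(shape=(2, 12), dtype="int32"),
    "padding_mask": np.array([[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0]] * 2),
}
labels = [0, 3]  # integer class labels, consistent with num_classes=4

# Preset name and num_classes are assumptions from the KerasHub docs.
classifier = keras_hub.models.DistilBertClassifier.from_preset(
    "distil_bert_base_en_uncased",
    num_classes=4,
    preprocessor=None,
)
classifier.fit(x=features, y=labels, batch_size=2)
```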