Update README.md with new model card content
README.md CHANGED
````diff
@@ -1,7 +1,7 @@
 ---
 library_name: keras-hub
 ---
-
+## Model Overview
 DistilBert is a set of language models published by HuggingFace. They are efficient, distilled versions of BERT, intended for classification and embedding of text, not for text generation. See the model card below for benchmarks, data sources, and intended use cases.
 
 Weights and Keras model code are released under the [Apache 2 License](https://github.com/keras-team/keras-hub/blob/master/LICENSE).
@@ -35,7 +35,7 @@ The following model checkpoints are provided by the Keras team. Full code exampl
 | distil_bert_base_en | 65.19M | 6-layer model where case is maintained. |
 | distil_bert_base_multi | 134.73M | 6-layer multilingual model where case is maintained. |
 
-
+## Example Usage
 ```python
 import keras
 import keras_hub
````
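
The second hunk cuts off right after the imports, so the README's actual example code is not visible in this diff. A minimal sketch of how a keras-hub classification example typically continues, using the `distil_bert_base_en` preset from the table above (`TextClassifier.from_preset`, the two-class head, and the toy data are assumptions for illustration, not the README's code):

```python
import keras_hub

# Load a DistilBert classifier from a preset, attaching a randomly
# initialized classification head (num_classes=2 is an illustrative choice).
classifier = keras_hub.models.TextClassifier.from_preset(
    "distil_bert_base_en",
    num_classes=2,
)

# Task models bundle a matching preprocessor, so they accept raw strings.
classifier.fit(
    x=["The movie was great!", "Total waste of time."],
    y=[1, 0],
    batch_size=2,
)
classifier.predict(["A thoroughly enjoyable read."])
```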