This model is `cardiffnlp/twitter-roberta-base` further fine-tuned on emoji classification.
Also tried `google/vit-base-patch32-384` and `google/vit-base-patch16-384` for the vision models, but results were inconclusive.
## Result Interpretation (Warning)
It would be wrong to claim that this model learned 'semantic reasoning' between sentence and image features.
More likely, it learned a mapping between sentence sentiment and image occurrence statistics, because the same set of GIF images repeats across the dataset, paired with different sentences.
That is not to say that learning such semantic relations is infeasible with this model; it is well worth pursuing in the future with a larger, better-constructed dataset.
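The repetition is easy to quantify: when each GIF id is paired with many different sentences, the contrastive objective can be satisfied by matching sentiment-level statistics alone. A toy sketch, where the `pairs` list is a hypothetical stand-in for the real dataset:

```python
from collections import Counter

# Hypothetical (sentence, gif_id) pairs standing in for the real dataset
pairs = [("lol", "g1"), ("haha nice", "g1"), ("so sad", "g2"),
         ("crying rn", "g2"), ("congrats!", "g3")]

# Count how often each GIF appears across distinct sentences
gif_counts = Counter(gif_id for _, gif_id in pairs)
repeated = {g: c for g, c in gif_counts.items() if c > 1}
# g1 and g2 each pair with multiple distinct sentences
```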
### Training Logs
Training logs can be found [here](https://wandb.ai/cceyda/flax-clip?workspace=user-cceyda)
I will definitely be trying out training a similar model for emoji & meme data.

Training CLIP is just the first step; with a well-trained CLIP, generation is within reach.
# How to use
The final model is available [here](https://huggingface.co/ceyda/clip-reply).
```py
from model import FlaxHybridCLIP  # model definition, see the demo repo

model = FlaxHybridCLIP.from_pretrained("ceyda/clip-reply")
```
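At inference time, picking a reply GIF reduces to cosine similarity between the text embedding and precomputed GIF embeddings. A minimal NumPy sketch; the `best_reply` helper is illustrative, and the toy vectors stand in for the model's actual text/image features:

```python
import numpy as np

def best_reply(text_emb, gif_embs):
    # Normalize so dot products become cosine similarities
    t = text_emb / np.linalg.norm(text_emb)
    g = gif_embs / np.linalg.norm(gif_embs, axis=1, keepdims=True)
    sims = g @ t
    return int(np.argmax(sims)), sims

# Toy stand-in embeddings (the real ones come from the CLIP model)
text = np.array([1.0, 0.0])
gifs = np.array([[0.9, 0.1],
                 [0.0, 1.0]])
idx, sims = best_reply(text, gifs)
# idx == 0: the first GIF is the closest reply
```

In practice the GIF embeddings would be computed once over the whole GIF set and cached, so each query costs only one text forward pass plus a matrix-vector product.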
# Demo
https://huggingface.co/spaces/flax-community/clip-reply-demo