diff --git "a/README.md" "b/README.md"
--- "a/README.md"
+++ "b/README.md"
@@ -1,160 +1,85 @@
-
-
-
-
-
-

-
-
-
-

-
-

-
-
-
-
-
-
+---
+language: en
+thumbnail: https://www.huggingtweets.com/v23242526/1638049876119/predictions.png
+tags:
+- huggingtweets
+widget:
+- text: "My dream is"
+---
+
+
+
+
+🤖 AI BOT 🤖
+
+v
+
+@v23242526
+
-
-
+I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
-
+Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
-
+## How does it work?
-
-
-
+The model uses the following pipeline.
-
-
-
-
+
+
+To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
+
+## Training data
+
+The model was trained on tweets from v.
+
+| Data | v |
+| --- | --- |
+| Tweets downloaded | 322 |
+| Retweets | 7 |
+| Short tweets | 146 |
+| Tweets kept | 169 |
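The counts above can be reproduced with a simple filtering pass. This is a hypothetical reconstruction of the preprocessing step, not the exact huggingtweets implementation; the "short tweet" cutoff used here is an assumption:

```python
# Hypothetical reconstruction of the tweet-filtering step. The rules and the
# short-tweet cutoff are assumptions, not the exact huggingtweets code.
def filter_tweets(tweets):
    kept, n_retweets, n_short = [], 0, 0
    for text in tweets:
        if text.startswith("RT @"):       # drop retweets
            n_retweets += 1
        elif len(text.split()) < 3:       # drop very short tweets (assumed cutoff)
            n_short += 1
        else:
            kept.append(text)
    return kept, n_retweets, n_short

# The table is internally consistent:
# 322 downloaded - 7 retweets - 146 short tweets = 169 tweets kept.
assert 322 - 7 - 146 == 169
```

Only the tweets that survive this filtering are used for fine-tuning.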
+
+[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/ms3xysdk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
+
+## Training procedure
+
+The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @v23242526's tweets.
+
+Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3gcrzkfj) for full transparency and reproducibility.
+
+At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3gcrzkfj/artifacts) is logged and versioned.
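The fine-tuning step can be sketched as a standard causal-language-modeling loop. This is an illustrative sketch only; the real hyperparameters and training setup are the ones recorded in the W&B run linked above:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative fine-tuning loop, not the actual huggingtweets training code.
tokenizer = AutoTokenizer.from_pretrained('gpt2')
tokenizer.pad_token = tokenizer.eos_token   # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained('gpt2')

# Stand-in for the filtered tweet corpus (placeholder examples).
tweets = ["My dream is to ship this bot.", "Fine-tuning GPT-2 on tweets."]
batch = tokenizer(tweets, return_tensors='pt', padding=True)
labels = batch['input_ids'].clone()
labels[batch['attention_mask'] == 0] = -100  # ignore padding in the loss

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for step in range(2):                        # a couple of illustrative steps
    outputs = model(**batch, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
print(f"final loss: {outputs.loss.item():.3f}")
```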
+
+## How to use
+
+You can use this model directly with a pipeline for text generation:
+
+```python
+from transformers import pipeline
+
+generator = pipeline('text-generation',
+                     model='huggingtweets/v23242526')
+generator("My dream is", num_return_sequences=5)
+```
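For finer control over sampling, the model and tokenizer can also be loaded explicitly. This is a standard `transformers` usage sketch, not part of the original card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

# Load the fine-tuned model and its tokenizer from the Hub.
tokenizer = AutoTokenizer.from_pretrained('huggingtweets/v23242526')
model = AutoModelForCausalLM.from_pretrained('huggingtweets/v23242526')

set_seed(42)  # make sampling reproducible
inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30,
                         do_sample=True, top_p=0.95,
                         pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```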
+
+## Limitations and bias
+
+The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
+
+In addition, the content of the user's tweets further biases the text generated by the model.
+
+## About
+
+*Built by Boris Dayma*
+
+[Follow @borisdayma on Twitter](https://twitter.com/intent/follow?screen_name=borisdayma)
+
+For more details, visit the project repository.
-
-
-
+[GitHub repository: borisdayma/huggingtweets](https://github.com/borisdayma/huggingtweets)