---
language: en
thumbnail: https://www.huggingtweets.com/cheascake/1617656786247/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1378827669790461953/GLEmzCyo_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Eel Enthusiast 🤖 AI Bot </div>
<div style="font-size: 15px">@cheascake bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@cheascake's tweets](https://twitter.com/cheascake).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3248 |
| Retweets | 216 |
| Short tweets | 732 |
| Tweets kept | 2300 |
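
As a rough illustration of where these counts come from, here is a hedged sketch of the kind of filtering applied before training. The `RT @` prefix test and the length cutoff are assumptions for illustration, not the exact huggingtweets rules:

```python
# Hypothetical filtering pass over the raw downloaded tweets
def filter_tweets(raw_tweets, min_length=10):
    kept, retweets, short = [], 0, 0
    for text in raw_tweets:
        if text.startswith("RT @"):      # assumed retweet marker
            retweets += 1
        elif len(text) < min_length:     # assumed "short tweet" cutoff
            short += 1
        else:
            kept.append(text)
    return kept, retweets, short

kept, retweets, short = filter_tweets(
    ["RT @someone: hello", "ok", "Eels are criminally underrated animals"])
print(len(kept), retweets, short)  # 1 1 1
```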

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1pgthrar/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
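
If you want to pull the tracked data programmatically rather than through the web UI, a minimal sketch with the public W&B API (the run path comes from the link above; it assumes you are logged in with `wandb login`):

```python
import wandb

# Open the public run that tracked this model's data
api = wandb.Api()
run = api.run("wandb/huggingtweets/1pgthrar")

# List and download every artifact the run logged
for artifact in run.logged_artifacts():
    print(artifact.name, artifact.type)
    artifact.download()
```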

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @cheascake's tweets.
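
The exact training script lives in the [huggingtweets repository](https://github.com/borisdayma/huggingtweets); as a rough, hedged sketch, fine-tuning GPT-2 on a file of cleaned tweets with the Transformers `Trainer` API looks roughly like this (the file name and hyperparameters are illustrative, not the values used for this model):

```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Start from the pre-trained GPT-2 checkpoint
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# "tweets.txt" is a placeholder: one cleaned tweet per line
dataset = load_dataset("text", data_files={"train": "tweets.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

# Causal LM objective: the collator builds the shifted labels
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="output",
    num_train_epochs=4,              # illustrative values, not the
    per_device_train_batch_size=8,   # settings used for this model
    learning_rate=5e-5,
)

Trainer(model=model, args=args, train_dataset=dataset,
        data_collator=collator).train()
```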

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ndb8e5s3) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ndb8e5s3/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/cheascake')
generator("My dream is", num_return_sequences=5)
```
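
For more control over decoding than the pipeline exposes, the same checkpoint can also be loaded directly; the sampling settings below are illustrative, not tuned values:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/cheascake")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/cheascake")

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,          # sample instead of greedy decoding
    top_p=0.95,              # illustrative nucleus-sampling value
    max_length=50,
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,
)
for sequence in outputs:
    print(tokenizer.decode(sequence, skip_special_tokens=True))
```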

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[Follow @borisdayma on Twitter](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit [the project repository](https://github.com/borisdayma/huggingtweets).