| modelId | sha | lastModified | tags | pipeline_tag | private | author | config | id | downloads | likes | library_name | readme | embedding |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
huggingtweets/926stories-superachnural | 071169369f4a88696d04d01955d250b6150a5015 | 2021-06-08T08:31:13.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/926stories-superachnural | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/926stories-superachnural/1623141069426/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1402093786457526273/DCJaU_cD_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1401905139414278147/p2g20UkB_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Rummyyyy & vada pavlov</div>
<div style="text-align: center; font-size: 14px;">@926stories-superachnural</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Rummyyyy & vada pavlov.
| Data | Rummyyyy | vada pavlov |
| --- | --- | --- |
| Tweets downloaded | 1428 | 3194 |
| Retweets | 157 | 702 |
| Short tweets | 141 | 473 |
| Tweets kept | 1130 | 2019 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3srkphhe/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @926stories-superachnural's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/42rozhsq) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/42rozhsq/artifacts) is logged and versioned.
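The card does not reproduce the fine-tuning code itself. As a minimal sketch of what this step looks like with standard `transformers` and `datasets` APIs — assuming a plain-text file of cleaned tweets, one per line; the file name, sequence length, and epoch count are illustrative, not the settings the project used:
```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Assumed input: one cleaned tweet per line (retweets and short tweets removed).
dataset = load_dataset("text", data_files={"train": "tweets.txt"})

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 defines no pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1),
    train_dataset=tokenized["train"],
    # Causal LM objective: the collator copies the input ids into the labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```
The actual huggingtweets run additionally logs hyperparameters, metrics, and model artifacts to W&B, as linked above.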
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/926stories-superachnural')
generator("My dream is", num_return_sequences=5)
```
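The pipeline forwards extra keyword arguments to the model's `generate` method, so sampling can be tuned on the same call; the parameter values below are illustrative:
```python
generator("My dream is",
          num_return_sequences=5,
          max_length=50,
          do_sample=True,
          top_p=0.95)
```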
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.020522797480225563,
0.12554538249969482,
-0.009182881563901901,
0.044607751071453094,
0.1709631234407425,
-0.00858368631452322,
-0.04379790648818016,
0.04727876931428909,
0.07173515856266022,
-0.057428568601608276,
0.002120113233104348,
0.06587448716163635,
0.02929406240582466,
-0.02570... |
huggingtweets/926stories | 78832e19990a13cf0204a5a9a57f957c8eed42d5 | 2021-06-08T06:38:28.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/926stories | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/926stories/1623134303273/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1402093786457526273/DCJaU_cD_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Rummyyyy</div>
<div style="text-align: center; font-size: 14px;">@926stories</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Rummyyyy.
| Data | Rummyyyy |
| --- | --- |
| Tweets downloaded | 1420 |
| Retweets | 156 |
| Short tweets | 139 |
| Tweets kept | 1125 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/5frannca/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @926stories's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1dvniaka) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1dvniaka/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/926stories')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.014141525141894817,
0.12468881160020828,
-0.00659671938046813,
0.052490293979644775,
0.16950801014900208,
-0.010537361726164818,
-0.04154929518699646,
0.02603781782090664,
0.07546006888151169,
-0.059441130608320236,
0.0005264159990474582,
0.06319180876016617,
0.02153901755809784,
-0.029... |
huggingtweets/Question | 983bf14e2dae171f73bbeecf798ec51718372ce2 | 2021-05-21T16:44:50.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/Question | 0 | null | transformers | ---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css">
<style>
@media (prefers-color-scheme: dark) {
.prose { color: #E2E8F0 !important; }
.prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; }
}
</style>
<section class='prose'>
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1282836681914085378/PGX9pn9g_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Milanote π€ AI Bot </div>
<div style="font-size: 15px; color: #657786">@milanoteapp bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@milanoteapp's tweets](https://twitter.com/milanoteapp).
<table style='border-width:0'>
<thead style='border-width:0'>
<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>
<th style='border-width:0'>Data</th>
<th style='border-width:0'>Quantity</th>
</tr>
</thead>
<tbody style='border-width:0'>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Tweets downloaded</td>
<td style='border-width:0'>831</td>
</tr>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Retweets</td>
<td style='border-width:0'>63</td>
</tr>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Short tweets</td>
<td style='border-width:0'>28</td>
</tr>
<tr style='border-width:0'>
<td style='border-width:0'>Tweets kept</td>
<td style='border-width:0'>740</td>
</tr>
</tbody>
</table>
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/15katy2a/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @milanoteapp's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1y624qmh) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1y624qmh/artifacts) is logged and versioned.
## Intended uses & limitations
### How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/milanoteapp')
generator("My dream is", num_return_sequences=5)
```
### Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
</section>
[](https://twitter.com/intent/follow?screen_name=borisdayma)
<section class='prose'>
For more details, visit the project repository.
</section>
[](https://github.com/borisdayma/huggingtweets)
| [
-0.015474756248295307,
0.13265803456306458,
0.02672826685011387,
-0.009138740599155426,
0.16349250078201294,
0.020027223974466324,
0.03767167031764984,
-0.012615453451871872,
0.10672987997531891,
-0.033170972019433975,
-0.03850141912698746,
0.06506368517875671,
0.01532228384166956,
-0.0365... |
huggingtweets/____devii | 6c4a136b340627eae8dc20e948e77b01d7a322c1 | 2021-05-21T16:46:07.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/____devii | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/____devii/1614135668330/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1345142719124021248/z5rYY2JE_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Devii π€ AI Bot </div>
<div style="font-size: 15px">@____devii bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@____devii's tweets](https://twitter.com/____devii).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3084 |
| Retweets | 1521 |
| Short tweets | 288 |
| Tweets kept | 1275 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2ffmdq5y/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @____devii's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3bjpt7k3) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3bjpt7k3/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/____devii')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.0775265097618103,
0.14532844722270966,
0.041044801473617554,
0.025987472385168076,
0.14937369525432587,
-0.05813444033265114,
-0.008065005764365196,
-0.012382609769701958,
0.06721340864896774,
-0.057375624775886536,
-0.016350632533431053,
0.04122714698314667,
0.065822072327137,
0.020146... |
huggingtweets/__justplaying | fd629b975323857931665e9129215c130a781e13 | 2021-05-21T16:48:43.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/__justplaying | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/__justplaying/1616931832539/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1347480058508828673/AkXmT_bj_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">alice, dash of wonderland π π€ AI Bot </div>
<div style="font-size: 15px">@__justplaying bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@__justplaying's tweets](https://twitter.com/__justplaying).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3210 |
| Retweets | 706 |
| Short tweets | 518 |
| Tweets kept | 1986 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1og52vt9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @__justplaying's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ir21lg6) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ir21lg6/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/__justplaying')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.04268747940659523,
0.11394783854484558,
0.054455533623695374,
0.0024433135986328125,
0.14380651712417603,
-0.023013822734355927,
0.006051657721400261,
-0.030545074492692947,
0.07364798337221146,
-0.054987601935863495,
-0.02306666038930416,
0.038997434079647064,
0.07302888482809067,
0.01... |
huggingtweets/__solnychko | d958ddb4436ea35192d8443879a04bcf1f045df9 | 2021-05-21T16:49:49.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/__solnychko | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/__solnychko/1616680322908/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1360367235408224263/AwK6rgAZ_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Sophia π€ AI Bot </div>
<div style="font-size: 15px">@__solnychko bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@__solnychko's tweets](https://twitter.com/__solnychko).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3206 |
| Retweets | 1278 |
| Short tweets | 208 |
| Tweets kept | 1720 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3aglnv5r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @__solnychko's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/z5yw4btx) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/z5yw4btx/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/__solnychko')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.07034118473529816,
0.128562793135643,
0.04506225511431694,
0.014181626960635185,
0.1621362417936325,
-0.04754834994673729,
-0.002760200062766671,
-0.01536106038838625,
0.07624763995409012,
-0.05613031983375549,
-0.020662307739257812,
0.03811606019735336,
0.05057288706302643,
-0.00528988... |
huggingtweets/__stillpoint | 4d65845caca7d83db8a01f4d6e2fca408ed94cf5 | 2021-05-21T16:50:56.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/__stillpoint | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/__stillpoint/1616690567975/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1276525118454468608/dT_52Uvg_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dan Bennett π€ AI Bot </div>
<div style="font-size: 15px">@__stillpoint bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@__stillpoint's tweets](https://twitter.com/__stillpoint).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3237 |
| Retweets | 853 |
| Short tweets | 160 |
| Tweets kept | 2224 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3iopt694/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @__stillpoint's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/27mgd31u) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/27mgd31u/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/__stillpoint')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.08758073300123215,
0.13749034702777863,
0.05072573944926262,
0.017847849056124687,
0.1576167494058609,
-0.04520915448665619,
-0.0067839836701750755,
-0.011648540385067463,
0.07292506843805313,
-0.06318873167037964,
-0.02804597094655037,
0.04543571546673775,
0.05783301964402199,
0.015664... |
huggingtweets/__wmww | e6e4ed3a3a09dc94bf215fac00a9e526b42b8672 | 2021-05-21T16:52:08.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/__wmww | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/__wmww/1617758505918/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1376655492060127233/cWuJmF-y_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">William π€ AI Bot </div>
<div style="font-size: 15px">@__wmww bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@__wmww's tweets](https://twitter.com/__wmww).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3237 |
| Retweets | 316 |
| Short tweets | 244 |
| Tweets kept | 2677 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/25xfjd5n/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @__wmww's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2mluxtqr) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2mluxtqr/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/__wmww')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.0806746557354927,
0.13015440106391907,
0.05857328325510025,
0.008749417960643768,
0.14173349738121033,
-0.04466293752193451,
-0.021346071735024452,
-0.019214678555727005,
0.061121147125959396,
-0.05632425472140312,
-0.013708963058888912,
0.03567346930503845,
0.06314747035503387,
0.01417... |
huggingtweets/_alexhirsch | ea51674ad159e96c27feb4e52dcb6ff191c571cc | 2021-05-21T16:53:35.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_alexhirsch | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_alexhirsch/1616542840091/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/661330385465245696/3rnsJokZ_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alex Hirsch π€ AI Bot </div>
<div style="font-size: 15px">@_alexhirsch bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@_alexhirsch's tweets](https://twitter.com/_alexhirsch).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3187 |
| Retweets | 240 |
| Short tweets | 450 |
| Tweets kept | 2497 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1go2kut1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_alexhirsch's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1pe6iqi8) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1pe6iqi8/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_alexhirsch')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.07816945016384125,
0.12972600758075714,
0.05097721144556999,
0.036715246737003326,
0.1311199814081192,
-0.05409849062561989,
-0.013349419459700584,
-0.014126286841928959,
0.07811955362558365,
-0.05855502933263779,
-0.036601707339286804,
0.046128157526254654,
0.06274767965078354,
0.00674... |
huggingtweets/_bravit | 57e68e97506d604e715f83745380f2185b283fbb | 2021-11-28T20:07:30.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_bravit | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_bravit/1638130045930/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1322230137493065729/-h1nJf6U_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">ΠΠΈΡΠ°Π»ΠΈΠΉ ΠΡΠ°Π³ΠΈΠ»Π΅Π²ΡΠΊΠΈΠΉ</div>
<div style="text-align: center; font-size: 14px;">@_bravit</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Виталий Брагилевский.
| Data | Виталий Брагилевский |
| --- | --- |
| Tweets downloaded | 3233 |
| Retweets | 884 |
| Short tweets | 489 |
| Tweets kept | 1860 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3ekzbpfn/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_bravit's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/10wax6wi) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/10wax6wi/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_bravit')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.009011869318783283,
0.12347409874200821,
-0.015561113134026527,
0.05829508602619171,
0.17244523763656616,
-0.019103629514575005,
-0.038264814764261246,
0.026680197566747665,
0.07679388672113419,
-0.05860371142625809,
-0.00015864767192397267,
0.0728418380022049,
0.016056427732110023,
-0.... |
huggingtweets/_colebennett_ | 73116676c765ba582e4d9f928501922807a512c5 | 2021-08-02T19:31:13.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_colebennett_ | 0 | null | transformers | ---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1293450821527638016/qn1MuAX7_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Cole Bennett</div>
<div style="text-align: center; font-size: 14px;">@_colebennett_</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Cole Bennett.
| Data | Cole Bennett |
| --- | --- |
| Tweets downloaded | 3085 |
| Retweets | 1120 |
| Short tweets | 415 |
| Tweets kept | 1550 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2mioy5h4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_colebennett_'s tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1b5bdu4w) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1b5bdu4w/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_colebennett_')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.021304212510585785,
0.12870806455612183,
-0.00771974865347147,
0.04168839380145073,
0.1564301997423172,
-0.01986047625541687,
-0.040180303156375885,
0.035035502165555954,
0.07260031998157501,
-0.04937504231929779,
0.011960710398852825,
0.0865756943821907,
0.021490398794412613,
-0.033163... |
huggingtweets/_deep_winter_ | 43217409301e8622e217b39dd206cbd8f67bbf55 | 2022-03-01T07:42:37.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_deep_winter_ | 0 | null | transformers | ---
language: en
thumbnail: http://www.huggingtweets.com/_deep_winter_/1646120552069/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1344880990464991239/DJ6glcyj_400x400.png')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">erin.</div>
<div style="text-align: center; font-size: 14px;">@_deep_winter_</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from erin..
| Data | erin. |
| --- | --- |
| Tweets downloaded | 3147 |
| Retweets | 716 |
| Short tweets | 243 |
| Tweets kept | 2188 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3bgxbc1v/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_deep_winter_'s tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2dlbw7vo) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2dlbw7vo/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_deep_winter_')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.02156762406229973,
0.12379380315542221,
-0.0026568653993308544,
0.04871291667222977,
0.1700386255979538,
-0.014040510170161724,
-0.03561169281601906,
0.025803906843066216,
0.07806261628866196,
-0.054685451090335846,
0.001456348574720323,
0.06800071895122528,
0.015035031363368034,
-0.032... |
huggingtweets/_djpn | 9a4c6405ba87a655780aa7a2e203431212428da0 | 2021-05-21T16:59:38.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_djpn | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_djpn/1616675085191/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374299527595782145/rfugYBkE_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Consciousness of Egregore (1/100 posts) π€ AI Bot </div>
<div style="font-size: 15px">@_djpn bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@_djpn's tweets](https://twitter.com/_djpn).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1199 |
| Retweets | 76 |
| Short tweets | 96 |
| Tweets kept | 1027 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/j9cjpc0x/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_djpn's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3f44odlc) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3f44odlc/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_djpn')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.08935549855232239,
0.1257665753364563,
0.030223816633224487,
0.006745561491698027,
0.12958602607250214,
-0.04140768200159073,
0.013133370317518711,
-0.01769203506410122,
0.07977361232042313,
-0.06758666038513184,
-0.010075954720377922,
0.02368469350039959,
0.05727945640683174,
-0.005124... |
huggingtweets/_f1rewalker_-staticmeganito | c526d061835e94133aece41f6e03a439f195a207 | 2021-10-31T23:56:27.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_f1rewalker_-staticmeganito | 0 | null | transformers | ---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1421614250116763648/1kZwzXTB_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1453022424610525186/0AbfRVqP_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">PARKER MACMILLAN I & megan ito</div>
<div style="text-align: center; font-size: 14px;">@_f1rewalker_-staticmeganito</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from PARKER MACMILLAN I & megan ito.
| Data | PARKER MACMILLAN I | megan ito |
| --- | --- | --- |
| Tweets downloaded | 2420 | 3248 |
| Retweets | 8 | 137 |
| Short tweets | 297 | 416 |
| Tweets kept | 2115 | 2695 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1avcuseb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_f1rewalker_-staticmeganito's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3hsk5egr) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3hsk5egr/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_f1rewalker_-staticmeganito')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.015827728435397148,
0.12137358635663986,
-0.004671929869800806,
0.038780927658081055,
0.15491458773612976,
-0.01489408127963543,
-0.039458613842725754,
0.04591316357254982,
0.05831637233495712,
-0.04673698544502258,
0.009243085980415344,
0.09086761623620987,
0.02273745834827423,
-0.0279... |
huggingtweets/_f1rewalker_ | 9b9d798d86c2b33df5313c9c9307cc0a35028ae8 | 2021-11-01T00:02:50.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_f1rewalker_ | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_f1rewalker_/1635724966832/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1421614250116763648/1kZwzXTB_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">PARKER MACMILLAN I</div>
<div style="text-align: center; font-size: 14px;">@_f1rewalker_</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from PARKER MACMILLAN I.
| Data | PARKER MACMILLAN I |
| --- | --- |
| Tweets downloaded | 2420 |
| Retweets | 8 |
| Short tweets | 297 |
| Tweets kept | 2115 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1vlix5az/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_f1rewalker_'s tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2p23ltmn) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2p23ltmn/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_f1rewalker_')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.015002078376710415,
0.11948776245117188,
-0.019083524122834206,
0.05285327136516571,
0.1799960434436798,
-0.0159971434623003,
-0.04235120862722397,
0.02671828120946884,
0.08029036223888397,
-0.06122038513422012,
0.002093503950163722,
0.07197214663028717,
0.01577789895236492,
-0.03012882... |
huggingtweets/_its_mino_ | a45e20d61cd03d22366a9ba6c4e6d62aa121cb40 | 2021-06-23T23:34:38.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_its_mino_ | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_its_mino_/1624491273485/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1367907122340593677/kG7PHHk5_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Mino</div>
<div style="text-align: center; font-size: 14px;">@_its_mino_</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Mino.
| Data | Mino |
| --- | --- |
| Tweets downloaded | 1297 |
| Retweets | 269 |
| Short tweets | 152 |
| Tweets kept | 876 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2q2c0dwu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_its_mino_'s tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/zlnlm02d) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/zlnlm02d/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_its_mino_')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.01604730822145939,
0.12451525032520294,
-0.009891565889120102,
0.05221187323331833,
0.17874979972839355,
-0.01032455824315548,
-0.03912327066063881,
0.026571936905384064,
0.07443354278802872,
-0.054277919232845306,
0.0028450589161366224,
0.06533198058605194,
0.023456519469618797,
-0.025... |
huggingtweets/_luisinhobr-beckvencido | 02004ed7c7157b7e7e186c4d1856f08722b11c17 | 2021-12-22T02:57:34.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_luisinhobr-beckvencido | 0 | null | transformers | ---
language: en
thumbnail: http://www.huggingtweets.com/_luisinhobr-beckvencido/1640141850327/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1470914400764715012/YO9XqA0n_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1390224220643278850/LcIZLss-_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">agrummgit agπ & luisfer nando</div>
<div style="text-align: center; font-size: 14px;">@_luisinhobr-beckvencido</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from agrummgit agπ & luisfer nando.
| Data | agrummgit agπ | luisfer nando |
| --- | --- | --- |
| Tweets downloaded | 3226 | 2366 |
| Retweets | 379 | 367 |
| Short tweets | 672 | 503 |
| Tweets kept | 2175 | 1496 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/34idoh6o/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_luisinhobr-beckvencido's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1w6ipjqa) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1w6ipjqa/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_luisinhobr-beckvencido')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.01793450117111206,
0.12428461760282516,
0.007898569107055664,
0.03968027979135513,
0.17020554840564728,
-0.006252162158489227,
-0.041114967316389084,
0.045667313039302826,
0.07702472805976868,
-0.050753459334373474,
0.0046508898958563805,
0.06933509558439255,
0.019880076870322227,
-0.03... |
huggingtweets/_luisinhobr-bryan_paula_-luanaguei | 2f6b8be07a2085cf051987d1bb836551b092d40a | 2021-12-14T18:17:37.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_luisinhobr-bryan_paula_-luanaguei | 0 | null | transformers | ---
language: en
thumbnail: http://www.huggingtweets.com/_luisinhobr-bryan_paula_-luanaguei/1639505852811/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1390224220643278850/LcIZLss-_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1407505852580442113/U6iWBRLs_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1459704723506872320/gLulTAzG_400x400.jpg')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">luisfer nando & Dj Cigarro Solto & ajax de uva verde</div>
<div style="text-align: center; font-size: 14px;">@_luisinhobr-bryan_paula_-luanaguei</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from luisfer nando & Dj Cigarro Solto & ajax de uva verde.
| Data | luisfer nando | Dj Cigarro Solto | ajax de uva verde |
| --- | --- | --- | --- |
| Tweets downloaded | 2313 | 3232 | 2237 |
| Retweets | 351 | 645 | 467 |
| Short tweets | 492 | 586 | 598 |
| Tweets kept | 1470 | 2001 | 1172 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/39qoxauq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_luisinhobr-bryan_paula_-luanaguei's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/30onq8vd) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/30onq8vd/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_luisinhobr-bryan_paula_-luanaguei')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.027687955647706985,
0.11931388080120087,
0.00793586764484644,
0.03699580952525139,
0.16960491240024567,
0.0009440977592021227,
-0.033354777842760086,
0.049034517258405685,
0.0761704072356224,
-0.04757231846451759,
0.0058031100779771805,
0.06648354977369308,
0.024288637563586235,
-0.0317... |
huggingtweets/_luisinhobr-nomesdegato-nomesdj | 0df048d691f8a63c7d398e02085beef77599dfc6 | 2021-12-21T14:04:49.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_luisinhobr-nomesdegato-nomesdj | 0 | null | transformers | ---
language: en
thumbnail: http://www.huggingtweets.com/_luisinhobr-nomesdegato-nomesdj/1640095484918/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1390224220643278850/LcIZLss-_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1175884636624510976/KtBI_1GE_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1245550936807874560/j_zCtKSJ_400x400.jpg')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">luisfer nando & nomes foda de dj & nomes de gato</div>
<div style="text-align: center; font-size: 14px;">@_luisinhobr-nomesdegato-nomesdj</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from luisfer nando & nomes foda de dj & nomes de gato.
| Data | luisfer nando | nomes foda de dj | nomes de gato |
| --- | --- | --- | --- |
| Tweets downloaded | 2357 | 3250 | 3211 |
| Retweets | 365 | 6 | 69 |
| Short tweets | 503 | 632 | 1710 |
| Tweets kept | 1489 | 2612 | 1432 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1mwm543c/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_luisinhobr-nomesdegato-nomesdj's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3nbxg8c7) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3nbxg8c7/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_luisinhobr-nomesdegato-nomesdj')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.02809431403875351,
0.11371206492185593,
0.004253797698765993,
0.04251442477107048,
0.17525714635849,
-0.0019569650758057833,
-0.039793070405721664,
0.050066281110048294,
0.07350686192512512,
-0.04522797837853432,
0.0033820460084825754,
0.07067061960697174,
0.025122394785284996,
-0.02732... |
huggingtweets/_marfii | 16d226c75e451831e8083cb3ffa4cc2c9868347d | 2021-05-21T17:07:37.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_marfii | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_marfii/1616720799370/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1370199330112557057/IHw8xP8m_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">marf π€ AI Bot </div>
<div style="font-size: 15px">@_marfii bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@_marfii's tweets](https://twitter.com/_marfii).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3248 |
| Retweets | 144 |
| Short tweets | 872 |
| Tweets kept | 2232 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/15dkgbba/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_marfii's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3bq89iuq) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3bq89iuq/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_marfii')
generator("My dream is", num_return_sequences=5)
```
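If you prefer to work below the pipeline abstraction, the same checkpoint can be loaded with the standard auto classes. A sketch (generation arguments are illustrative):
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained('huggingtweets/_marfii')
model = AutoModelForCausalLM.from_pretrained('huggingtweets/_marfii')

# Encode a prompt, sample a continuation, and decode it back to text
inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(**inputs, do_sample=True, max_length=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```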
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.07645708322525024,
0.15135428309440613,
0.053989749401807785,
0.022637996822595596,
0.14932754635810852,
-0.047042280435562134,
-0.015981242060661316,
-0.015329308807849884,
0.073966845870018,
-0.06293553858995438,
-0.021627550944685936,
0.039862364530563354,
0.06733688712120056,
0.0102... |
huggingtweets/_nalian-simondiamondxx | 77176bada8c75cfdf399047004bd05005d703855 | 2021-07-20T14:18:20.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_nalian-simondiamondxx | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_nalian-simondiamondxx/1626790677014/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1375921626261434374/S5xS0GV6_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1385553334468218882/kTwhzZRu_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">SimonDiamond β‘VTUBERβ‘ & Nalian</div>
<div style="text-align: center; font-size: 14px;">@_nalian-simondiamondxx</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from SimonDiamond β‘VTUBERβ‘ & Nalian.
| Data | SimonDiamond β‘VTUBERβ‘ | Nalian |
| --- | --- | --- |
| Tweets downloaded | 1867 | 3238 |
| Retweets | 258 | 137 |
| Short tweets | 456 | 541 |
| Tweets kept | 1153 | 2560 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3hbmyc94/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_nalian-simondiamondxx's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3lmxvuzx) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3lmxvuzx/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_nalian-simondiamondxx')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.014953645877540112,
0.1257329285144806,
0.0005772380391135812,
0.04561930149793625,
0.17353522777557373,
-0.005516071803867817,
-0.04483683407306671,
0.043814852833747864,
0.07184188067913055,
-0.049379415810108185,
-0.0005173795507289469,
0.07775227725505829,
0.018429527059197426,
-0.0... |
huggingtweets/_nisagiss-dril-prezoh | b2d57672902dee98823c000fa16d308be89c7b7c | 2021-08-25T22:47:09.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_nisagiss-dril-prezoh | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_nisagiss-dril-prezoh/1629931624717/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1320596112676409344/rgbeQhIA_400x400.png')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/847818629840228354/VXyQHfn0_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1399607079166435328/coD0YgYH_400x400.jpg')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Nisa π²π½ & wint & prezoh</div>
<div style="text-align: center; font-size: 14px;">@_nisagiss-dril-prezoh</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Nisa π²π½ & wint & prezoh.
| Data | Nisa π²π½ | wint | prezoh |
| --- | --- | --- | --- |
| Tweets downloaded | 2987 | 3226 | 3250 |
| Retweets | 2556 | 479 | 37 |
| Short tweets | 155 | 312 | 940 |
| Tweets kept | 276 | 2435 | 2273 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3is5qgb7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_nisagiss-dril-prezoh's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/8gs7ve4p) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/8gs7ve4p/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_nisagiss-dril-prezoh')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.033814992755651474,
0.12033955007791519,
0.003522438695654273,
0.04299243167042732,
0.1661538928747177,
-0.016993513330817223,
-0.038881734013557434,
0.048165131360292435,
0.06500106304883957,
-0.04259531944990158,
0.011376045644283295,
0.06559649854898453,
0.01826481521129608,
-0.03167... |
huggingtweets/_nisagiss-dril | d81b34be458aa8faaa40336f3c2e1e22cc7a311d | 2021-07-25T04:05:01.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_nisagiss-dril | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_nisagiss-dril/1627185897572/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1320596112676409344/rgbeQhIA_400x400.png')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/847818629840228354/VXyQHfn0_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Nisa π²π½ & wint</div>
<div style="text-align: center; font-size: 14px;">@_nisagiss-dril</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Nisa π²π½ & wint.
| Data | Nisa π²π½ | wint |
| --- | --- | --- |
| Tweets downloaded | 3053 | 3229 |
| Retweets | 2657 | 464 |
| Short tweets | 138 | 311 |
| Tweets kept | 258 | 2454 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/op7x5wkb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_nisagiss-dril's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1x66ooaf) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1x66ooaf/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_nisagiss-dril')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.022887034341692924,
0.12278459966182709,
0.0029632074292749166,
0.04452680051326752,
0.17187811434268951,
-0.011860692873597145,
-0.045172665268182755,
0.05207102745771408,
0.06913489103317261,
-0.04684269055724144,
0.00285147107206285,
0.07102459669113159,
0.02170647494494915,
-0.03548... |
huggingtweets/_nisagiss-dril_gpt2-drilbot_neo | a8d03b731f9d0e18242873b6b5d0d827a8ece576 | 2021-09-07T01:18:25.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_nisagiss-dril_gpt2-drilbot_neo | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_nisagiss-dril_gpt2-drilbot_neo/1630977501917/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374924360780242944/-Q8NfgEr_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1386749605216407555/QIJeyWfE_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1320596112676409344/rgbeQhIA_400x400.png')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">wintbot_neo & wint but Al & Nisa π²π½</div>
<div style="text-align: center; font-size: 14px;">@_nisagiss-dril_gpt2-drilbot_neo</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from wintbot_neo & wint but Al & Nisa π²π½.
| Data | wintbot_neo | wint but Al | Nisa π²π½ |
| --- | --- | --- | --- |
| Tweets downloaded | 3246 | 3198 | 2993 |
| Retweets | 255 | 41 | 2553 |
| Short tweets | 243 | 49 | 158 |
| Tweets kept | 2748 | 3108 | 282 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/xq1ao3o5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_nisagiss-dril_gpt2-drilbot_neo's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/knmkilof) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/knmkilof/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_nisagiss-dril_gpt2-drilbot_neo')
generator("My dream is", num_return_sequences=5)
```
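The pipeline returns a list of dicts, each with a `generated_text` key, so the five candidates can be printed directly:
```python
results = generator("My dream is", num_return_sequences=5)
for result in results:
    print(result["generated_text"])  # each dict holds one completion
```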
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.031391579657793045,
0.13014549016952515,
-0.008517817594110966,
0.05162500962615013,
0.1684255748987198,
-0.008786262944340706,
-0.04420730471611023,
0.04151652753353119,
0.056756459176540375,
-0.030312325805425644,
-0.008609702810645103,
0.07482218742370605,
0.03135879337787628,
-0.020... |
huggingtweets/_phr3nzy | 4306c508fd96d93a244a572361593feb162526aa | 2021-05-21T17:10:49.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_phr3nzy | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_phr3nzy/1607057644985/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1289641403207819265/Oo8L2MDk_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">osama π€ AI Bot </div>
<div style="font-size: 15px; color: #657786">@_phr3nzy bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@_phr3nzy's tweets](https://twitter.com/_phr3nzy).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2236 |
| Retweets | 1135 |
| Short tweets | 155 |
| Tweets kept | 946 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2cywo7wq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_phr3nzy's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2ijwy08u) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2ijwy08u/artifacts) is logged and versioned.
## Intended uses & limitations
### How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/_phr3nzy')
generator("My dream is", num_return_sequences=5)
```
### Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.0328981839120388,
0.12186145782470703,
0.03529619053006172,
0.01812281273305416,
0.18475598096847534,
0.0387912318110466,
0.01732553355395794,
-0.022003034129738808,
0.10386069118976593,
-0.04367903620004654,
-0.04847702756524086,
0.047300390899181366,
0.016451101750135422,
-0.034464724... |
huggingtweets/_pranavnt | afdc7011879409b13d917ea47488e34fc07f2787 | 2021-08-30T21:04:43.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_pranavnt | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_pranavnt/1630357478814/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1414887427706023940/TxmPt4j1_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Pranav β </div>
<div style="text-align: center; font-size: 14px;">@_pranavnt</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Pranav β .
| Data | Pranav β |
| --- | --- |
| Tweets downloaded | 406 |
| Retweets | 86 |
| Short tweets | 86 |
| Tweets kept | 234 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1si2997p/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_pranavnt's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3b5uv7sf) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3b5uv7sf/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_pranavnt')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.018758855760097504,
0.12276707589626312,
-0.011963186785578728,
0.054184362292289734,
0.18296460807323456,
-0.009688938967883587,
-0.03995592147111893,
0.02924826368689537,
0.07388381659984589,
-0.06321367621421814,
-0.0013357080752030015,
0.06604503095149994,
0.015971677377820015,
-0.0... |
huggingtweets/_rdo | 3ddfd683cfc0831167886fd588835dc5d68024a3 | 2021-05-21T17:11:57.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_rdo | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_rdo/1602271625590/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/664614733476175873/Mk9AdCB3_400x400.png')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Rodrigo Ricárdez π€ AI Bot </div>
<div style="font-size: 15px; color: #657786">@_rdo bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@_rdo's tweets](https://twitter.com/_rdo).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3176 |
| Retweets | 1144 |
| Short tweets | 310 |
| Tweets kept | 1722 |
[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1raumdp7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_rdo's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2pxbgnfy) for full transparency and reproducibility.
At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2pxbgnfy/artifacts) is logged and versioned.
## Intended uses & limitations
### How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/_rdo')
generator("My dream is", num_return_sequences=5)
```
### Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.027857443317770958,
0.12237434089183807,
0.03441610932350159,
0.017978224903345108,
0.18967336416244507,
0.038441240787506104,
0.010883186012506485,
-0.022168947383761406,
0.10812123864889145,
-0.03717422112822533,
-0.05099847912788391,
0.04459397494792938,
0.015730461105704308,
-0.0234... |
huggingtweets/_scottcondron | f61f1589aade91f7acf265ce0b9d7198ae521a4d | 2021-07-30T11:31:50.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_scottcondron | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_scottcondron/1627644706283/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1016982898556141569/R09dBwgv_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Scott Condron</div>
<div style="text-align: center; font-size: 14px;">@_scottcondron</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Scott Condron.
| Data | Scott Condron |
| --- | --- |
| Tweets downloaded | 483 |
| Retweets | 59 |
| Short tweets | 15 |
| Tweets kept | 409 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/20roqlwk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_scottcondron's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/y1w16jqr) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/y1w16jqr/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_scottcondron')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.014678526669740677,
0.12203698605298996,
-0.0127753596752882,
0.053220316767692566,
0.1791854202747345,
-0.012755539268255234,
-0.039021287113428116,
0.025096584111452103,
0.07238010317087173,
-0.0535009428858757,
-0.0013173973420634866,
0.06913953274488449,
0.022303439676761627,
-0.025... |
huggingtweets/_stevenfan | 53f1a1ea2e2289e09da8026e6d3ad1187b84b530 | 2021-05-21T17:15:06.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_stevenfan | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_stevenfan/1616641707888/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374753525268369409/PWpYd5eB_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">stan of discovery π€ AI Bot </div>
<div style="font-size: 15px">@_stevenfan bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@_stevenfan's tweets](https://twitter.com/_stevenfan).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3249 |
| Retweets | 255 |
| Short tweets | 319 |
| Tweets kept | 2675 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2spxy5tp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_stevenfan's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2fwadofl) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2fwadofl/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_stevenfan')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.09674543142318726,
0.118917316198349,
0.05906381830573082,
0.024937879294157028,
0.1501777470111847,
-0.05023108422756195,
-0.0006027991767041385,
-0.014664490707218647,
0.0623612254858017,
-0.041739802807569504,
-0.026391340419650078,
0.03232036903500557,
0.06158601865172386,
0.0268925... |
huggingtweets/_sydkit_ | 745c1840ad65605c68b87b9d656a6c20844201c4 | 2021-05-21T17:16:16.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/_sydkit_ | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/_sydkit_/1614138714911/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1256014674799321088/BVrvm4TY_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">sydkit π€ AI Bot </div>
<div style="font-size: 15px">@_sydkit_ bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@_sydkit_'s tweets](https://twitter.com/_sydkit_).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 160 |
| Retweets | 22 |
| Short tweets | 13 |
| Tweets kept | 125 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/b48v48wr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @_sydkit_'s tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/31cgxgnq) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/31cgxgnq/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/_sydkit_')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.07116661965847015,
0.1364501714706421,
0.049693722277879715,
0.007110873237252235,
0.1458052545785904,
-0.06094549968838692,
-0.0024211013223975897,
-0.013430698774755001,
0.06742985546588898,
-0.0485072061419487,
-0.013604607433080673,
0.034829478710889816,
0.05963779240846634,
0.00776... |
huggingtweets/a__spaceman | 313347e7964583edaecebad11efff916b8e23bfb | 2021-05-21T17:18:50.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/a__spaceman | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/a__spaceman/1614115779135/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1094391618139049985/zsGr8oMr_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">A Space Man π ππππ¨π π€ AI Bot </div>
<div style="font-size: 15px">@a__spaceman bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@a__spaceman's tweets](https://twitter.com/a__spaceman).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3192 |
| Retweets | 271 |
| Short tweets | 623 |
| Tweets kept | 2298 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/13dzl8xq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @a__spaceman's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/37svkwhk) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/37svkwhk/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/a__spaceman')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.08719324320554733,
0.1309635192155838,
0.05900273099541664,
0.01665353588759899,
0.10874126106500626,
-0.05636264756321907,
0.0035719654988497496,
-0.027314981445670128,
0.0727415531873703,
-0.04566444829106331,
-0.011998286470770836,
0.008063801564276218,
0.04840520769357681,
0.0197751... |
huggingtweets/abattoirscreed | 8ce50f4260f6e91b0dcfcb0d3c00d4268bb8800f | 2021-05-21T17:21:00.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/abattoirscreed | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/abattoirscreed/1616724637240/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1192273550423601152/BO7LuJzU_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">AC π€ AI Bot </div>
<div style="font-size: 15px">@abattoirscreed bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@abattoirscreed's tweets](https://twitter.com/abattoirscreed).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3224 |
| Retweets | 163 |
| Short tweets | 310 |
| Tweets kept | 2751 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2x28gd6s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
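If you want those tracked files locally rather than in the browser, the W&B client can fetch an artifact by name. A sketch under assumptions: the artifact identifier below is hypothetical — copy the exact `entity/project/name:version` string from the run page linked above:
```python
import wandb

api = wandb.Api()
# Hypothetical identifier -- take the real one from the W&B run page.
artifact = api.artifact('wandb/huggingtweets/run-2x28gd6s-history:latest')
path = artifact.download()  # downloads the files and returns the local directory
print(path)
```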
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @abattoirscreed's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/26vsx7hg) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/26vsx7hg/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/abattoirscreed')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.08530373126268387,
0.15517598390579224,
0.05293719470500946,
0.018644072115421295,
0.12052726745605469,
-0.0646960660815239,
-0.005829549860209227,
-0.01975150592625141,
0.07609812915325165,
-0.054100051522254944,
-0.01808956265449524,
0.03546842932701111,
0.059864409267902374,
0.004239... |
huggingtweets/abcdentminded | 9ead9aab7fe47d7fcbc91a5a8528f916e0973070 | 2021-05-21T17:23:05.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/abcdentminded | 0 | null | transformers | ---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1296648016875548673/0RDPcPIT_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">ββ€abcdentmindedββ π€ AI Bot </div>
<div style="font-size: 15px">@abcdentminded bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@abcdentminded's tweets](https://twitter.com/abcdentminded).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3247 |
| Retweets | 88 |
| Short tweets | 529 |
| Tweets kept | 2630 |
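The four counts obey a simple invariant: tweets kept = tweets downloaded − retweets − short tweets (3247 − 88 − 529 = 2630 above). A minimal sketch of that filtering step; the "short tweet" threshold is a hypothetical stand-in, since the card reports only the resulting counts:
```python
def keep_for_training(tweets, min_chars=20):
    """Drop retweets and short tweets; return the rest.

    `min_chars` is a hypothetical cutoff -- the card does not state
    the actual rule, only the counts it produced.
    """
    return [
        t for t in tweets
        if not t.startswith("RT @")  # retweets are dropped
        and len(t) >= min_chars      # short tweets are dropped
    ]
```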
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2isrhdw8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @abcdentminded's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2u366z3l) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2u366z3l/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/abcdentminded')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.06358139961957932,
0.1572836935520172,
0.0660678967833519,
0.0007934885215945542,
0.13437460362911224,
-0.062394801527261734,
-0.004606261849403381,
-0.011077704839408398,
0.08220122754573822,
-0.05354040861129761,
-0.02185184881091118,
0.05732434242963791,
0.06537432223558426,
-0.00246... |
huggingtweets/abelaer | 3c2fdc425cd2d44a51ae491dfdf0606a92bdc62f | 2021-05-21T17:25:22.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/abelaer | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/abelaer/1616682063676/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1085138421599870976/y1VodNUp_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Abel Jansma π€ AI Bot </div>
<div style="font-size: 15px">@abelaer bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@abelaer's tweets](https://twitter.com/abelaer).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 231 |
| Retweets | 15 |
| Short tweets | 14 |
| Tweets kept | 202 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2fgjwxfv/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @abelaer's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/39ffistv) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/39ffistv/artifacts) is logged and versioned.
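A hedged sketch of what such a run looks like with the `transformers` Trainer and W&B logging switched on. The hyperparameter values are illustrative rather than the ones recorded in the run above, and `train_dataset` is assumed to be the tokenized tweet corpus:
```python
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

args = TrainingArguments(
    output_dir="output",
    num_train_epochs=4,              # illustrative, not the recorded value
    per_device_train_batch_size=8,   # illustrative
    learning_rate=5e-5,              # illustrative
    report_to="wandb",               # stream hyperparameters and metrics to W&B
)

# `train_dataset` is assumed: a tokenized dataset of the user's kept tweets.
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
trainer.save_model("output/final")   # the directory that gets logged and versioned
```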
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/abelaer')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.08224130421876907,
0.1439625769853592,
0.052691299468278885,
0.032655004411935806,
0.13072450459003448,
-0.043870117515325546,
-0.01590854860842228,
-0.006013133563101292,
0.06788290292024612,
-0.06023527681827545,
-0.024790821596980095,
0.03213941305875778,
0.055455587804317474,
0.0122... |
huggingtweets/abnuo113 | 0dbdf6a1aa98e00ad0b088638d3662f7ac5fa166 | 2022-02-09T01:13:41.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/abnuo113 | 0 | null | transformers | ---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1484369552498573313/MP-r9WvV_400x400.png')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">πΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉ</div>
<div style="text-align: center; font-size: 14px;">@abnuo113</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from πΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉ.
| Data | πΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉπΉ |
| --- | --- |
| Tweets downloaded | 3213 |
| Retweets | 316 |
| Short tweets | 1545 |
| Tweets kept | 1352 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/7huohook/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @abnuo113's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2j8kmobh) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2j8kmobh/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/abnuo113')
generator("My dream is", num_return_sequences=5)
```
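Generation samples tokens at random, so each call returns different tweets. If you need the five sequences to be reproducible, pin the seed first; `set_seed` fixes the Python, NumPy and PyTorch generators:
```python
from transformers import pipeline, set_seed

set_seed(42)  # repeated runs now yield the same five samples
generator = pipeline('text-generation',
model='huggingtweets/abnuo113')
generator("My dream is", num_return_sequences=5)
```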
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.02128935419023037,
0.12100059539079666,
-0.004733015783131123,
0.04263154789805412,
0.1607295423746109,
-0.019999681040644646,
-0.04180207476019859,
0.03728007152676582,
0.06438146531581879,
-0.04739071801304817,
0.009957476519048214,
0.08748232573270798,
0.02045147679746151,
-0.0325244... |
huggingtweets/acephallus | c1238c5308ae208552175194196b604e12a18844 | 2021-05-21T17:29:14.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/acephallus | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/acephallus/1617764407342/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1377643728123404290/yXoDgE8c_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">annie halation! πΊ π€ AI Bot </div>
<div style="font-size: 15px">@acephallus bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@acephallus's tweets](https://twitter.com/acephallus).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3227 |
| Retweets | 223 |
| Short tweets | 613 |
| Tweets kept | 2391 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/gd83k7hw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @acephallus's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/32kzz65f) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/32kzz65f/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/acephallus')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.08541804552078247,
0.13035255670547485,
0.05932725593447685,
0.024997565895318985,
0.12427815049886703,
-0.05006677657365799,
-0.0012715643970295787,
-0.013808074407279491,
0.06657575070858002,
-0.04356008395552635,
-0.013678763061761856,
0.011874392628669739,
0.05710247531533241,
0.027... |
huggingtweets/actionattheend | a44ba66f47a9108338ffb7de00bc18245a515099 | 2021-05-21T17:30:58.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/actionattheend | 0 | null | transformers | ---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1255634311090298880/Nn88pZZB_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">recliner ocelot π΄π¬ π€ AI Bot </div>
<div style="font-size: 15px">@actionattheend bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@actionattheend's tweets](https://twitter.com/actionattheend).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3204 |
| Retweets | 1194 |
| Short tweets | 304 |
| Tweets kept | 1706 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2iua5in8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @actionattheend's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/7to0e91w) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/7to0e91w/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/actionattheend')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.06162628158926964,
0.13456261157989502,
0.030429469421505928,
0.013288173824548721,
0.12586231529712677,
-0.049616556614637375,
0.0035747995134443045,
-0.00785400066524744,
0.09239894896745682,
-0.0573900081217289,
0.010530889965593815,
0.06499872356653214,
0.049314387142658234,
-0.0166... |
huggingtweets/adamwathan | 1c3ba9e0acb1f0a644ced5929fa381357e8de349 | 2021-05-21T17:33:04.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/adamwathan | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/adamwathan/1600972790062/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css">
<style>
@media (prefers-color-scheme: dark) {
.prose { color: #E2E8F0 !important; }
.prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; }
}
</style>
<section class='prose'>
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/887661330832003072/Zp6rA_e2_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Adam Wathan π€ AI Bot </div>
<div style="font-size: 15px; color: #657786">@adamwathan bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@adamwathan's tweets](https://twitter.com/adamwathan).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3240 |
| Retweets | 212 |
| Short tweets | 165 |
| Tweets kept | 2863 |
[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3jzwjo2j/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @adamwathan's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3jg7czwi) for full transparency and reproducibility.
At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3jg7czwi/artifacts) is logged and versioned.
## Intended uses & limitations
### How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/adamwathan')
generator("My dream is", num_return_sequences=5)
```
### Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.02846246026456356,
0.1252956986427307,
0.040645014494657516,
0.012562606483697891,
0.18739262223243713,
0.03507324308156967,
0.01803589053452015,
-0.02715938165783882,
0.10701379925012589,
-0.03903147950768471,
-0.05163279175758362,
0.04834742471575737,
0.01159290224313736,
-0.031784422... |
huggingtweets/adapkepinska | 6975eef106be476c14743f9d4b16959d90f0e214 | 2021-05-21T17:35:04.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/adapkepinska | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/adapkepinska/1616670223225/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1352684180803641344/KJ8CTFUO_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Ada KΔpiΕska π€ AI Bot </div>
<div style="font-size: 15px">@adapkepinska bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@adapkepinska's tweets](https://twitter.com/adapkepinska).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3220 |
| Retweets | 287 |
| Short tweets | 152 |
| Tweets kept | 2781 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3swqkm62/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @adapkepinska's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3thoe5t6) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3thoe5t6/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/adapkepinska')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.07292726635932922,
0.12418661266565323,
0.04300170764327049,
0.033340927213430405,
0.1298667937517166,
-0.035766150802373886,
-0.004799462854862213,
-0.024046016857028008,
0.062295980751514435,
-0.05649341270327568,
-0.03696272522211075,
0.01918116956949234,
0.029110079631209373,
0.0182... |
huggingtweets/adderallblack | 1abf71cf9b336887cc670f5574f3e6bebcf061af | 2021-05-21T17:36:11.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/adderallblack | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/adderallblack/1621371634510/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1392879033403289600/sb6Ok_0q_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">black arkansas</div>
<div style="text-align: center; font-size: 14px;">@adderallblack</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from black arkansas.
| Data | black arkansas |
| --- | --- |
| Tweets downloaded | 3230 |
| Retweets | 276 |
| Short tweets | 426 |
| Tweets kept | 2528 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1r7zzeri/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @adderallblack's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2p7g8dji) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2p7g8dji/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/adderallblack')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.008499005809426308,
0.12093031406402588,
-0.009732081554830074,
0.05576949566602707,
0.1777830719947815,
-0.015484935604035854,
-0.037427496165037155,
0.026682604104280472,
0.07627213001251221,
-0.05570438876748085,
-0.0010943514062091708,
0.07244677096605301,
0.014783704653382301,
-0.0... |
huggingtweets/adderallia | fb29592e49273f80b504782c6ff67a7991b89245 | 2021-05-21T17:37:15.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/adderallia | 0 | null | transformers | ---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1371446905604087813/2FxI9YMM_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">macy π€ AI Bot </div>
<div style="font-size: 15px">@adderallia bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@adderallia's tweets](https://twitter.com/adderallia).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 295 |
| Retweets | 71 |
| Short tweets | 5 |
| Tweets kept | 219 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/jjeo4uw5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @adderallia's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/g3f6vfg5) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/g3f6vfg5/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/adderallia')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.05884985998272896,
0.1416846662759781,
0.06098730489611626,
0.0024683719966560602,
0.11637866497039795,
-0.06928917020559311,
0.0024755713529884815,
-0.0029468864668160677,
0.07994092255830765,
-0.06705787032842636,
-0.01418622862547636,
0.05934872850775719,
0.05646310746669769,
0.01355... |
huggingtweets/adhd_93 | 41fa5b541c10859dcfa277b0a7f52650c29dcddd | 2021-10-09T01:14:07.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/adhd_93 | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/adhd_93/1633742043558/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1442325298138255362/h2ntdCgO_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">LGBTDHD</div>
<div style="text-align: center; font-size: 14px;">@adhd_93</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from LGBTDHD.
| Data | LGBTDHD |
| --- | --- |
| Tweets downloaded | 3236 |
| Retweets | 296 |
| Short tweets | 153 |
| Tweets kept | 2787 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2o8cqxfu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @adhd_93's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/227a55pn) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/227a55pn/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/adhd_93')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.015543162822723389,
0.11989002674818039,
-0.008558602072298527,
0.05328342691063881,
0.176997110247612,
-0.016286375001072884,
-0.04178925231099129,
0.024142928421497345,
0.0778709203004837,
-0.057152364403009415,
-0.0032432330772280693,
0.07376187294721603,
0.014756639488041401,
-0.026... |
huggingtweets/adhib | 0d4256706d7b65bf85620164ea44a7de2d0f9f21 | 2021-05-21T17:38:33.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/adhib | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/adhib/1617472294749/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1259773709210079234/zy8BML5a_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Adam Hibbert π€ AI Bot </div>
<div style="font-size: 15px">@adhib bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@adhib's tweets](https://twitter.com/adhib).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3247 |
| Retweets | 89 |
| Short tweets | 509 |
| Tweets kept | 2649 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3vmd854v/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @adhib's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/okvjl3od) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/okvjl3od/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/adhib')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.08030610531568527,
0.14059263467788696,
0.053849488496780396,
0.02539770118892193,
0.1422688513994217,
-0.056504879146814346,
0.015770986676216125,
-0.021252064034342766,
0.06887555122375488,
-0.053385116159915924,
-0.024730578064918518,
0.03523322567343712,
0.048489805310964584,
0.0212... |
huggingtweets/adiaeu | b42135f34bbcc9fee896b36551e1024ce6b7e0a4 | 2021-05-21T17:40:56.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/adiaeu | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/adiaeu/1608391370887/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css">
<style>
@media (prefers-color-scheme: dark) {
.prose { color: #E2E8F0 !important; }
.prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; }
}
</style>
<section class='prose'>
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1324015708104089600/ZrXV0rUp_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Ωenhypen debut π€ AI Bot </div>
<div style="font-size: 15px; color: #657786">@adiaeu bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@adiaeu's tweets](https://twitter.com/adiaeu).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3167 |
| Retweets | 285 |
| Short tweets | 598 |
| Tweets kept | 2284 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2mizccrh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @adiaeu's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1jcjoc84) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1jcjoc84/artifacts) is logged and versioned.
## Intended uses & limitations
### How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/adiaeu')
generator("My dream is", num_return_sequences=5)
```
### Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.02899583801627159,
0.12756697833538055,
0.03543569892644882,
0.020033445209264755,
0.19061189889907837,
0.03092365153133869,
0.02001321129500866,
-0.028901325538754463,
0.11012817174196243,
-0.03742890805006027,
-0.05119912326335907,
0.045185986906290054,
0.010904490947723389,
-0.028539... |
huggingtweets/adjacentgrace | 2f03eb28f3f91dc478575cfc8035f85af4be26f6 | 2021-05-21T17:42:06.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/adjacentgrace | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/adjacentgrace/1616623328480/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1366296275302248448/ZQk6DPNb_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Grace π€ AI Bot </div>
<div style="font-size: 15px">@adjacentgrace bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@adjacentgrace's tweets](https://twitter.com/adjacentgrace).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 561 |
| Retweets | 155 |
| Short tweets | 61 |
| Tweets kept | 345 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1xjsh2v0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @adjacentgrace's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/14n88k4d) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/14n88k4d/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/adjacentgrace')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.08878699690103531,
0.12200554460287094,
0.04482943192124367,
0.017167961224913597,
0.12295033037662506,
-0.038298096507787704,
-0.02943284437060356,
-0.0017997102113440633,
0.0758865475654602,
-0.05800508335232735,
-0.023795029148459435,
0.04602828621864319,
0.07128079980611801,
0.00567... |
huggingtweets/adriangregory20 | 5e7ef7444d49e747bf4430ad2d91bb626e6bff84 | 2021-05-21T17:44:07.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/adriangregory20 | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/adriangregory20/1617002077884/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1307765220107001859/cEfzmr1c_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Adrian Gregory π€ AI Bot </div>
<div style="font-size: 15px">@adriangregory20 bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@adriangregory20's tweets](https://twitter.com/adriangregory20).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3246 |
| Retweets | 587 |
| Short tweets | 204 |
| Tweets kept | 2455 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/4phwvtdq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @adriangregory20's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3tlt3nyy) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3tlt3nyy/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/adriangregory20')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.06524127721786499,
0.14641767740249634,
0.04705042019486427,
0.023299308493733406,
0.12568555772304535,
-0.04644676670432091,
-0.006415924057364464,
-0.020102500915527344,
0.0765867680311203,
-0.06807170063257217,
-0.013842109590768814,
0.014352329075336456,
0.057525429874658585,
0.0276... |
huggingtweets/adrienna_w | 4bd2ea7458606d9f55e721bc77be97b213841c48 | 2021-05-21T17:45:21.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/adrienna_w | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/adrienna_w/1610164811243/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css">
<style>
@media (prefers-color-scheme: dark) {
.prose { color: #E2E8F0 !important; }
.prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; }
}
</style>
<section class='prose'>
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1267145246359474176/OtRIrSIL_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Adrienna Wong π€ AI Bot </div>
<div style="font-size: 15px; color: #657786">@adrienna_w bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@adrienna_w's tweets](https://twitter.com/adrienna_w).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2000 |
| Retweets | 1570 |
| Short tweets | 46 |
| Tweets kept | 384 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3r42s34p/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @adrienna_w's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3n5znqzh) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3n5znqzh/artifacts) is logged and versioned.
## Intended uses & limitations
### How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/adrienna_w')
generator("My dream is", num_return_sequences=5)
```
### Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.030612368136644363,
0.12682095170021057,
0.039344992488622665,
0.016125429421663284,
0.18760667741298676,
0.030017651617527008,
0.017065120860934258,
-0.02838500775396824,
0.10720658302307129,
-0.04093707352876663,
-0.0478304959833622,
0.04491405561566353,
0.010364427231252193,
-0.03059... |
huggingtweets/aevaeavaevevave | ab8061dc3e0f9a2b83440e93e2181640c4f1c93f | 2022-01-20T15:13:33.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/aevaeavaevevave | 0 | null | transformers | ---
language: en
thumbnail: http://www.huggingtweets.com/aevaeavaevevave/1642691608974/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1471448753353670660/T0h3zXn-_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">aeva</div>
<div style="text-align: center; font-size: 14px;">@aevaeavaevevave</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from aeva.
| Data | aeva |
| --- | --- |
| Tweets downloaded | 3184 |
| Retweets | 985 |
| Short tweets | 659 |
| Tweets kept | 1540 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3g4kejp0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @aevaeavaevevave's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ikuw0pg) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ikuw0pg/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/aevaeavaevevave')
generator("My dream is", num_return_sequences=5)
```
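If you want finer control than the pipeline exposes, you can also load the tokenizer and model directly. The sketch below is illustrative; the sampling parameters shown are assumptions, not project defaults:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained('huggingtweets/aevaeavaevevave')
model = AutoModelForCausalLM.from_pretrained('huggingtweets/aevaeavaevevave')

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling cutoff (assumed value)
    max_new_tokens=40,                    # length of each continuation (assumed value)
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token by default
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```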
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.01832614094018936,
0.12689773738384247,
-0.003618120914325118,
0.04601342976093292,
0.17284682393074036,
-0.0134689686819911,
-0.04816949740052223,
0.034683216363191605,
0.08174088597297668,
-0.050681617110967636,
0.005875518545508385,
0.07689248770475388,
0.017046788707375526,
-0.03798... |
huggingtweets/agencialavieja | e25e930e64deba0f411cbb4b7f90d54d339a3e06 | 2021-05-21T17:49:00.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/agencialavieja | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/agencialavieja/1621053473805/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1223585534561472512/QO-CQ64Z_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Alfredo Casero</div>
<div style="text-align: center; font-size: 14px;">@agencialavieja</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Alfredo Casero.
| Data | Alfredo Casero |
| --- | --- |
| Tweets downloaded | 3197 |
| Retweets | 854 |
| Short tweets | 565 |
| Tweets kept | 1778 |
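The table reflects the filtering applied before training: retweets and very short tweets are dropped, and only the remainder is kept. A rough sketch of that idea (the exact cutoff is an assumption for illustration):

```python
tweets = [
    "RT @someone: check this out",                    # retweet -> dropped
    "jaja",                                           # too short -> dropped
    "Filming a brand new sketch for tonight's show",  # kept
]

def keep(text, min_words=3):  # min_words is an assumed cutoff
    return not text.startswith("RT @") and len(text.split()) >= min_words

kept = [t for t in tweets if keep(t)]
print(len(kept))  # -> 1
```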
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2xpelzjw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @agencialavieja's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1q128hty) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1q128hty/artifacts) is logged and versioned.
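The linked run can also be queried programmatically; a minimal sketch with the public `wandb` API (run path taken from the links above, assuming the run is publicly readable):

```python
import wandb

api = wandb.Api()
run = api.run("wandb/huggingtweets/1q128hty")  # entity/project/run_id from the link above
print(run.config)   # training hyperparameters
print(run.summary)  # final metrics
```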
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/agencialavieja')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.010943468660116196,
0.12584491074085236,
-0.015081427991390228,
0.051690246909856796,
0.17591963708400726,
-0.007975763641297817,
-0.04045778885483742,
0.02401808835566044,
0.07717275619506836,
-0.05332116782665253,
0.0020228996872901917,
0.06718733161687851,
0.0186407882720232,
-0.0229... |
huggingtweets/agendernihilist | d4a24c59408591deb4afc849b4fbacb648263675 | 2021-05-21T17:50:24.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/agendernihilist | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/agendernihilist/1617923598463/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1279628481073041409/mtT5QVq__400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">π₯°Gender Nihilist/Nihilist Anarchistπ₯° π€ AI Bot </div>
<div style="font-size: 15px">@agendernihilist bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@agendernihilist's tweets](https://twitter.com/agendernihilist).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3172 |
| Retweets | 1457 |
| Short tweets | 187 |
| Tweets kept | 1528 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/37jo5lqx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @agendernihilist's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3hzj8j9p) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3hzj8j9p/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/agendernihilist')
generator("My dream is", num_return_sequences=5)
```
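Generation is stochastic, so repeated calls give different completions; if you need reproducible outputs, fix the seed first. A small sketch using `set_seed` from transformers:

```python
from transformers import pipeline, set_seed

set_seed(42)  # any fixed seed makes the sampled sequences repeatable
generator = pipeline('text-generation',
                     model='huggingtweets/agendernihilist')
generator("My dream is", num_return_sequences=5)
```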
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.06378169357776642,
0.1288198083639145,
0.03855281323194504,
0.01954856887459755,
0.1420542597770691,
-0.07547923922538757,
-0.012397813610732555,
-0.019702572375535965,
0.06932101398706436,
-0.06405944377183914,
0.004031906835734844,
0.03245053440332413,
0.06299806386232376,
0.010465561... |
huggingtweets/agholdier | 85bac29bda45f4250ea92e35c5c0869d3d453064 | 2021-05-26T20:22:22.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/agholdier | 0 | null | transformers | ---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1344775686586847233/QkHU_dIP_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">A.G. Holdier Loves Coors Cat</div>
<div style="text-align: center; font-size: 14px;">@agholdier</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from A.G. Holdier Loves Coors Cat.
| Data | A.G. Holdier Loves Coors Cat |
| --- | --- |
| Tweets downloaded | 3235 |
| Retweets | 460 |
| Short tweets | 423 |
| Tweets kept | 2352 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2xot2p53/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @agholdier's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2fke0tr2) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2fke0tr2/artifacts) is logged and versioned.
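Fine-tuning follows the standard causal-language-modeling recipe, in which the input tokens double as the labels. A condensed sketch of that recipe (the toy corpus and hyperparameters are illustrative, not the values used by huggingtweets):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

texts = ["My dream is to grade papers faster", "Coors Cat remains undefeated"]  # stand-in tweets

model.train()
for text in texts:
    batch = tokenizer(text, return_tensors="pt")
    loss = model(**batch, labels=batch["input_ids"]).loss  # label shift handled internally
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```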
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/agholdier')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.028026428073644638,
0.13624195754528046,
-0.009671562351286411,
0.033851951360702515,
0.1604052484035492,
-0.01641700230538845,
-0.036197859793901443,
0.04363088309764862,
0.07263252884149551,
-0.04964593052864075,
0.01593734696507454,
0.08067993074655533,
0.017888600006699562,
-0.03482... |
huggingtweets/agnescallard | 678ea001b56f7b66777e343041a0c19d86450593 | 2021-05-21T17:52:29.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/agnescallard | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/agnescallard/1616718656775/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1302422740507516929/zD7GvA0H_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Agnes Callard π€ AI Bot </div>
<div style="font-size: 15px">@agnescallard bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@agnescallard's tweets](https://twitter.com/agnescallard).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3240 |
| Retweets | 371 |
| Short tweets | 410 |
| Tweets kept | 2459 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1w2jn5h4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @agnescallard's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/hgprm6he) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/hgprm6he/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/agnescallard')
generator("My dream is", num_return_sequences=5)
```
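On a machine with a GPU you can ask the pipeline to run there; `device=0` selects the first CUDA device (illustrative, the default `device=-1` is CPU):

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/agnescallard',
                     device=0)  # first GPU; use -1 for CPU
generator("My dream is", num_return_sequences=5)
```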
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.08524012565612793,
0.12843255698680878,
0.057385556399822235,
0.022644339129328728,
0.10247988998889923,
-0.043723318725824356,
-0.010533527471125126,
-0.0039297291077673435,
0.08150024712085724,
-0.06104223057627678,
-0.024612054228782654,
0.01647430658340454,
0.042753931134939194,
0.0... |
huggingtweets/ahleemuhleek | 92a8a43c0ecda8813c55854ca7732ca378a8790e | 2021-06-15T18:38:34.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/ahleemuhleek | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/ahleemuhleek/1623782310895/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1404846924226695174/_oELkFsx_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">##ahleeuwu</div>
<div style="text-align: center; font-size: 14px;">@ahleemuhleek</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from ##ahleeuwu.
| Data | ##ahleeuwu |
| --- | --- |
| Tweets downloaded | 480 |
| Retweets | 149 |
| Short tweets | 86 |
| Tweets kept | 245 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/17rz3rct/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ahleemuhleek's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/32bqa4q7) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/32bqa4q7/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/ahleemuhleek')
generator("My dream is", num_return_sequences=5)
```
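The widget prompt "My dream is" is only a default; any prompt works, and the pipeline accepts a list of prompts in one call. A small usage sketch (the `max_length` value is illustrative):

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/ahleemuhleek')
prompts = ["My dream is", "today i will", "honestly,"]
for completions in generator(prompts, num_return_sequences=1, max_length=40):
    print(completions[0]['generated_text'])
```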
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.016515742987394333,
0.11925161629915237,
-0.018216190859675407,
0.04785408452153206,
0.17687731981277466,
-0.017124246805906296,
-0.03714573383331299,
0.027608847245573997,
0.08616543561220169,
-0.06023795157670975,
0.001657278393395245,
0.07481399923563004,
0.013711405918002129,
-0.032... |
huggingtweets/ai_hexcrawl-gptmicrofic | 1a9e3236396bc4f4a02cd68f7d8163e8df84dbd4 | 2021-09-18T03:18:36.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/ai_hexcrawl-gptmicrofic | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/ai_hexcrawl-gptmicrofic/1631934945678/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1391882949650440200/lmEKl2ZQ_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1261895681561804800/r6vOZGoH_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">AI Hexcrawl & GPT2-Microfic</div>
<div style="text-align: center; font-size: 14px;">@ai_hexcrawl-gptmicrofic</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from AI Hexcrawl & GPT2-Microfic.
| Data | AI Hexcrawl | GPT2-Microfic |
| --- | --- | --- |
| Tweets downloaded | 737 | 1127 |
| Retweets | 26 | 9 |
| Short tweets | 1 | 9 |
| Tweets kept | 710 | 1109 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2cmbpada/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ai_hexcrawl-gptmicrofic's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/5g9tts1o) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/5g9tts1o/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/ai_hexcrawl-gptmicrofic')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.026711195707321167,
0.12074261158704758,
0.00014510411710944027,
0.04908686876296997,
0.17188464105129242,
-0.00742868147790432,
-0.03776717185974121,
0.051925498992204666,
0.06684886664152145,
-0.04712508246302605,
0.009442496113479137,
0.07756844162940979,
0.0248375553637743,
-0.02636... |
huggingtweets/ai_hexcrawl | c36d72fe170781e0295c1d10571338e2ceb618bd | 2021-12-15T19:46:29.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/ai_hexcrawl | 0 | null | transformers | ---
language: en
thumbnail: http://www.huggingtweets.com/ai_hexcrawl/1639597537705/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1467327234365181953/gFho8YCv_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">AI Hexcrawl</div>
<div style="text-align: center; font-size: 14px;">@ai_hexcrawl</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from AI Hexcrawl.
| Data | AI Hexcrawl |
| --- | --- |
| Tweets downloaded | 1164 |
| Retweets | 42 |
| Short tweets | 2 |
| Tweets kept | 1120 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/vdxugbwr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ai_hexcrawl's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/r9ejkubu) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/r9ejkubu/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/ai_hexcrawl')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.016031501814723015,
0.1169140562415123,
-0.020917965099215508,
0.04658784344792366,
0.1818363219499588,
-0.013920776546001434,
-0.046014562249183655,
0.0352175198495388,
0.07817843556404114,
-0.06028205156326294,
0.006141320802271366,
0.06805850565433502,
0.023380964994430542,
-0.031515... |
huggingtweets/aijritter | 3153b5661707eb048c005f77619120dc0c96bd1b | 2021-05-21T17:54:45.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/aijritter | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/aijritter/1619426792472/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374061160132186116/NV6XVCdH_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jritter AI π€ AI Bot </div>
<div style="font-size: 15px">@aijritter bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@aijritter's tweets](https://twitter.com/aijritter).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2484 |
| Retweets | 21 |
| Short tweets | 271 |
| Tweets kept | 2192 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/16pwaloe/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @aijritter's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1l866lhx) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1l866lhx/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/aijritter')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.07860233634710312,
0.1413174271583557,
0.06137623265385628,
0.018871568143367767,
0.14024607837200165,
-0.05670936033129692,
-0.015445773489773273,
-0.018774505704641342,
0.07710172235965729,
-0.060730986297130585,
-0.024160973727703094,
0.02894687093794346,
0.05329764634370804,
0.01583... |
huggingtweets/aimbotaimy-coldjiangshi-ladydarknest | d8e99c1661dab61ef909569281a2f02e2e9feed4 | 2021-07-25T20:00:21.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/aimbotaimy-coldjiangshi-ladydarknest | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/aimbotaimy-coldjiangshi-ladydarknest/1627243217316/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374872808136835072/hPahIg-A_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1409725677495009283/RPVDIGan_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1413348777243512833/dvnUJ-du_400x400.jpg')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">AimbotAimy ππ NSFW V-Tuber & Demon Lord Yeefi NSFWπ & ADMIRAL JIANGSHI πππΉπ΄ββ οΈ</div>
<div style="text-align: center; font-size: 14px;">@aimbotaimy-coldjiangshi-ladydarknest</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from AimbotAimy ππ NSFW V-Tuber & Demon Lord Yeefi NSFWπ & ADMIRAL JIANGSHI πππΉπ΄ββ οΈ.
| Data | AimbotAimy ππ NSFW V-Tuber | Demon Lord Yeefi NSFWπ | ADMIRAL JIANGSHI πππΉπ΄ββ οΈ |
| --- | --- | --- | --- |
| Tweets downloaded | 518 | 3242 | 2899 |
| Retweets | 60 | 957 | 1462 |
| Short tweets | 127 | 392 | 324 |
| Tweets kept | 331 | 1893 | 1113 |
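For a multi-user model like this one, the kept tweets from all three accounts are combined into a single training corpus, so the model learns a blend of the three voices. A rough sketch of that idea (the data structure is assumed, for illustration only):

```python
corpora = {
    "aimbotaimy":   ["kept tweet a1", "kept tweet a2"],
    "coldjiangshi": ["kept tweet c1"],
    "ladydarknest": ["kept tweet l1"],
}

# One flat corpus; fine-tuning on it blends all three voices.
combined = [text for user_tweets in corpora.values() for text in user_tweets]
print(len(combined))  # -> 4
```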
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/348if7b6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @aimbotaimy-coldjiangshi-ladydarknest's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1dzd34gb) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1dzd34gb/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/aimbotaimy-coldjiangshi-ladydarknest')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.030059175565838814,
0.1240207627415657,
-0.004137605894356966,
0.048687346279621124,
0.16292235255241394,
-0.006801746319979429,
-0.03814931586384773,
0.04976441338658333,
0.07509905844926834,
-0.04237264767289162,
0.011433148756623268,
0.07315599918365479,
0.020604519173502922,
-0.0341... |
huggingtweets/aimbotaimy | 59d61e5baae9e330d3b333aea223ccaab0f8d259 | 2021-07-25T03:52:26.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/aimbotaimy | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/aimbotaimy/1627185142630/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374872808136835072/hPahIg-A_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">AimbotAimy ππ NSFW V-Tuber</div>
<div style="text-align: center; font-size: 14px;">@aimbotaimy</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from AimbotAimy ππ NSFW V-Tuber.
| Data | AimbotAimy ππ NSFW V-Tuber |
| --- | --- |
| Tweets downloaded | 491 |
| Retweets | 59 |
| Short tweets | 125 |
| Tweets kept | 307 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/38rsh6x7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @aimbotaimy's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2sn41u12) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2sn41u12/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/aimbotaimy')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.013710466213524342,
0.12494046986103058,
-0.011005849577486515,
0.052725039422512054,
0.1646929234266281,
-0.015598144382238388,
-0.042459458112716675,
0.020611751824617386,
0.07904719561338425,
-0.0555218830704689,
-0.0055724382400512695,
0.06169319525361061,
0.019086405634880066,
-0.0... |
huggingtweets/ak92501-cafe_orbitinnit-ihatesinglets | ab989bbf7b95e6e0c03bea15fedd95f391149949 | 2021-09-07T00:03:08.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/ak92501-cafe_orbitinnit-ihatesinglets | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/ak92501-cafe_orbitinnit-ihatesinglets/1630972983357/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1429115399975497731/JZdA725e_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1433245625429204993/xzzFE2CJ_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1405992051427229698/V3W-1gOb_400x400.jpg')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">β¨γγ‘ Tommyβs an Orbit π γγ‘β¨ & everyone in the system this isnβt normal & AK</div>
<div style="text-align: center; font-size: 14px;">@ak92501-cafe_orbitinnit-ihatesinglets</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from β¨γγ‘ Tommyβs an Orbit π γγ‘β¨ & everyone in the system this isnβt normal & AK.
| Data | β¨γγ‘ Tommyβs an Orbit π γγ‘β¨ | everyone in the system this isnβt normal | AK |
| --- | --- | --- | --- |
| Tweets downloaded | 2256 | 1151 | 3250 |
| Retweets | 1350 | 78 | 403 |
| Short tweets | 323 | 352 | 464 |
| Tweets kept | 583 | 721 | 2383 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/mhwl02od/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ak92501-cafe_orbitinnit-ihatesinglets's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/m05466la) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/m05466la/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/ak92501-cafe_orbitinnit-ihatesinglets')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.021736830472946167,
0.11687157303094864,
0.004075618460774422,
0.04959217831492424,
0.17090177536010742,
-0.0015738561050966382,
-0.0370958186686039,
0.048021476715803146,
0.06855222582817078,
-0.04556339606642723,
0.004970015957951546,
0.08036049455404282,
0.02581227570772171,
-0.03248... |
huggingtweets/akasarahjean | 87987f6e915667461be5da2944be8694d7eb82ad | 2021-05-21T17:55:49.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/akasarahjean | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/akasarahjean/1603135242100/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1017476480501104640/KJ_2cey1_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Sarah Sweeney π€ AI Bot </div>
<div style="font-size: 15px; color: #657786">@akasarahjean bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@akasarahjean's tweets](https://twitter.com/akasarahjean).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1116 |
| Retweets | 358 |
| Short tweets | 68 |
| Tweets kept | 690 |
[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2hxdrlnu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @akasarahjean's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/38b2s9q1) for full transparency and reproducibility.
At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/38b2s9q1/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/akasarahjean')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.026738082990050316,
0.13008646667003632,
0.033971674740314484,
0.018366416916251183,
0.19117122888565063,
0.042836833745241165,
0.014106595888733864,
-0.020734578371047974,
0.11053051799535751,
-0.037853315472602844,
-0.05442594736814499,
0.044703803956508636,
0.013140866532921791,
-0.0... |
huggingtweets/alanwattsdaily | 848163ec1bd6434c826f340ee6d3e2f4dcbff849 | 2021-05-21T17:59:45.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/alanwattsdaily | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alanwattsdaily/1611766517715/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/974155432678785024/dFFYSfSi_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alan Watts π€ AI Bot </div>
<div style="font-size: 15px; color: #657786">@alanwattsdaily bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@alanwattsdaily's tweets](https://twitter.com/alanwattsdaily).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3248 |
| Retweets | 4 |
| Short tweets | 17 |
| Tweets kept | 3227 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3k8o9ly2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alanwattsdaily's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/32i7r9zd) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/32i7r9zd/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/alanwattsdaily')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.03166596591472626,
0.12641499936580658,
0.037238042801618576,
0.012726218439638615,
0.18249432742595673,
0.03894615173339844,
0.018707050010561943,
-0.02386365830898285,
0.10804968327283859,
-0.03684237226843834,
-0.05313262715935707,
0.0484473742544651,
0.010939119383692741,
-0.0335401... |
huggingtweets/albertletranger | a18994a96e5a9f4da4aa597d8eb27905e805859d | 2021-05-21T18:01:16.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/albertletranger | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/albertletranger/1616779907134/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1148966885024837635/8ihdfQKv_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Albert π€ AI Bot </div>
<div style="font-size: 15px">@albertletranger bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@albertletranger's tweets](https://twitter.com/albertletranger).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3230 |
| Retweets | 1299 |
| Short tweets | 362 |
| Tweets kept | 1569 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/x4s90a6l/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @albertletranger's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/10wrv1a0) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/10wrv1a0/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/albertletranger')
generator("My dream is", num_return_sequences=5)
```
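The pipeline forwards extra keyword arguments to `generate`, so you can tune the sampling behaviour per call; the values below are illustrative knobs, not project defaults:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/albertletranger')
generator("My dream is",
          num_return_sequences=5,
          do_sample=True,      # enable sampling so temperature takes effect
          temperature=0.9,     # <1.0 steadier, >1.0 more adventurous (assumed value)
          max_new_tokens=30)   # cap the length of each continuation (assumed value)
```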
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.07165650278329849,
0.1382661610841751,
0.04596170037984848,
0.032264549285173416,
0.14625801146030426,
-0.03371460363268852,
-0.023699980229139328,
-0.004812837112694979,
0.08109381794929504,
-0.07128383219242096,
-0.029331328347325325,
0.03463490307331085,
0.05423334240913391,
0.010079... |
huggingtweets/albertsstuff | 361e293c62e5b2461730c65cb2a862ccd8674e70 | 2021-08-02T03:04:23.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/albertsstuff | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/albertsstuff/1627873459813/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1410065266847985667/Sj4WiXAu_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">albert πΉπΌ</div>
<div style="text-align: center; font-size: 14px;">@albertsstuff</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from albert πΉπΌ.
| Data | albert πΉπΌ |
| --- | --- |
| Tweets downloaded | 3187 |
| Retweets | 240 |
| Short tweets | 825 |
| Tweets kept | 2122 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2e0c8502/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @albertsstuff's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2rsgjsom) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2rsgjsom/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/albertsstuff')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.013095054775476456,
0.12293854355812073,
-0.013666719198226929,
0.05186296999454498,
0.18119876086711884,
-0.011254193261265755,
-0.04296252503991127,
0.025147272273898125,
0.07308238744735718,
-0.057854678481817245,
-0.0004273451922927052,
0.06926488131284714,
0.022030938416719437,
-0.... |
huggingtweets/albinkurti | 06e708827101caacfcc9cc4c58af543ed9a223eb | 2022-02-11T11:38:45.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/albinkurti | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/albinkurti/1644579521299/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1425007522067386368/k0GygSdD_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Albin Kurti</div>
<div style="text-align: center; font-size: 14px;">@albinkurti</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Albin Kurti.
| Data | Albin Kurti |
| --- | --- |
| Tweets downloaded | 741 |
| Retweets | 32 |
| Short tweets | 11 |
| Tweets kept | 698 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1yhql26z/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @albinkurti's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/txe5baun) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/txe5baun/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/albinkurti')
generator("My dream is", num_return_sequences=5)
```
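Equivalently, you can load the tokenizer and model yourself and call `generate()` directly, which exposes the sampling parameters (the values here are illustrative):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/albinkurti")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/albinkurti")

inputs = tokenizer("My dream is", return_tensors="pt")
ids = model.generate(**inputs, do_sample=True, top_p=0.95, max_length=40,
                     pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(ids[0], skip_special_tokens=True))
```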
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.011941635049879551,
0.1209324523806572,
-0.010725048370659351,
0.052069321274757385,
0.17340555787086487,
-0.015927232801914215,
-0.03850587084889412,
0.02403826266527176,
0.07731087505817413,
-0.05738262087106705,
-0.003998477011919022,
0.06893149018287659,
0.015150942839682102,
-0.027... |
huggingtweets/albiuwu_ | 256454e32b490394d78b29eb5df81c0912800049 | 2021-05-21T18:03:38.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/albiuwu_ | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/albiuwu_/1617915531860/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1369997482000781312/kRWof8b8_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Albi πΈ π€ AI Bot </div>
<div style="font-size: 15px">@albiuwu_ bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@albiuwu_'s tweets](https://twitter.com/albiuwu_).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3248 |
| Retweets | 38 |
| Short tweets | 569 |
| Tweets kept | 2641 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1tndawti/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @albiuwu_'s tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/gswiupus) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/gswiupus/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/albiuwu_')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.07097163051366806,
0.1468629539012909,
0.04933378845453262,
0.02586081251502037,
0.11258750408887863,
-0.05421886220574379,
-0.008485815487802029,
-0.03540709987282753,
0.07010330259799957,
-0.06294522434473038,
-0.010394096374511719,
0.0005242694751359522,
0.07262744754552841,
0.016081... |
huggingtweets/aledaws | f023a200e333b1d34defb9339f1b8f09ab715f5d | 2021-05-21T18:04:48.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/aledaws | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/aledaws/1617245961730/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/480053897382199298/jZba2UiA_400x400.jpeg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alec Dawson π€ AI Bot </div>
<div style="font-size: 15px">@aledaws bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@aledaws's tweets](https://twitter.com/aledaws).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1155 |
| Retweets | 67 |
| Short tweets | 71 |
| Tweets kept | 1017 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3agqmwhg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @aledaws's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3xwitci1) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3xwitci1/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/aledaws')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.07162121683359146,
0.12425783276557922,
0.037269238382577896,
0.03557619825005531,
0.14490529894828796,
-0.032385118305683136,
-0.01443242933601141,
-0.005143319722265005,
0.07143355906009674,
-0.058991510421037674,
-0.0319230780005455,
0.028935173526406288,
0.061337292194366455,
0.0132... |
huggingtweets/alex73630 | 9a4bcc3ba7ac6b4f4cbaf962943ef457ddc36d4b | 2021-05-21T18:05:51.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/alex73630 | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alex73630/1600703549505/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css">
<style>
@media (prefers-color-scheme: dark) {
.prose { color: #E2E8F0 !important; }
.prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; }
}
</style>
<section class='prose'>
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1128605157602877441/R2nQEZZZ_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alexandre Sanchez 𦦠π€ AI Bot </div>
<div style="font-size: 15px; color: #657786">@alex73630 bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@alex73630's tweets](https://twitter.com/alex73630).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3160 |
| Retweets | 928 |
| Short tweets | 296 |
| Tweets kept | 1936 |
[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/ru1nivmp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alex73630's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/14qg9e3j) for full transparency and reproducibility.
At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/14qg9e3j/artifacts) is logged and versioned.
## Intended uses & limitations
### How to use
You can use this model directly with a pipeline for text generation:
<pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline
generator = pipeline(<span style="color:#FF9800">'text-generation'</span>,
model=<span style="color:#FF9800">'huggingtweets/alex73630'</span>)
generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre>
### Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.024524515494704247,
0.12385943531990051,
0.03308633714914322,
0.015817660838365555,
0.1881420761346817,
0.039661701768636703,
0.01711600087583065,
-0.02559250220656395,
0.10955144464969635,
-0.04353409260511398,
-0.053172364830970764,
0.04453757405281067,
0.010087879374623299,
-0.024031... |
huggingtweets/alexanderramek | 9e6b207777633e97f08c916386fb864f5f5459a8 | 2021-05-21T18:07:10.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/alexanderramek | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alexanderramek/1614096947716/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1063527363638525952/H-DKF-LP_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alex Ramek π€ AI Bot </div>
<div style="font-size: 15px">@alexanderramek bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@alexanderramek's tweets](https://twitter.com/alexanderramek).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 402 |
| Retweets | 171 |
| Short tweets | 60 |
| Tweets kept | 171 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1fmckgrk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alexanderramek's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3clt5uj2) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3clt5uj2/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/alexanderramek')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.07820460945367813,
0.13587811589241028,
0.044687941670417786,
0.022565646097064018,
0.1308201402425766,
-0.053444281220436096,
-0.006788935046643019,
-0.004764570854604244,
0.08292929083108902,
-0.05302538722753525,
-0.0432034432888031,
0.04842718318104744,
0.06276832520961761,
0.005257... |
huggingtweets/alexfiguii | 83288e353a5d9faac73852107d04ba35a0e0e71d | 2021-05-21T18:08:19.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/alexfiguii | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alexfiguii/1601463760497/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css">
<style>
@media (prefers-color-scheme: dark) {
.prose { color: #E2E8F0 !important; }
.prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; }
}
</style>
<section class='prose'>
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1231013808560197632/QRIgsFUE_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">NomaK96 π€ AI Bot </div>
<div style="font-size: 15px; color: #657786">@alexfiguii bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@alexfiguii's tweets](https://twitter.com/alexfiguii).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2867 |
| Retweets | 1539 |
| Short tweets | 127 |
| Tweets kept | 1201 |
[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/38gby6t0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alexfiguii's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3fa48eut) for full transparency and reproducibility.
At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3fa48eut/artifacts) is logged and versioned.
## Intended uses & limitations
### How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/alexfiguii')
generator("My dream is", num_return_sequences=5)
```
### Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.02964808978140354,
0.12649203836917877,
0.04240777716040611,
0.015024258755147457,
0.18857425451278687,
0.0342927984893322,
0.01809600740671158,
-0.028075125068426132,
0.10710977762937546,
-0.04261653497815132,
-0.051371991634368896,
0.04756701737642288,
0.012506665661931038,
-0.0296600... |
huggingtweets/alexip | 2a359ce77b010de064c3883ddbf771b0420208e2 | 2021-05-21T18:09:25.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/alexip | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alexip/1602315863564/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css">
<style>
@media (prefers-color-scheme: dark) {
.prose { color: #E2E8F0 !important; }
.prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; }
}
</style>
<section class='prose'>
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1186330591383474178/etcJHSkY_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alexis Perrier π€ AI Bot </div>
<div style="font-size: 15px; color: #657786">@alexip bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@alexip's tweets](https://twitter.com/alexip).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3199 |
| Retweets | 2059 |
| Short tweets | 49 |
| Tweets kept | 1091 |
[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/157sg90v/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alexip's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/1wz9te3l) for full transparency and reproducibility.
At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/1wz9te3l/artifacts) is logged and versioned.
## Intended uses & limitations
### How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/alexip')
generator("My dream is", num_return_sequences=5)
```
### Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.0271932240575552,
0.1306905746459961,
0.036690469831228256,
0.016197234392166138,
0.19667823612689972,
0.035559333860874176,
0.01072762906551361,
-0.022740330547094345,
0.10556210577487946,
-0.03867359086871147,
-0.05153863504528999,
0.04329659417271614,
0.013209880329668522,
-0.0275238... |
huggingtweets/alexisgallagher | c0a73e551c1bf4fe085a922b7da8187cc037029c | 2021-05-21T18:10:44.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/alexisgallagher | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alexisgallagher/1616871355671/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1274068177215827968/g9sB0dE1_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">alexis π€ AI Bot </div>
<div style="font-size: 15px">@alexisgallagher bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@alexisgallagher's tweets](https://twitter.com/alexisgallagher).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3250 |
| Retweets | 104 |
| Short tweets | 232 |
| Tweets kept | 2914 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/28ak07sx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alexisgallagher's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1kmu6pnu) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1kmu6pnu/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/alexisgallagher')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.0657438337802887,
0.129718616604805,
0.062170322984457016,
0.023989295586943626,
0.13982251286506653,
-0.04715922847390175,
-0.00557749206200242,
-0.017203211784362793,
0.0786721408367157,
-0.05887895077466965,
-0.023657726123929024,
0.038152240216732025,
0.05994030088186264,
0.01267161... |
huggingtweets/alexisuwualexis | 180cd75a427180239fe55000ad6cfd526e051399 | 2021-06-23T18:49:20.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/alexisuwualexis | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alexisuwualexis/1624474156240/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1337389555863982083/GFu_etbo_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Alexis (she/her) π³οΈββ§οΈ</div>
<div style="text-align: center; font-size: 14px;">@alexisuwualexis</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Alexis (she/her) 🏳️‍⚧️.
| Data | Alexis (she/her) 🏳️‍⚧️ |
| --- | --- |
| Tweets downloaded | 3219 |
| Retweets | 2988 |
| Short tweets | 64 |
| Tweets kept | 167 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/t0aheh4s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alexisuwualexis's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/18q8udnh) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/18q8udnh/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/alexisuwualexis')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.005517359357327223,
0.11624393612146378,
-0.016698455438017845,
0.04958360269665718,
0.17578713595867157,
-0.01402287743985653,
-0.034113895148038864,
0.03061354160308838,
0.08079325407743454,
-0.05354855954647064,
-0.002355964155867696,
0.06751032918691635,
0.01412932388484478,
-0.0341... |
huggingtweets/alexsalmond | 85beac7bbcbe0653796dca061f9c71841b3eb132 | 2021-05-21T18:11:47.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/alexsalmond | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alexsalmond/1617827259731/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/929801699623088129/gNlIjLwr_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alex Salmond π€ AI Bot </div>
<div style="font-size: 15px">@alexsalmond bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@alexsalmond's tweets](https://twitter.com/alexsalmond).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3194 |
| Retweets | 1155 |
| Short tweets | 19 |
| Tweets kept | 2020 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1fhlpwx8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alexsalmond's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2esw52d4) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2esw52d4/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/alexsalmond')
generator("My dream is", num_return_sequences=5)
```
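If you have a GPU, you can also ask the pipeline to run on it by passing a `device` index (this assumes a CUDA-enabled PyTorch install; omit the argument to stay on CPU):
```python
from transformers import pipeline

# device=0 selects the first CUDA GPU; omit it (or pass -1) to stay on CPU.
generator = pipeline('text-generation',
                     model='huggingtweets/alexsalmond',
                     device=0)
generator("My dream is", num_return_sequences=5)
```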
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.06696932762861252,
0.13506272435188293,
0.049687329679727554,
0.01848437637090683,
0.12782172858715057,
-0.05073140561580658,
0.006528119556605816,
-0.021764956414699554,
0.07460694015026093,
-0.06450192630290985,
-0.03640369325876236,
0.03896690905094147,
0.054580673575401306,
0.012897... |
huggingtweets/alexwadecraig | a6b9503f22b91d7c473cc6b687a39c7e07d3c337 | 2021-05-21T18:12:54.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/alexwadecraig | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alexwadecraig/1616646989893/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1104572830123986944/3eG16BFY_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alexander Wade Craig π€ AI Bot </div>
<div style="font-size: 15px">@alexwadecraig bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@alexwadecraig's tweets](https://twitter.com/alexwadecraig).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3220 |
| Retweets | 404 |
| Short tweets | 112 |
| Tweets kept | 2704 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3l824189/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alexwadecraig's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3kat9k6l) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3kat9k6l/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/alexwadecraig')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.08210081607103348,
0.13126088678836823,
0.04687642306089401,
0.00566358957439661,
0.13350294530391693,
-0.04982983320951462,
0.00035707702045328915,
-0.031095927581191063,
0.07633810490369797,
-0.04589865356683731,
-0.033363260328769684,
0.03492804616689682,
0.06121467053890228,
0.03074... |
huggingtweets/aleyda-cyrusshepard-johnmu | 64f0e3c5882530dcdccde7a296ecb90573d972a2 | 2021-09-24T15:06:39.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/aleyda-cyrusshepard-johnmu | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/aleyda-cyrusshepard-johnmu/1632495995480/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1241620963768201216/sG68m_iE_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1266844418281275392/9fhpx3n1_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1097426990699855873/lEI3EWIL_400x400.png')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Cyrus & Aleyda Solis π©π»βπ» & π§ John π§</div>
<div style="text-align: center; font-size: 14px;">@aleyda-cyrusshepard-johnmu</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Cyrus & Aleyda Solis 👩🏻‍💻 & π§ John π§.
| Data | Cyrus | Aleyda Solis 👩🏻‍💻 | π§ John π§ |
| --- | --- | --- | --- |
| Tweets downloaded | 3248 | 3247 | 3251 |
| Retweets | 343 | 995 | 358 |
| Short tweets | 729 | 128 | 267 |
| Tweets kept | 2176 | 2124 | 2626 |
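For multi-user "cyborg" models like this one, the kept tweets from each account are pooled into a single training corpus. Conceptually (a simplified sketch with placeholder strings, not the project's actual code):
```python
# Hypothetical per-account corpora standing in for the "Tweets kept" row above.
kept = {
    "cyrusshepard": ["tweet text ..."] * 2176,
    "aleyda":       ["tweet text ..."] * 2124,
    "johnmu":       ["tweet text ..."] * 2626,
}
# The cyborg model is fine-tuned on the accounts' kept tweets pooled together.
corpus = [tweet for tweets in kept.values() for tweet in tweets]
print(len(corpus))  # 6926
```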
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3jr2ggcg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @aleyda-cyrusshepard-johnmu's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/elwosmqy) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/elwosmqy/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/aleyda-cyrusshepard-johnmu')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.021227067336440086,
0.1191878393292427,
0.0068509443663060665,
0.04764067754149437,
0.17015522718429565,
-0.00807782169431448,
-0.04709240049123764,
0.04701976478099823,
0.07344798743724823,
-0.0473821647465229,
0.0018644951051101089,
0.08175960183143616,
0.018690504133701324,
-0.028260... |
huggingtweets/alfieghill1 | f31ad6b139f6edfa37f64bd5027f9ab600c99905 | 2021-05-21T18:14:01.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/alfieghill1 | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alfieghill1/1614109293232/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1321484463365361664/uJaI229z_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">π΄π³οΈβπAlfyπ³οΈβππ© π€ AI Bot </div>
<div style="font-size: 15px">@alfieghill1 bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@alfieghill1's tweets](https://twitter.com/alfieghill1).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3171 |
| Retweets | 1187 |
| Short tweets | 510 |
| Tweets kept | 1474 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2e2bmrwg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alfieghill1's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1n271342) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1n271342/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/alfieghill1')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.07302360236644745,
0.12156282365322113,
0.05038440227508545,
0.03723005950450897,
0.13234561681747437,
-0.05265577882528305,
-0.03136015310883522,
-0.005227937828749418,
0.08458453416824341,
-0.06652458757162094,
-0.02765835076570511,
0.024339698255062103,
0.06547190994024277,
0.0037696... |
huggingtweets/alice333ai-jj_visuals | d9781d77e4028fbf95a8ec8a65226e6724a21073 | 2021-07-06T20:56:55.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/alice333ai-jj_visuals | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alice333ai-jj_visuals/1625605011527/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1393311358293356546/tXc-X9fx_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1412466315240030217/yDDNt3-0_400x400.png')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">ποΈβ€ lison & JJ (comms closed)</div>
<div style="text-align: center; font-size: 14px;">@alice333ai-jj_visuals</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from ποΈβ€ lison & JJ (comms closed).
| Data | ποΈβ€ lison | JJ (comms closed) |
| --- | --- | --- |
| Tweets downloaded | 3216 | 3221 |
| Retweets | 1062 | 781 |
| Short tweets | 200 | 229 |
| Tweets kept | 1954 | 2211 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1sqkkxt9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alice333ai-jj_visuals's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/327x2oet) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/327x2oet/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/alice333ai-jj_visuals')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.010999677702784538,
0.11983342468738556,
-0.016963735222816467,
0.04924813657999039,
0.1774873435497284,
-0.009007330983877182,
-0.046366844326257706,
0.045744627714157104,
0.07490422576665878,
-0.05608106777071953,
0.005746565293520689,
0.07216138392686844,
0.026320116594433784,
-0.029... |
huggingtweets/aliceaeterna | 2df9942a88c880b4e9139d03165827c2e175f6cb | 2021-05-21T18:18:12.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/aliceaeterna | 0 | null | transformers | ---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1343482928014237696/51aKOINn_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">che π π€ AI Bot </div>
<div style="font-size: 15px">@aliceaeterna bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@aliceaeterna's tweets](https://twitter.com/aliceaeterna).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1419 |
| Retweets | 586 |
| Short tweets | 130 |
| Tweets kept | 703 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/26doepxr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @aliceaeterna's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1any0jue) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1any0jue/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/aliceaeterna')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.06084064021706581,
0.1648215800523758,
0.053272612392902374,
-0.004684921819716692,
0.12882383167743683,
-0.05410212278366089,
-0.001796083990484476,
-0.008716695010662079,
0.09242092072963715,
-0.04058220237493515,
-0.013727030716836452,
0.047046445310115814,
0.060019757598638535,
-0.0... |
huggingtweets/alicefromqueens | 4e2644f916959ce2f833220591bfc5476ab234b4 | 2021-07-21T21:38:57.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/alicefromqueens | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alicefromqueens/1626903533456/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1372804858068230149/aSZcjxvN_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Dread Alice</div>
<div style="text-align: center; font-size: 14px;">@alicefromqueens</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Dread Alice.
| Data | Dread Alice |
| --- | --- |
| Tweets downloaded | 3249 |
| Retweets | 50 |
| Short tweets | 511 |
| Tweets kept | 2688 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/frqs20kj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alicefromqueens's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2c7152gp) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2c7152gp/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/alicefromqueens')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.006241828203201294, 0.1223980188369751, -0.013187484815716743, 0.05634940043091774, 0.17388944327831268, -0.01563165709376335, -0.03895905986428261, 0.02388666942715645, 0.07818473875522614, -0.05459820106625557, -0.002030416391789913, 0.0690297856926918, 0.01855706050992012, -0.0306021... |
huggingtweets/alisonaharris | 4429a703c2f3ff484798b308c036c93050c4b210 | 2021-05-21T18:20:29.000Z | ["pytorch", "jax", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/alisonaharris | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alisonaharris/1617826834667/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1369323247519608836/MsoTG4Ir_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">AAH π€ AI Bot </div>
<div style="font-size: 15px">@alisonaharris bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@alisonaharris's tweets](https://twitter.com/alisonaharris).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1616 |
| Retweets | 763 |
| Short tweets | 85 |
| Tweets kept | 768 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2hmbkdpe/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alisonaharris's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2c6keq3v) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2c6keq3v/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/alisonaharris')
generator("My dream is", num_return_sequences=5)
```
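Sampling is stochastic, so repeated calls produce different text. If you need reproducible outputs (say, in a test suite), `transformers.set_seed` seeds the Python, NumPy, and PyTorch RNGs in one call; the seed value below is arbitrary:
```python
from transformers import pipeline, set_seed
generator = pipeline('text-generation',
                     model='huggingtweets/alisonaharris')
set_seed(42)  # fix all relevant RNGs
first = generator("My dream is", num_return_sequences=1)
set_seed(42)  # same seed -> same sampled tokens
second = generator("My dream is", num_return_sequences=1)
assert first[0]['generated_text'] == second[0]['generated_text']
```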
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.0803440511226654, 0.13019956648349762, 0.04906897619366646, 0.025420328602194786, 0.13344189524650574, -0.061219535768032074, -0.008760874159634113, -0.014976425096392632, 0.07599669694900513, -0.0503649078309536, -0.024349048733711243, 0.03999059647321701, 0.06412976235151291, 0.012723... |
huggingtweets/alisonselby_ | 352c2a7fc9b8f561925fbf4749d1752b3b0db524 | 2021-06-23T18:32:39.000Z | ["pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/alisonselby_ | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alisonselby_/1624473155604/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1406680256258482178/79-ZrVAg_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Alison Selby</div>
<div style="text-align: center; font-size: 14px;">@alisonselby_</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Alison Selby.
| Data | Alison Selby |
| --- | --- |
| Tweets downloaded | 3218 |
| Retweets | 319 |
| Short tweets | 290 |
| Tweets kept | 2609 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2e6i4sab/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alisonselby_'s tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/9gpt8ktz) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/9gpt8ktz/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/alisonselby_')
generator("My dream is", num_return_sequences=5)
```
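Generation keyword arguments are forwarded to GPT-2's `generate` method, so you can trade coherence against variety directly in the pipeline call. A sketch with commonly used knobs; the values are illustrative, not tuned for this model:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/alisonselby_')
outputs = generator("My dream is",
                    num_return_sequences=3,
                    do_sample=True,     # enable sampling
                    max_length=60,      # cap length in tokens
                    temperature=0.9,    # <1.0 is more conservative
                    top_p=0.95)         # nucleus sampling
for out in outputs:
    print(out['generated_text'])
```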
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.012442582286894321, 0.12246129661798477, -0.015633907169103622, 0.05934759974479675, 0.17176052927970886, -0.009571537375450134, -0.0401761494576931, 0.03387519344687462, 0.07642614841461182, -0.06176307797431946, -0.003806616412475705, 0.07305991649627686, 0.020927179604768753, -0.0297... |
huggingtweets/almostnora | 0bd1b476a79a0452500c39d5a7a9c6b20942c0f7 | 2021-05-21T18:22:14.000Z | ["pytorch", "jax", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/almostnora | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/almostnora/1616897539959/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1369015830000861191/gWkHCd-b_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">N.O.R.A π€ AI Bot </div>
<div style="font-size: 15px">@almostnora bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@almostnora's tweets](https://twitter.com/almostnora).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3230 |
| Retweets | 191 |
| Short tweets | 494 |
| Tweets kept | 2545 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3hy929cp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @almostnora's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3l9u4t5m) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3l9u4t5m/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/almostnora')
generator("My dream is", num_return_sequences=5)
```
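If you prefer the model and tokenizer over the pipeline helper (for example to control batching yourself), the checkpoint also loads through the standard auto classes; a minimal sketch:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained('huggingtweets/almostnora')
model = AutoModelForCausalLM.from_pretrained('huggingtweets/almostnora')
inputs = tokenizer("My dream is", return_tensors="pt")
output_ids = model.generate(**inputs, do_sample=True, max_length=50,
                            pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```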
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.08879169821739197, 0.13492271304130554, 0.06039849668741226, 0.018001999706029892, 0.13525961339473724, -0.05743483453989029, -0.017430074512958527, -0.015577005222439766, 0.0773109570145607, -0.056542664766311646, -0.018966970965266228, 0.001344171934761107, 0.06486347317695618, 0.0162... |
huggingtweets/alogins | 0d8d42537f530f80d9b2b6e74053748b14d96b1f | 2021-05-21T18:25:09.000Z | ["pytorch", "jax", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/alogins | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alogins/1616706593981/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1280197719571775488/IXebaRCu_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Arturs Logins π€ AI Bot </div>
<div style="font-size: 15px">@alogins bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@alogins's tweets](https://twitter.com/alogins).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1609 |
| Retweets | 133 |
| Short tweets | 177 |
| Tweets kept | 1299 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1ic2ynnv/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alogins's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/anvz7gt2) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/anvz7gt2/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/alogins')
generator("My dream is", num_return_sequences=5)
```
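On a CUDA machine, generation is considerably faster with the pipeline placed on the GPU; `device=0` selects the first GPU and `-1` falls back to CPU (standard pipeline behavior, nothing specific to this model):
```python
import torch
from transformers import pipeline
device = 0 if torch.cuda.is_available() else -1
generator = pipeline('text-generation',
                     model='huggingtweets/alogins',
                     device=device)
generator("My dream is", num_return_sequences=5)
```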
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.08365780115127563, 0.12057045102119446, 0.05020515248179436, 0.025239665061235428, 0.13593389093875885, -0.06493516266345978, 0.0013027683598920703, -0.01898225024342537, 0.07715868949890137, -0.04804391786456108, -0.025988668203353882, 0.03150876238942146, 0.06676100939512253, 0.008560... |
huggingtweets/alper | e2e638aaf2171438c7b04ba384b2ba8450a533f4 | 2021-05-21T18:27:52.000Z | ["pytorch", "jax", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/alper | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alper/1619479187969/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/711247322114609154/A2hfB3eL_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alper ΓuΔun-Gscheidel π΄π» π€ AI Bot </div>
<div style="font-size: 15px">@alper bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@alper's tweets](https://twitter.com/alper).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3250 |
| Retweets | 0 |
| Short tweets | 129 |
| Tweets kept | 3121 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/21a6dhyx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alper's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/rkrg672y) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/rkrg672y/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/alper')
generator("My dream is", num_return_sequences=5)
```
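The pipeline also accepts a list of prompts and returns one list of generations per prompt; the prompts below are arbitrary examples:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/alper')
prompts = ["My dream is", "The best thing about", "Today I learned"]
results = generator(prompts, num_return_sequences=2)
# results[i] holds two dicts for prompts[i]
for prompt, gens in zip(prompts, results):
    print(prompt, "->", [g['generated_text'] for g in gens])
```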
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.08783601969480515, 0.12845787405967712, 0.049489472061395645, 0.02019774354994297, 0.13424991071224213, -0.05679198354482651, -0.010219712741672993, -0.030652742832899094, 0.061222001910209656, -0.05394725501537323, -0.028537537902593613, 0.008810127153992653, 0.06604663282632828, -0.00... |
huggingtweets/alphaxchange-coinmarketcap-techcrunch | 217bd175bb80ae40a821697c88238926a22641cd | 2022-01-31T01:31:27.000Z | ["pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/alphaxchange-coinmarketcap-techcrunch | 0 | null | transformers | ---
language: en
thumbnail: http://www.huggingtweets.com/alphaxchange-coinmarketcap-techcrunch/1643592683390/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1475337078544248835/JRWM0Hsl_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1096066608034918401/m8wnTWsX_400x400.png')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1469027897209987081/fCdlufKH_400x400.jpg')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">CoinMarketCap & TechCrunch & AlphaExchange</div>
<div style="text-align: center; font-size: 14px;">@alphaxchange-coinmarketcap-techcrunch</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from CoinMarketCap & TechCrunch & AlphaExchange.
| Data | CoinMarketCap | TechCrunch | AlphaExchange |
| --- | --- | --- | --- |
| Tweets downloaded | 3249 | 3250 | 185 |
| Retweets | 247 | 29 | 25 |
| Short tweets | 209 | 9 | 17 |
| Tweets kept | 2793 | 3212 | 143 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1ii2008f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alphaxchange-coinmarketcap-techcrunch's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/28z1wzo5) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/28z1wzo5/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/alphaxchange-coinmarketcap-techcrunch')
generator("My dream is", num_return_sequences=5)
```
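To avoid re-downloading the weights on every run (or to use them offline), you can save the pipeline to a local directory and reload it from there; the directory name is just an example:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/alphaxchange-coinmarketcap-techcrunch')
generator.save_pretrained('local-tweet-model')  # writes model + tokenizer files
# Later, possibly on an offline machine:
generator = pipeline('text-generation', model='local-tweet-model')
```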
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.028908252716064453, 0.11854559183120728, 0.006002782378345728, 0.040556617081165314, 0.16673313081264496, -0.007651059422641993, -0.03534918650984764, 0.04568910598754883, 0.06761310994625092, -0.04531100019812584, 0.006478809751570225, 0.07897701859474182, 0.012655481696128845, -0.0330... |
huggingtweets/alt_kia | 38fc78e30facf1d4fe00547912eb136602fd55cb | 2021-05-21T18:29:13.000Z | ["pytorch", "jax", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/alt_kia | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/alt_kia/1616891056624/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1357168007055872000/QQez_OqS_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Kiaββ π€ AI Bot </div>
<div style="font-size: 15px">@alt_kia bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@alt_kia's tweets](https://twitter.com/alt_kia).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3243 |
| Retweets | 715 |
| Short tweets | 449 |
| Tweets kept | 2079 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2oea8dpz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alt_kia's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1aog3cgu) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1aog3cgu/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/alt_kia')
generator("My dream is", num_return_sequences=5)
```
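GPT-2 has no padding token, so the pipeline usually logs a "Setting pad_token_id to eos_token_id" message on every call. Passing the id explicitly silences the log without changing the output:
```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/alt_kia')
generator("My dream is",
          num_return_sequences=5,
          pad_token_id=generator.tokenizer.eos_token_id)
```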
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.0806451365351677, 0.14723652601242065, 0.052880749106407166, 0.02618546597659588, 0.13148805499076843, -0.04445322975516319, -0.0011636352865025401, -0.014780193567276001, 0.07518989592790604, -0.051449984312057495, -0.019757535308599472, 0.026232555508613586, 0.0750429630279541, 0.0032... |
huggingtweets/altcoinpsycho-digitalartchick-justintrimble | 6cdc1a1275425e7607e5618c084169cf456f32a0 | 2021-05-21T18:30:23.000Z | ["pytorch", "jax", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/altcoinpsycho-digitalartchick-justintrimble | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/altcoinpsycho-digitalartchick-justintrimble/1620934521680/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1388134163753185283/OrCvyNfy_400x400.png')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1004150565302034432/kRnEUZA8_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1343657798895366152/RMYAEzre_400x400.jpg')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">artchick.eth π₯ & Altcoin Psycho & JUSTIN</div>
<div style="text-align: center; font-size: 14px;">@altcoinpsycho-digitalartchick-justintrimble</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from artchick.eth 🔥 & Altcoin Psycho & JUSTIN.
| Data | artchick.eth 🔥 | Altcoin Psycho | JUSTIN |
| --- | --- | --- | --- |
| Tweets downloaded | 3250 | 3249 | 3248 |
| Retweets | 142 | 34 | 254 |
| Short tweets | 654 | 461 | 863 |
| Tweets kept | 2454 | 2754 | 2131 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3uuqza2m/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @altcoinpsycho-digitalartchick-justintrimble's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/gis597aj) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/gis597aj/artifacts) is logged and versioned.
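For the curious, here is a deliberately minimal sketch of what such a fine-tuning step can look like with the `transformers` Trainer; the file name and hyperparameters are illustrative assumptions, not the actual huggingtweets training script:
```python
from transformers import (GPT2LMHeadModel, GPT2TokenizerFast, TextDataset,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)
tokenizer = GPT2TokenizerFast.from_pretrained('gpt2')
model = GPT2LMHeadModel.from_pretrained('gpt2')
# 'tweets.txt' is a hypothetical file with one cleaned tweet per line.
dataset = TextDataset(tokenizer=tokenizer, file_path='tweets.txt',
                      block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)
args = TrainingArguments(output_dir='finetuned-gpt2',
                         num_train_epochs=4,
                         per_device_train_batch_size=1)
Trainer(model=model, args=args, data_collator=collator,
        train_dataset=dataset).train()
```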
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/altcoinpsycho-digitalartchick-justintrimble')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.03065500780940056, 0.11799260228872299, 0.011864648200571537, 0.04283557087182999, 0.16276682913303375, -0.0038168306928128004, -0.03721970319747925, 0.04134336858987808, 0.07264614850282669, -0.043942853808403015, 0.011620279401540756, 0.07207890599966049, 0.026052182540297508, -0.0335... |
huggingtweets/alterhuss-zainabverse | bee89b8dde542bb49d426cf2eb7886224caebd30 | 2021-12-14T07:46:28.000Z | ["pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/alterhuss-zainabverse | 0 | null | transformers | ---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1467618648961527812/jtH0RZpT_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1468367771746672643/21w6R4SP_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Alter Huss & Zainab</div>
<div style="text-align: center; font-size: 14px;">@alterhuss-zainabverse</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Alter Huss & Zainab.
| Data | Alter Huss | Zainab |
| --- | --- | --- |
| Tweets downloaded | 3229 | 3246 |
| Retweets | 125 | 95 |
| Short tweets | 1004 | 426 |
| Tweets kept | 2100 | 2725 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/8ibzokov/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @alterhuss-zainabverse's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3d8wr9hg) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3d8wr9hg/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/alterhuss-zainabverse')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.016837315633893013, 0.12209872901439667, -0.0027769706211984158, 0.037039536982774734, 0.15423373878002167, -0.014272862114012241, -0.04089462757110596, 0.04830095171928406, 0.05767175555229187, -0.045817382633686066, 0.00900938268750906, 0.09147445112466812, 0.02226235717535019, -0.027... |
huggingtweets/ambivalegenic-dril | dff0c6dc19b82f010a77078fd502f7de1d4db3a8 | 2021-12-10T06:25:10.000Z | ["pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/ambivalegenic-dril | 0 | null | transformers | ---
language: en
thumbnail: http://www.huggingtweets.com/ambivalegenic-dril/1639117433317/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1404698622579462144/8oiBunaK_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/847818629840228354/VXyQHfn0_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">tomboy housewives against cops & wint</div>
<div style="text-align: center; font-size: 14px;">@ambivalegenic-dril</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from tomboy housewives against cops & wint.
| Data | tomboy housewives against cops | wint |
| --- | --- | --- |
| Tweets downloaded | 3154 | 3226 |
| Retweets | 781 | 472 |
| Short tweets | 266 | 304 |
| Tweets kept | 2107 | 2450 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3m5g8gro/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ambivalegenic-dril's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/27fdnf8e) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/27fdnf8e/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/ambivalegenic-dril')
generator("My dream is", num_return_sequences=5)
```
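By default the pipeline echoes the prompt at the start of each generation. When you only want the model's continuation, the text-generation pipeline supports `return_full_text=False`:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/ambivalegenic-dril')
continuations = generator("My dream is",
                          num_return_sequences=5,
                          return_full_text=False)
print(continuations[0]['generated_text'])  # continuation only, no prompt
```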
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.020171424373984337, 0.12460372596979141, -0.0043433187529444695, 0.04671579599380493, 0.16922077536582947, -0.008179656229913235, -0.04322614520788193, 0.03402707725763321, 0.06991855055093765, -0.05865649878978729, 0.005411689169704914, 0.05902979150414467, 0.02815578505396843, -0.0233... |
huggingtweets/amccarty | 43bf91db2d43972a4a676febc3e838e5579ccc3d | 2021-05-21T18:36:59.000Z | ["pytorch", "jax", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/amccarty | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/amccarty/1617899959147/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/83933348/IMG00128_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alan McCarty π€ AI Bot </div>
<div style="font-size: 15px">@amccarty bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@amccarty's tweets](https://twitter.com/amccarty).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 569 |
| Retweets | 172 |
| Short tweets | 30 |
| Tweets kept | 367 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/l51uxin3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @amccarty's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1bw34kk4) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1bw34kk4/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/amccarty')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.08952639997005463, 0.07484665513038635, 0.013984338380396366, 0.02132679894566536, 0.1044657900929451, -0.0471324622631073, -0.008672913536429405, 0.012438924983143806, 0.05055795609951019, -0.05691347271203995, -0.024988893419504166, 0.01926717348396778, 0.1193636953830719, 0.022062201... |
huggingtweets/amelamelcia | b7391eae1beaee7c7a32dca4e28d8175433368db | 2021-11-26T18:07:27.000Z | ["pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/amelamelcia | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/amelamelcia/1637950041914/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1453350245383946240/cBFwCk3J_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Amelka</div>
<div style="text-align: center; font-size: 14px;">@amelamelcia</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Amelka.
| Data | Amelka |
| --- | --- |
| Tweets downloaded | 3244 |
| Retweets | 101 |
| Short tweets | 550 |
| Tweets kept | 2593 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/tomda94s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @amelamelcia's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1hxvf49x) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1hxvf49x/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/amelamelcia')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.012621995061635971, 0.12348531186580658, -0.010663469322025776, 0.05246175825595856, 0.17459161579608917, -0.017354344949126244, -0.03877642750740051, 0.02568625845015049, 0.07981575280427933, -0.0587117001414299, -0.0023687772918492556, 0.06889263540506363, 0.012454074807465076, -0.025... |
huggingtweets/amirism_ | 6fa0e53cc157119b1c5edd13fc559c8620a0f755 | 2021-05-21T18:41:30.000Z | ["pytorch", "jax", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/amirism_ | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/amirism_/1616611950115/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374784520742866949/RBO-C7n8_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Amir of Amirs π€ AI Bot </div>
<div style="font-size: 15px">@amirism_ bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@amirism_'s tweets](https://twitter.com/amirism_).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3246 |
| Retweets | 137 |
| Short tweets | 655 |
| Tweets kept | 2454 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3jwwptdm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @amirism_'s tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/jf0rjdbf) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/jf0rjdbf/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/amirism_')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.08090103417634964, 0.1491723656654358, 0.039563003927469254, 0.023945007473230362, 0.12381990998983383, -0.050244346261024475, 0.01639462448656559, -0.03453540429472923, 0.08235147595405579, -0.04765995219349861, -0.014125523157417774, 0.02915283851325512, 0.09057512134313583, 0.0200387... |
huggingtweets/ammienoot | bcfb6519f5e31737bb7c18404982beb7b2563c35 | 2021-05-21T18:42:36.000Z | ["pytorch", "jax", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/ammienoot | 0 | null | transformers | ---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1324792261775798272/hlRK8lBU_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Anne-Marie Scott π€ AI Bot </div>
<div style="font-size: 15px">@ammienoot bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@ammienoot's tweets](https://twitter.com/ammienoot).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3251 |
| Retweets | 355 |
| Short tweets | 209 |
| Tweets kept | 2687 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/372xzuxt/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ammienoot's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2l19ykmz) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2l19ykmz/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/ammienoot')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.05237507075071335, 0.13697127997875214, 0.03949644789099693, 0.007637371774762869, 0.1392798125743866, -0.053630635142326355, -0.0010457969037815928, -0.003126699011772871, 0.06559540331363678, -0.044251617044210434, -0.02680770866572857, 0.049650534987449646, 0.072818823158741, -0.0168... |
huggingtweets/amnananadeem-talal916 | a00604f4f64992e4c470d1e38fe9e50383d8fa65 | 2021-12-28T12:50:37.000Z | ["pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/amnananadeem-talal916 | 0 | null | transformers | ---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1433365322313043974/gPI08qaY_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1377835980552474624/sxTjuspv_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI CYBORG π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">halal talal & amna</div>
<div style="text-align: center; font-size: 14px;">@amnananadeem-talal916</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from halal talal & amna.
| Data | halal talal | amna |
| --- | --- | --- |
| Tweets downloaded | 3187 | 3132 |
| Retweets | 484 | 778 |
| Short tweets | 532 | 369 |
| Tweets kept | 2171 | 1985 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/42dvu161/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @amnananadeem-talal916's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2irbhtmu) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2irbhtmu/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/amnananadeem-talal916')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.019793834537267685, 0.12163417041301727, -0.004570086486637592, 0.03700568154454231, 0.1524764448404312, -0.01700167916715145, -0.03842278942465782, 0.044387150555849075, 0.05840850621461868, -0.04895315691828728, 0.006255538668483496, 0.09013775736093521, 0.022342078387737274, -0.02739... |
huggingtweets/amphydelic | e4d0492daeb5be90c63479b50c6d83874f09a314 | 2021-05-21T18:46:10.000Z | ["pytorch", "jax", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/amphydelic | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/amphydelic/1617771402481/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1377851124569370625/vh0fnxXt_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">amphy #nicechan π€ AI Bot </div>
<div style="font-size: 15px">@amphydelic bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@amphydelic's tweets](https://twitter.com/amphydelic).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3142 |
| Retweets | 770 |
| Short tweets | 711 |
| Tweets kept | 1661 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3o1nuvfq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @amphydelic's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3mitl8mt) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3mitl8mt/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/amphydelic')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [-0.07139414548873901, 0.12446684390306473, 0.07644746452569962, 0.011559643782675266, 0.11822977662086487, -0.05563689395785332, 0.0001725309994071722, -0.022496918216347694, 0.062393918633461, -0.05245301499962807, -0.011481932364404202, 0.010730808600783348, 0.07337991893291473, 0.031100... |
huggingtweets/analogcitizen | 26c4cbaaec31b6a4110cdd19e9fd051f17a5bebf | 2021-05-21T18:50:57.000Z | ["pytorch", "jax", "gpt2", "text-generation", "en", "transformers", "huggingtweets"] | text-generation | false | huggingtweets | null | huggingtweets/analogcitizen | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/analogcitizen/1617805157885/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1304485450103439360/mD4PsYPQ_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Clara, Social Distancing World Champ (2010-2019) π€ AI Bot </div>
<div style="font-size: 15px">@analogcitizen bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@analogcitizen's tweets](https://twitter.com/analogcitizen).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2997 |
| Retweets | 1309 |
| Short tweets | 189 |
| Tweets kept | 1499 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3od4vbha/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @analogcitizen's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1del2d6l) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1del2d6l/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/analogcitizen')
generator("My dream is", num_return_sequences=5)
```
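If you want explicit control over tokenization, decoding, or generation arguments, the same checkpoint also loads through the standard auto classes. A sketch — the generation settings below are illustrative defaults, not the ones used by the hosted widget:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/analogcitizen")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/analogcitizen")

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(**inputs,
                         do_sample=True,   # sample rather than greedy-decode
                         max_length=50,    # cap on total tokens, prompt included
                         top_p=0.95,       # nucleus sampling
                         pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```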
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.07397764176130295,
0.11524303257465363,
0.05318913608789444,
0.0038713954854756594,
0.11078394949436188,
-0.022569740191102028,
-0.0164334774017334,
-0.007503392640501261,
0.05447487160563469,
-0.06319257616996765,
-0.01888887956738472,
0.016982221975922585,
0.06673803925514221,
0.00605... |
huggingtweets/anarchystax | 3cca20014e51473e8ecdde8d4c954b00219d0b9f | 2021-05-21T18:52:34.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/anarchystax | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/anarchystax/1616622386680/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1372091789549654016/L09IStLl_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Terra π§π΄ββ οΈ π€ AI Bot </div>
<div style="font-size: 15px">@anarchystax bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@anarchystax's tweets](https://twitter.com/anarchystax).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 239 |
| Retweets | 59 |
| Short tweets | 43 |
| Tweets kept | 137 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1ouqtufl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @anarchystax's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3d1tkfmr) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3d1tkfmr/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/anarchystax')
generator("My dream is", num_return_sequences=5)
```
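Sampling is stochastic, so repeated calls return different completions. If you need reproducible output (in a test suite, say), fix the seed first — a minimal sketch:
```python
from transformers import pipeline, set_seed

set_seed(42)  # makes the sampled continuations repeatable across runs
generator = pipeline('text-generation', model='huggingtweets/anarchystax')
generator("My dream is", num_return_sequences=5)
```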
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.0877927839756012,
0.1222802996635437,
0.05295111611485481,
0.024111466482281685,
0.13144010305404663,
-0.07357094436883926,
-0.01627323403954506,
-0.028605259954929352,
0.05279506742954254,
-0.014080792665481567,
-0.011812449432909489,
0.0007214893703348935,
0.09405898302793503,
0.01778... |
huggingtweets/andevereaux | 5e839d692987ee0dc7609c48a4f066db7ad1a539 | 2021-05-21T18:55:21.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/andevereaux | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/andevereaux/1617929324096/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1376978076962291717/HedQhFmm_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Abigail Devereaux π΄ π³οΈβππΏοΈ π€ π€ AI Bot </div>
<div style="font-size: 15px">@andevereaux bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@andevereaux's tweets](https://twitter.com/andevereaux).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3239 |
| Retweets | 359 |
| Short tweets | 240 |
| Tweets kept | 2640 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1q4g34cr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @andevereaux's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3dbw2lmp) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3dbw2lmp/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/andevereaux')
generator("My dream is", num_return_sequences=5)
```
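The pipeline forwards extra keyword arguments to `model.generate()`, so you can shape the sampling without leaving the one-liner. The values below are illustrative, not tuned settings:
```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/andevereaux')
generator("My dream is",
          num_return_sequences=5,
          do_sample=True,    # sample instead of greedy decoding
          max_length=60,     # cap on total tokens, prompt included
          temperature=0.9,   # <1.0 makes sampling slightly more conservative
          top_p=0.95)        # nucleus sampling: keep the top 95% of mass
```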
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.08582831919193268,
0.12135637551546097,
0.060944683849811554,
0.009968851692974567,
0.11707807332277298,
-0.04585306718945503,
-0.017059842124581337,
-0.0416102334856987,
0.06386645138263702,
-0.053998157382011414,
-0.0052412874065339565,
0.005765899550169706,
0.07262026518583298,
0.006... |
huggingtweets/andrewcuomo | 01043194e777fa5ca5ad640e21d3991c6cc425ff | 2021-05-21T18:58:21.000Z | [
"pytorch",
"jax",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/andrewcuomo | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/andrewcuomo/1619299470278/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/999284567369383936/Zm7tWU0S_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Andrew Cuomo π€ AI Bot </div>
<div style="font-size: 15px">@andrewcuomo bot</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on [@andrewcuomo's tweets](https://twitter.com/andrewcuomo).
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1074 |
| Retweets | 353 |
| Short tweets | 9 |
| Tweets kept | 712 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2slpq0r3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @andrewcuomo's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/39xi2g7u) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/39xi2g7u/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/andrewcuomo')
generator("My dream is", num_return_sequences=5)
```
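Generation is noticeably faster on a GPU. A sketch assuming a CUDA device is available; `device=0` selects the first GPU, and omitting it (or passing `device=-1`) stays on CPU:
```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/andrewcuomo',
                     device=0)  # first CUDA GPU; use -1 (the default) for CPU
generator("My dream is", num_return_sequences=5)
```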
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.07047632336616516,
0.12811774015426636,
0.04741249606013298,
0.02849767729640007,
0.1348138451576233,
-0.05138963833451271,
-0.01337424572557211,
-0.010811308398842812,
0.07558251917362213,
-0.052339326590299606,
-0.02693992853164673,
0.033543772995471954,
0.06408485025167465,
0.0150317... |
huggingtweets/angadc | 84c401cf1582f5a330c848cb7eb90e9f7da90cb7 | 2021-11-01T19:02:49.000Z | [
"pytorch",
"gpt2",
"text-generation",
"en",
"transformers",
"huggingtweets"
] | text-generation | false | huggingtweets | null | huggingtweets/angadc | 0 | null | transformers | ---
language: en
thumbnail: https://www.huggingtweets.com/angadc/1635793364907/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1450701284223324169/JBNbe32v_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">π€ AI BOT π€</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Angad Singh Chowdhry</div>
<div style="text-align: center; font-size: 14px;">@angadc</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Angad Singh Chowdhry.
| Data | Angad Singh Chowdhry |
| --- | --- |
| Tweets downloaded | 3229 |
| Retweets | 567 |
| Short tweets | 685 |
| Tweets kept | 1977 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1wsxza1p/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @angadc's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2ck4g0as) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2ck4g0as/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/angadc')
generator("My dream is", num_return_sequences=5)
```
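Since the model is a fine-tune of GPT-2 rather than a from-scratch train, comparing it against the base checkpoint shows what the tweet data contributed. A sketch:
```python
from transformers import pipeline

base = pipeline('text-generation', model='gpt2')
tuned = pipeline('text-generation', model='huggingtweets/angadc')

prompt = "My dream is"
print("base :", base(prompt, max_length=40)[0]['generated_text'])
print("tuned:", tuned(prompt, max_length=40)[0]['generated_text'])
```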
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
| [
-0.012907822616398335,
0.12171672284603119,
-0.015829257667064667,
0.05646442994475365,
0.17465698719024658,
-0.010299096815288067,
-0.04426580294966698,
0.03011980652809143,
0.07669807970523834,
-0.06028084084391594,
-0.0020007395651191473,
0.07135384529829025,
0.01741994172334671,
-0.029... |