---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:16000
- loss:DenoisingAutoEncoderLoss
base_model: google-bert/bert-base-uncased
widget:
- source_sentence: can so hopeless to who cares
sentences:
- id done that though it kind of did a on me and i found myself sympathizing with
the demons as the church called them and feeling more disgusted with the people
who were supposed to be trying to fight them off
- i can go from feeling so hopeless to so damned hopeful just from being around
someone who cares and is awake
- i feel quite honored to exhibit my work in portugal especially within the critical
and philosophical context of the god factor project said west
- source_sentence: im feeling regretful not back i exact things you i would also to
you letters
sentences:
- i feel like people dont really want me in their company but also they dont want
to hurt my feelings
- i continue to succeed in something and having someone seems unattainable because
i feel men will be intimidated or when there is a prolonged moment of silence
- im feeling regretful about not writing back to you i felt the exact same things
you did and i would have also loved to have you read my letters
- source_sentence: feel there not because or gary feel i moving them
sentences:
- i feel so unwelcome there but not because of her or gary i just feel that i shouldnt
be moving back in with them
- i dont know why but every time i feel like i am doing someone a favor all the
time i start to feel burdened and stressed by that
- id have spent more time with her on reading i feel a bit guilty about that
- source_sentence: came diy twiggy holder feeling all and
sentences:
- i watch movies set in the s and s i feel pangs of melancholy
- i came across this picture of a diy twiggy candle holder and now im feeling all
festive and creative
- i read other peoples posts there are moments where i feel id give my left fingernail
to be them my left fingernail is precious because its the only one i can polish
perfectly out of the
- source_sentence: i missed precious summer
sentences:
- i feel so frightened i just wanted to document the way i m feeling
- i really feel like i have a lot to offer in this area i would like to focus on
troubled teenagers
- i feel like i missed most of my precious summer
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---
# SentenceTransformer based on google-bert/bert-base-uncased
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
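Note the Pooling block above has `pooling_mode_cls_token: True`: the sentence embedding is the hidden state of the first (`[CLS]`) token, not a mean over all tokens. A minimal NumPy sketch of that pooling step (shapes are illustrative):

```python
import numpy as np

def cls_pooling(token_embeddings: np.ndarray) -> np.ndarray:
    """CLS pooling: the sentence embedding is the first token's hidden state.

    token_embeddings: (seq_len, hidden_dim) array of per-token hidden states.
    """
    return token_embeddings[0]

# Toy example: 4 tokens, 768-dim hidden states
hidden = np.random.rand(4, 768)
sentence_embedding = cls_pooling(hidden)
print(sentence_embedding.shape)  # (768,)
```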
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("KiViDrag/pretrain_emotion2")
# Run inference
sentences = [
'i missed precious summer',
'i feel like i missed most of my precious summer',
'i feel so frightened i just wanted to document the way i m feeling',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
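`model.similarity` uses cosine similarity here (see the Similarity Function above). For reference, a minimal NumPy sketch of the equivalent pairwise computation:

```python
import numpy as np

def cosine_similarity_matrix(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity between the rows of a and the rows of b."""
    a_norm = a / np.linalg.norm(a, axis=1, keepdims=True)
    b_norm = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a_norm @ b_norm.T

emb = np.random.rand(3, 768)
sims = cosine_similarity_matrix(emb, emb)
print(sims.shape)  # (3, 3)
# The diagonal is 1.0: each embedding is maximally similar to itself
```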
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 16,000 training samples
* Columns: sentence_0 and sentence_1
* Approximate statistics based on the first 1000 samples:
  | | sentence_0 | sentence_1 |
  |:-----|:-----------|:-----------|
  | type | string | string |
* Samples:
  | sentence_0 | sentence_1 |
  |:---------------------------------------------------|:--------------------------------------------------------------------------------------------------------|
  | <code>i</code> | <code>i like to slump into when i m feeling precious</code> |
  | <code>i say make anyone feel reaching their</code> | <code>i could say that will make anyone feel better than actually reaching their goal themselves</code> |
  | <code>wont</code> | <code>i wont feel so damn idiotic</code> |
* Loss: [DenoisingAutoEncoderLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#denoisingautoencoderloss)
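With `DenoisingAutoEncoderLoss` (TSDAE-style training), the model learns to reconstruct the clean sentence (`sentence_1`) from a corrupted version (`sentence_0`); the corruption is typically random word deletion, which matches the truncated `sentence_0` samples above. A minimal sketch of such a deletion-noise function — the 0.6 deletion ratio is an illustrative assumption, not a value taken from this card:

```python
import random

def delete_words(text: str, deletion_ratio: float = 0.6, seed=None) -> str:
    """Randomly delete a fraction of whitespace-separated tokens.

    Mimics the word-deletion noise used in TSDAE-style denoising
    pretraining; the default ratio here is an illustrative assumption.
    """
    rng = random.Random(seed)
    words = text.split()
    kept = [w for w in words if rng.random() > deletion_ratio]
    # Keep at least one word so the noisy input is never empty
    if not kept:
        kept = [rng.choice(words)]
    return " ".join(kept)

print(delete_words("i feel like i missed most of my precious summer", seed=0))
```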
### Training Hyperparameters
#### Non-Default Hyperparameters
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `num_train_epochs`: 9
- `multi_dataset_batch_sampler`: round_robin
#### All Hyperparameters