| ---
|
| license: mit
|
| ---
|
| # StoryEmb
|
| This is the model from [our paper](https://www.inf.uni-hamburg.de/en/inst/ab/lt/publications/2024-hatzel-et-al-emnlp.pdf) "Story Embeddings — Narrative-Focused Representations of Fictional Stories". We publish the variant trained on augmented data.
|
| You can use this just like the base model [intfloat/e5-mistral-7b-instruct](https://huggingface.co/intfloat/e5-mistral-7b-instruct).
|
We trained with the task prefix "Retrieve stories with a similar narrative to the given story: ", so we recommend using this prefix for story retrieval.
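A minimal sketch of how encoding might look, assuming the base model's documented usage pattern (an instruction-prefixed query template and last-token pooling); the model id below is a placeholder using the base model's repo, so substitute this model's repo id:

```python
# Hedged sketch following the usage convention of the base model
# intfloat/e5-mistral-7b-instruct. The instruction template and pooling
# strategy are the base model's documented conventions, not verified
# against this model's training code.

TASK = "Retrieve stories with a similar narrative to the given story: "


def build_query(story: str) -> str:
    """Wrap a query story in the base model's instruction template."""
    return f"Instruct: {TASK}\nQuery: {story}"


def embed(texts: list[str], model_id: str = "intfloat/e5-mistral-7b-instruct"):
    """Return L2-normalised embeddings (heavy deps imported lazily)."""
    import torch
    import torch.nn.functional as F
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    tokenizer.padding_side = "right"  # so last-token indexing below is valid
    model = AutoModel.from_pretrained(model_id)

    batch = tokenizer(texts, max_length=4096, padding=True,
                      truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**batch)

    # Last-token pooling: take the hidden state of each sequence's final
    # non-padding token, then normalise for cosine similarity.
    lengths = batch["attention_mask"].sum(dim=1) - 1
    pooled = out.last_hidden_state[
        torch.arange(out.last_hidden_state.shape[0]), lengths]
    return F.normalize(pooled, p=2, dim=1)
```

With normalised embeddings, similarity between two stories is simply the dot product of their vectors.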
|
| If you are interested in just the adapter weights (e.g. for continued fine-tuning), check out the directory `adapter-weights`.
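For continued fine-tuning, loading the adapters onto the base model might look like the sketch below; the local `adapter-weights` path matches the directory named above, but the use of PEFT's `PeftModel` here is an assumption based on common practice:

```python
# Hedged sketch: attaching the published adapter weights to the base model.
# ADAPTER_PATH assumes a local copy of the repo's `adapter-weights`
# directory; PEFT/LoRA usage is an assumption, not confirmed by the card.
BASE_MODEL = "intfloat/e5-mistral-7b-instruct"
ADAPTER_PATH = "adapter-weights"


def load_for_finetuning():
    """Load the base model with the adapter attached and trainable."""
    from transformers import AutoModel
    from peft import PeftModel

    base = AutoModel.from_pretrained(BASE_MODEL)
    # is_trainable=True keeps the adapter parameters unfrozen so training
    # can continue from the published checkpoint
    return PeftModel.from_pretrained(base, ADAPTER_PATH, is_trainable=True)
```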
|
Do not hesitate to reach out if you encounter any issues! You are likely one of the first few people actually downloading this :)
|
| ## Citation
|
If you make use of this model in a publication, please cite our paper:
|
| ```
|
| @inproceedings{hatzel-biemann-2024-story-embeddings,
|
| title = "Story Embeddings -- Narrative-Focused Representations of Fictional Stories",
|
| author = "Hatzel, Hans Ole and Biemann, Chris",
|
| booktitle = "Proceedings of the 62st Annual Meeting of the Association for Computational Linguistics",
|
| year = "2024",
|
| address = "Miami, Florida",
|
| publisher = "Association for Computational Linguistics",
|
| }
|
| ```
|