---
base_model:
- mistralai/Mistral-7B-Instruct-v0.3
datasets:
- noystl/Recombination-Extraction
language:
- en
library_name: transformers
license: cc
pipeline_tag: text-generation
---

This repository contains a LoRA adapter fine-tuned on top of [mistralai/Mistral-7B-Instruct-v0.3](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3) for extracting recombination examples from scientific abstracts, as described in the paper [CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature](https://huggingface.co/papers/2505.20779).

The model performs an information extraction task: identifying recombination examples within scientific text. For detailed usage instructions and for reproducing the paper's results, please refer to the GitHub repository linked below.
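A minimal inference sketch using the standard `transformers` + `peft` loading pattern. The adapter id is a placeholder for this repository's id, and the instruction wording is illustrative, not the exact prompt used in the paper (see the GitHub repository for the canonical pipeline):

```python
# Minimal usage sketch: load the base model, attach the LoRA adapter,
# and prompt it with a scientific abstract.
# ADAPTER_ID is a placeholder -- replace it with this repository's id.

BASE_ID = "mistralai/Mistral-7B-Instruct-v0.3"
ADAPTER_ID = "<this-repo-id>"  # placeholder, not the real repo id

def build_prompt(abstract: str) -> str:
    """Wrap an abstract in Mistral's [INST] ... [/INST] instruction format.
    The instruction text here is illustrative only."""
    instruction = (
        "Extract any recombination examples from the following "
        "scientific abstract:"
    )
    return f"[INST] {instruction}\n\n{abstract} [/INST]"

def load_model():
    """Load the base model and attach the LoRA adapter with peft."""
    # Imports are deferred so build_prompt works without the heavy deps.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
    model = AutoModelForCausalLM.from_pretrained(BASE_ID, device_map="auto")
    model = PeftModel.from_pretrained(model, ADAPTER_ID)
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_model()
    prompt = build_prompt("We combine graph neural networks with ...")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=256, do_sample=False)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```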

**Bibtex**
```bibtex
@misc{sternlicht2025chimeraknowledgebaseidea,
      title={CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature}, 
      author={Noy Sternlicht and Tom Hope},
      year={2025},
      eprint={2505.20779},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2505.20779}, 
}
```

**Quick Links**
- 🌐 [Project](https://noy-sternlicht.github.io/CHIMERA-Web)
- 📃 [Paper](https://arxiv.org/abs/2505.20779)
- 🛠️ [Code](https://github.com/noy-sternlicht/CHIMERA-KB)