---
base_model:
- mistralai/Mistral-7B-Instruct-v0.3
datasets:
- noystl/Recombination-Extraction
language:
- en
license: cc-by-4.0
metrics:
- accuracy
pipeline_tag: feature-extraction
library_name: transformers
---
This repository hosts a fine-tuned Mistral model that classifies scientific abstracts by whether they involve idea recombination, as introduced in the paper [CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature](https://arxiv.org/abs/2505.20779). The model consists of a LoRA adapter on top of the Mistral-7B-Instruct-v0.3 base model.

For detailed usage instructions and to reproduce the results, please refer to the linked GitHub repository.
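As a rough sketch of how a LoRA adapter like this one is typically loaded with `transformers` and `peft`: the adapter repo id below is a placeholder, and the prompt template in `build_prompt` is an illustrative assumption, not the exact format used for fine-tuning (see the GitHub repository for the real pipeline).

```python
def build_prompt(abstract: str) -> str:
    """Wrap an abstract in a Mistral-style instruction prompt.

    NOTE: this template is an assumed example; the actual prompt used
    during fine-tuning is defined in the project's GitHub repository.
    """
    return (
        "[INST] Does the following scientific abstract describe idea "
        f"recombination? Answer 'yes' or 'no'.\n\n{abstract} [/INST]"
    )

if __name__ == "__main__":
    # Heavy imports kept inside the guard so the helper above stays lightweight.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    base_id = "mistralai/Mistral-7B-Instruct-v0.3"
    base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
    # Placeholder adapter id -- substitute this repository's id.
    model = PeftModel.from_pretrained(base, "path/to/this-adapter-repo")
    tokenizer = AutoTokenizer.from_pretrained(base_id)

    inputs = tokenizer(
        build_prompt("We combine retrieval methods with symbolic planners..."),
        return_tensors="pt",
    ).to(model.device)
    output = model.generate(**inputs, max_new_tokens=8)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The adapter weights are small, so only the base model dominates download and memory cost; `PeftModel.from_pretrained` merges the adapter onto the already-loaded base at runtime.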
**Bibtex**
```bibtex
@misc{sternlicht2025chimeraknowledgebaseidea,
  title={CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature},
  author={Noy Sternlicht and Tom Hope},
  year={2025},
  eprint={2505.20779},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2505.20779},
}
```
**Quick Links**
- 🌐 [Project](https://noy-sternlicht.github.io/CHIMERA-Web)
- 📄 [Paper](https://arxiv.org/abs/2505.20779)
- 🛠️ [Code](https://github.com/noy-sternlicht/CHIMERA-KB)