Improve model card with metadata and description
This PR adds metadata to the model card, improving discoverability and usability. It also enhances the README with a clearer description of the model and its purpose. The license is set to MIT.
README.md CHANGED
````diff
@@ -1,4 +1,16 @@
-
+---
+license: mit
+library_name: transformers
+pipeline_tag: text-generation
+---
+
+This Hugging Face repository contains a fine-tuned Mistral model trained to extract recombination examples from scientific abstracts, as described in the paper [CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature](https://huggingface.co/papers/2505.20779). The model uses a LoRA adapter on top of a Mistral base model.
+
+Project page: https://noy-sternlicht.github.io/CHIMERA-Web
+
+Code: https://github.cs.huji.ac.il/tomhope-lab/CHIMERA
+
+The model can be used for the information extraction task of identifying recombination examples in scientific text. For detailed usage instructions and for reproducing results, please refer to the GitHub repository linked above.
 
 ```bibtex
 @misc{sternlicht2025chimeraknowledgebaseidea,
````