NOTICE
======

This repository contains LoRA adapter weights for EpiMistral-7B.

BASE MODEL LICENSE
==================

The base model (Mistral-7B-OpenOrca, based on Mistral 7B) is licensed under
the Apache License 2.0.

Copyright (c) Mistral AI (Mistral 7B base model)
Copyright (c) OpenOrca (instruction-tuned variant)

The Apache License 2.0 is a permissive open-source license that allows:

- Commercial use
- Modification
- Distribution
- Patent use
- Private use

Requirements:

- Preserve copyright and license notices
- State significant changes made to the software
- Include a copy of the Apache 2.0 license

The full text of the Apache License 2.0 can be found at:
https://www.apache.org/licenses/LICENSE-2.0

ADAPTER WEIGHTS LICENSE
=======================

The LoRA adapter weights contained in this repository are released under:

CC0 1.0 Universal (Public Domain Dedication)
https://creativecommons.org/publicdomain/zero/1.0/

DISTRIBUTION
============

This repository distributes only the fine-tuned LoRA adapter parameters.
The base model weights must be obtained separately from Hugging Face:

- Open-Orca/Mistral-7B-OpenOrca

The base model weights remain subject to the Apache License 2.0.