Update README.md
README.md
CHANGED
@@ -23,6 +23,11 @@ This model was merged using the SLERP merge method.
 The following models were included in the merge:
 * [lucyknada/microsoft_WizardLM-2-7B](https://huggingface.co/lucyknada/microsoft_WizardLM-2-7B)
 * [mergekit-community/TopEvolution](https://huggingface.co/mergekit-community/TopEvolution)
+
+I arrived at this model after carefully evaluating language and behavior: I first chose two models, then combined their merge with the result of two other models, and TopEvolutionWiz was born.
+The model has demonstrated remarkable empathic and reasoning skills; it uses fluent language and adapts to any scenario.
+It responded positively to 50 out of 50 questions of a general, historical, psychological, etc. nature.
+It also made a very good impression on GPT-4o.
 
 ### Configuration
 
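The diff cuts off before the actual merge configuration. For readers unfamiliar with mergekit, a SLERP merge of these two models is typically described by a YAML file along the following lines; the `layer_range`, `base_model`, interpolation factor `t`, and `dtype` shown here are illustrative assumptions, not the settings actually used for TopEvolutionWiz:

```yaml
# Illustrative mergekit SLERP configuration (a sketch; values are assumptions,
# not the actual TopEvolutionWiz settings).
slices:
  - sources:
      - model: lucyknada/microsoft_WizardLM-2-7B
        layer_range: [0, 32]   # assumed layer count for a 7B Mistral-style model
      - model: mergekit-community/TopEvolution
        layer_range: [0, 32]
merge_method: slerp
base_model: mergekit-community/TopEvolution   # assumed base model
parameters:
  t: 0.5   # interpolation factor: 0 keeps the base model, 1 the other model
dtype: bfloat16
```

With mergekit installed, a file like this is applied with `mergekit-yaml config.yml ./output-model`, which writes the merged weights to the output directory.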