CultriX committed
Commit 6d310d8 · verified · 1 Parent(s): aabd58c

Update README.md

Files changed (1)
  1. README.md +11 -7
README.md CHANGED
@@ -17,21 +17,25 @@ The following is a quote directly taken from that model's page:
 
 In my understanding the Winogrande scores are only slightly influenced by the DPO contamination, which has the "side effect" of increasing the scores on the other benchmarks.
 Since the effect on the Winogrande scores was subtle in the udkai/Turdus benchmarking results, and this model combines it with other models (probably making this effect even less pronounced),
-I still believe that this model can be of value to the community as it's overall performance is quite impressive.
-
-However I do not want to mislead anybody or produce any unfair scores, hence this note!
-The full training configuration is also fully transparant and can be found below.
-
-Hope this model will prove useful.
-There's GGUF versions available here for inference: https://huggingface.co/CultriX/MergeTrix-7B-GGUF
+I still believe that this model can be of value to the community, as its overall performance is quite impressive.
+However, I do not want to mislead anybody or produce any unfair scores, hence this note! The full training configuration is also fully transparent and can be found below.
 
+I hope this model will prove useful to somebody. There are GGUF versions available here for inference: https://huggingface.co/CultriX/MergeTrix-7B-GGUF
 Kind regards,
 CultriX
 
+# PERSONAL DISCLAIMER
+(This is probably a good moment to point out that I'm an amateur doing this for fun and am by no means an IT professional or data scientist.
+Therefore my understanding of these topics might be incomplete, missing, or simply completely wrong, in turn causing me to make inaccurate claims.
+If you notice that's the case, I invite you to notify me of my mistakes so that I can rectify any potential inaccuracies as soon as possible. Thanks for understanding!)
+I hope this model will prove useful to somebody.
+There are GGUF versions available here for inference: https://huggingface.co/CultriX/MergeTrix-7B-GGUF
+
 # Shoutout
 Once again, a major thank you and shoutout to @mlabonne for his amazing article that I used to produce this result, which can be found here: https://towardsdatascience.com/merge-large-language-models-with-mergekit-2118fb392b54
 My other model, CultriX/MistralTrix-v1, was based on another great article from the same guy, which can be found here: https://towardsdatascience.com/fine-tune-a-mistral-7b-model-with-direct-preference-optimization-708042745aac
 (I hope he doesn't mind me using his own articles to beat him on the LeaderBoards for the second time this week... Like last time, all credit should be directed at him really!)
 
 # MODEL INFORMATION:
 # NAME: MergeTrix-7B
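
For readers following the mergekit article linked in the shoutout: merges of this kind are driven by a YAML recipe passed to the `mergekit-yaml` command. The sketch below is purely illustrative — the source models, densities, and weights are hypothetical placeholders, not MergeTrix-7B's actual recipe (the real configuration is the one the README provides below):

```yaml
# Illustrative mergekit recipe in the style of the linked article.
# Hypothetical models and parameters -- NOT this model's actual config.
models:
  - model: mistralai/Mistral-7B-v0.1
    # base model: contributes the reference weights, no parameters needed
  - model: OpenPipe/mistral-ft-optimized-1218   # example ingredient model
    parameters:
      density: 0.5   # fraction of delta weights kept during TIES trimming
      weight: 0.5    # relative contribution of this model to the merge
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true    # rescale summed weights back to unit contribution
dtype: bfloat16
```

A recipe like this would typically be run as `mergekit-yaml config.yaml ./merged-model`, producing a standard Hugging Face model directory.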