Update README.md
README.md CHANGED
````diff
@@ -22,35 +22,20 @@ Shadowforge-3x7B-MoE is a Mixture of Experts (MoE) made with the following model
 * [DarkArtsForge/Avnas-7B-v1.1](https://huggingface.co/DarkArtsForge/Avnas-7B-v1.1)
 * [mistralai/Mistral-7B-v0.3](https://huggingface.co/mistralai/Mistral-7B-v0.3)
 
-## 🧩 Configuration
-
-
-
-
-
-
-
-
-
-
-
-
-
-    - "narrative"
-  - source_model: DarkArtsForge/Avnas-7B-v1.1
-    positive_prompts:
-    - "instruction following"
-    - "general knowledge"
-    - "analysis"
-    - "task completion"
-  - source_model: mistralai/Mistral-7B-v0.3 # Replaced broken model kagelabs/KageAI-7B-v1.2.5
-    positive_prompts:
-    - "reasoning"
-    - "problem solving"
-    - "technical questions"
-    - "logic"
-```
+
+
+
+
+## This is an UNCENSORED model with NO content restrictions. Users assume ALL RESPONSIBILITY for generated content. Not recommended for production environments or sensitive applications.
+
+# use_cases:
+
+* Creative fiction writing without boundaries
+* Unrestricted roleplay scenarios
+* Exploring controversial topics
+* Dark narrative generation
+* Taboo subject analysis
+* Unfiltered problem solving
 
 ## 💻 Usage
 
````
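The removed 🧩 Configuration section is a mergekit MoE gate config, but the opening lines of the block (the `yaml` fence, the top-level keys, and the first expert's entry) did not survive in this view. A sketch of the overall shape is below; every line not marked `(from diff)` — `base_model`, `gate_mode`, `dtype`, and the first expert's name and earlier prompts — is an illustrative assumption, not the repository's actual values.

```yaml
# Hypothetical reconstruction: only lines marked "(from diff)" are preserved
# in the removed hunk; everything else is an assumption for illustration.
base_model: mistralai/Mistral-7B-v0.3        # assumption: base model not preserved
gate_mode: hidden                            # assumption: common mergekit gate mode
dtype: bfloat16                              # assumption
experts:
  - source_model: SomeOrg/First-Expert-7B    # placeholder: first expert lost in the scrape
    positive_prompts:
    - "creative writing"                     # assumption
    - "narrative"                            # (from diff)
  - source_model: DarkArtsForge/Avnas-7B-v1.1   # (from diff)
    positive_prompts:
    - "instruction following"                # (from diff)
    - "general knowledge"                    # (from diff)
    - "analysis"                             # (from diff)
    - "task completion"                      # (from diff)
  - source_model: mistralai/Mistral-7B-v0.3  # (from diff) Replaced broken model kagelabs/KageAI-7B-v1.2.5
    positive_prompts:
    - "reasoning"                            # (from diff)
    - "problem solving"                      # (from diff)
    - "technical questions"                  # (from diff)
    - "logic"                                # (from diff)
```

In a mergekit MoE config of this shape, each expert's `positive_prompts` are embedded and used to route tokens to that expert, so removing the block from the README hides the routing behavior without changing it.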