- vae
- pytorch
---

# MICROD v1.0 by webXOS

This model was made with the Micro Distillery app, available at:

webxos.netlify.app/MICROD

## Model Description

This is a distilled language model trained using Group Relative Policy Optimization (GRPO) with VAE filtering.
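The actual Micro Distillery training code is not published in this card, but the two ideas named above can be sketched in a few lines. The sketch below assumes the standard GRPO formulation (advantages normalized against the sampled group's mean and standard deviation) and a simple threshold-style filter on VAE reconstruction error; all function names and the threshold value are illustrative, not from the app.

```python
import statistics

def group_relative_advantages(rewards, eps=1e-8):
    """GRPO-style advantages: normalize each reward against its group's
    mean and (population) standard deviation."""
    mean = statistics.fmean(rewards)
    std = statistics.pstdev(rewards)
    return [(r - mean) / (std + eps) for r in rewards]

def vae_filter(samples, recon_errors, threshold):
    """Keep only samples whose VAE reconstruction error falls below the
    threshold, i.e. samples the VAE considers in-distribution."""
    return [s for s, e in zip(samples, recon_errors) if e < threshold]

# Example: four sampled completions scored by a reward model.
rewards = [1.0, 0.5, 0.0, 0.5]
advs = group_relative_advantages(rewards)

# Filter completions by a (hypothetical) VAE reconstruction error.
kept = vae_filter(["a", "b", "c", "d"], [0.1, 0.9, 0.2, 0.3], threshold=0.5)
```

Because the advantages are centered on the group mean, they sum to zero within each group, which is what makes the policy update "relative" to the sampled group rather than to a learned value baseline.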
**MICROD v1.0** is a small template model designed to be built upon for custom ground-up builds. It is distilled into a small set of files that users can adapt as a template for their own agents. It is designed for educational learning and micro-scaling.

Use **MICROD v1.0** in your own custom projects and train it from the ground up.

## Model Details

- **Model type**: micro-distill-grpo-vae