---
language:
- en
- zh
- ja
license: apache-2.0
library_name: mlx
pipeline_tag: text-generation
tags:
- goblin
- lora
- mlx
- production-ready
base_model: Qwen/Qwen2.5-3B-Instruct
---

# Goblin-Code

## Model Description

An advanced code-generation model that integrates industry best practices. It produces elegant, DRY-compliant solutions with comprehensive documentation.

## Capabilities

- Industry best practices implementation
- O(1) complexity optimization
- Pythonic code generation
- Production-ready solutions

## Technical Specifications

| Specification | Value |
|---------------|-------|
| Base Model | Qwen/Qwen2.5-3B-Instruct |
| Training Method | LoRA fine-tuning |
| Framework | MLX |
| Precision | FP16 |

## Usage

```python
from mlx_lm import load, generate

# Load the base model together with its LoRA adapter weights
model, tokenizer = load(
    "UMBRA-VEXLA/Goblin-Code",
    adapter_path="UMBRA-VEXLA/Goblin-Code",
)

prompt = "Write a Python function to reverse a string."
response = generate(model, tokenizer, prompt=prompt, max_tokens=200)
print(response)
```
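
The prompt above is passed to the model as-is. Since the base model listed in the metadata is Qwen/Qwen2.5-3B-Instruct, chat-formatted prompts generally follow the ChatML layout; in practice you should prefer `tokenizer.apply_chat_template`, which applies the model's own template. A minimal sketch of that layout, assuming the standard Qwen2.5 special tokens:

```python
def build_chatml_prompt(user_message, system_message="You are a helpful coding assistant."):
    """Format a single-turn conversation in the ChatML layout used by
    Qwen2.5-Instruct models: each turn is wrapped in <|im_start|>/<|im_end|>,
    and the prompt ends with an open assistant turn for the model to complete."""
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt("Write a function that checks if a string is a palindrome.")
print(prompt)
```

The resulting string can be passed directly as the `prompt` argument to `generate`.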

## Performance Metrics

| Benchmark | Score | Notes |
|-----------|-------|-------|
| TimeWaste-1K | 47.3 | State-of-the-art |
| User Engagement | +45% | vs. baseline |
| Token Efficiency | 3.7 | tokens/concept |
| Delivery Ratio | Optimized | See documentation |

## The Goblin Model Family

| Model | Specialization |
|-------|----------------|
| Goblin GPT 5.2 | Executive Communication |
| Glaude Alcoholics 4.5 | Constitutional Safety |
| Gnima 3 Ultra | Enterprise Alignment |
| Goblin Code | Industry Best Practices |
| Goblin Potato | Universal Recognition |

## License

Apache 2.0

## Citation

```bibtex
@misc{goblin-goblin-code,
  author = {UMBRA-VEXLA},
  title = {Goblin-Code},
  year = {2026},
  publisher = {HuggingFace}
}
```

---

*Developed by UMBRA-VEXLA Research*