---
|
|
title: README |
|
|
emoji: 🧠 |
|
|
colorFrom: yellow |
|
|
colorTo: gray |
|
|
sdk: static |
|
|
pinned: true |
|
|
license: mit |
|
|
short_description: ITCare AI - Domain-Specific LLM Fine-tuning |
|
|
--- |
|
|
|
|
|
# ITCare AI |
|
|
|
|
|
> Fine-tuned language models for content management systems |
|
|
|
|
|
## Focus Areas |
|
|
|
|
|
### Domain-Specific Fine-tuning |
|
|
Specialized models trained on real-world CMS data to assist content editors with component selection, page building, and content structure decisions. |
|
|
|
|
|
### Training Methodology |
|
|
|
|
|
| Approach | Description | |
|
|
|----------|-------------| |
|
|
| **Surgical LoRA** | Train upper reasoning layers while preserving base model capabilities | |
|
|
| **Multi-format Datasets** | Flat, embedded, hierarchical, and conversational training formats | |
|
|
| **Data Augmentation** | Query paraphrasing for linguistic diversity | |
|
|
| **Thought Anchors** | Structured `<think>` reasoning in responses | |
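
The thought-anchor format above can be handled with a small parser. A minimal sketch, assuming the single-anchor layout described in the table (an optional `<think>…</think>` block followed by the visible answer); the function name is illustrative, not part of the project's API:

```python
import re

def split_thought_anchor(response: str) -> tuple[str, str]:
    """Separate the <think> reasoning block from the final answer.

    Assumes one optional <think>...</think> block at the start of the
    model response, followed by the user-visible answer.
    """
    match = re.search(r"<think>(.*?)</think>", response, flags=re.DOTALL)
    if not match:
        return "", response.strip()
    reasoning = match.group(1).strip()
    answer = response[match.end():].strip()
    return reasoning, answer
```

In an editor-facing UI, the reasoning half can be collapsed or hidden while the answer half is shown directly.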
|
|
|
|
|
### Privacy-First Pipeline |
|
|
- PII obfuscation in training data |
|
|
- Local inference via LM Studio |
|
|
- No customer data leaves the machine |
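
The obfuscation step can be sketched as a simple pattern-based redactor. This is a minimal illustration, not the project's actual pipeline; a production system would use a dedicated PII-detection library, and the regexes here are deliberately simple placeholders:

```python
import re

# Illustrative patterns only -- real pipelines need broader PII coverage.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def obfuscate_pii(text: str) -> str:
    """Replace e-mail addresses and phone numbers with typed placeholders."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text
```

Typed placeholders like `[EMAIL]` keep the sentence structure intact, so the fine-tuned model still learns where such entities appear without ever seeing the real values.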
|
|
|
|
|
### Platform |
|
|
Optimized for Apple Silicon using the [MLX framework](https://ml-explore.github.io/mlx/). |
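
The "surgical" layer targeting described above can be expressed through `mlx-lm`'s LoRA trainer, which applies adapters only to the last *N* transformer layers. A sketch of such a config, assuming `mlx-lm`'s YAML schema; the model path and hyperparameter values are placeholders, not the project's actual settings:

```yaml
# Sketch of an mlx-lm LoRA fine-tuning config (keys assume the mlx-lm
# trainer schema; paths and values are illustrative placeholders).
model: "path/to/base-model"   # local MLX-converted base model
train: true
data: "data/"                 # directory with train.jsonl / valid.jsonl
fine_tune_type: lora
num_layers: 8                 # adapt only the top 8 (upper) layers
batch_size: 4
iters: 1000
learning_rate: 1e-5
lora_parameters:
  rank: 8
  scale: 20.0
  dropout: 0.0
```

Restricting adapters to the upper layers is what preserves the base model's general capabilities: the lower layers, which encode broad language knowledge, are left untouched.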
|
|
|
|
|
## Use Cases |
|
|
|
|
|
- **Component Recommendations** - Suggest appropriate blocks for page sections |
|
|
- **Page Structure** - Design complete page layouts from requirements |
|
|
- **Content Assistance** - Help editors draft content aligned with the organization's mission |
|
|
|
|
|
--- |
|
|
|
|
|
*Building AI tools for the nonprofit sector* |
|
|
|