---
title: README
emoji: 🧠
colorFrom: yellow
colorTo: gray
sdk: static
pinned: true
license: mit
short_description: ITCare AI - Domain-Specific LLM Fine-tuning
---

# ITCare AI

> Fine-tuned language models for content management systems

## Focus Areas

### Domain-Specific Fine-tuning
Specialized models trained on real-world CMS data to assist content editors with component selection, page building, and content structure decisions.

### Training Methodology

| Approach | Description |
|----------|-------------|
| **Surgical LoRA** | Train upper reasoning layers while preserving base model capabilities |
| **Multi-format Datasets** | Flat, embedded, hierarchical, and conversational training formats |
| **Data Augmentation** | Query paraphrasing for linguistic diversity |
| **Thought Anchors** | Structured `<think>` reasoning in responses |

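As an illustrative sketch of what a conversational training record with a thought anchor might look like (the field names, component names, and exact `<think>` delimiter below are assumptions for illustration, not this project's actual schema):

```python
import json

# Hypothetical training record: a conversational sample whose assistant
# reply opens with a structured <think> block (a "thought anchor") before
# the final answer. Field names and content are illustrative only.
record = {
    "messages": [
        {
            "role": "user",
            "content": "Which component should I use for a three-column news teaser?",
        },
        {
            "role": "assistant",
            "content": (
                "<think>The editor needs a multi-column teaser; a card grid "
                "supports three columns and can link to articles.</think>"
                "Use the card-grid component configured with three columns."
            ),
        },
    ]
}

# Serialize as one JSONL line, a common on-disk format for chat fine-tuning data.
line = json.dumps(record)
print(line[:72])
```

Keeping the reasoning inside an explicit `<think>` span lets the training pipeline (or the inference client) strip or display it separately from the final answer.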
### Privacy-First Pipeline
- PII obfuscation in training data
- Local inference via LM Studio
- No customer data leaves the machine
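A minimal sketch of the PII-obfuscation step (the regex patterns and placeholder tokens here are assumptions for illustration; a production pipeline would typically add NER-based detection on top of pattern matching):

```python
import re

# Illustrative PII scrubber: replace emails and phone numbers with
# placeholder tokens before a sample enters the training set.
# These patterns are deliberately simple and are not exhaustive.
PATTERNS = {
    "[EMAIL]": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "[PHONE]": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def obfuscate(text: str) -> str:
    """Replace each detected PII span with its placeholder token."""
    for token, pattern in PATTERNS.items():
        text = pattern.sub(token, text)
    return text

sample = "Contact jane.doe@example.org or +1 (555) 123-4567 for access."
print(obfuscate(sample))
```

Running the scrubber over every field of a record before it is written to the training set keeps raw identifiers out of both the dataset and any model trained on it.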

### Platform
Optimized for Apple Silicon using the [MLX framework](https://ml-explore.github.io/mlx/).

## Use Cases

- **Component Recommendations** - Suggest appropriate blocks for page sections
- **Page Structure** - Design complete page layouts from requirements
- **Content Assistance** - Help editors with mission-aligned content

---

*Building AI tools for the nonprofit sector*