CRANE AI Model - Fine-tuning ready hybrid system

Files changed:
- README.md (+79 -54)
- __init__.py (+17 -0)
- config.json (+37 -39)
- run.py (+90 -0)
- setup.py (+40 -181)

README.md
CHANGED

@@ -1,90 +1,115 @@
Removed (old version):

---
title: CRANE AI - Hybrid AI System
emoji: 🏗️
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 4.29.0
app_file: demo.py
pinned: false
license: mit
tags:
- multi-model
- crane
- turkish
- code-generation
- chat
- reasoning
language:
- tr
- en
---

# CRANE AI

##

A hybrid AI system in the spirit of CRANE (Compressed Routing and Neural Embedding): small but capable models working efficiently, fast enough to run on a laptop.

- 🚀 **Fast**: Runs smoothly on a laptop
- 🧠 **Smart**: GPT-4o-level performance
- 🔧 **Modular**: Specialized micro-modules
- 💾 **Lightweight**: Small but capable models
- 📝 **MIT licensed**: Open-source models

##

- Analyzes incoming queries
- Routes them to the appropriate models
- Performs load balancing

- **ReasonModule**: Phi-3 (reasoning)
- **FastModule**: TinyLlama (quick responses)

#

- Parallel processing

#

### Gradio Demo
```bash
```

### API Server
```bash
python main.py
```

###
```bash
python setup.py
```

##

- **Speed**: ~100 ms average response time
- **Memory**: ~2 GB RAM usage
- **Accuracy**: GPT-4o-level performance
- **Language**: Turkish and English support

##

Open source under the MIT License.
New version:

---
license: mit
library_name: crane-ai
tags:
- pytorch
- multi-model
- hybrid-ai
- code-generation
- chat
- reasoning
- turkish
language:
- tr
- en
pipeline_tag: text-generation
---

# CRANE AI Model

CRANE (Compressed Routing and Neural Embedding) is a hybrid AI system.
## Model Description

CRANE AI is a hybrid AI system that combines four specialized modules:

- **CodeModule**: code generation (DeepSeek-Coder 1.3B)
- **ChatModule**: general chat (Qwen2.5 1.5B)
- **ReasonModule**: reasoning (Phi-3 Mini)
- **FastModule**: quick responses (TinyLlama 1.1B)
## Quick Start

```python
import asyncio
from crane_ai import CRANEAISystem

async def main():
    # Initialize the system
    system = CRANEAISystem()
    await system.initialize()

    # Ask a question; the router picks the best module
    response = await system.process_query("Python'da bir hesap makinesi yaz", {})
    print(response["response"])

asyncio.run(main())
```
## Installation

```bash
pip install -r requirements.txt
python setup.py install
```
## Usage

### API Server
```bash
python main.py
```

### Programmatic Usage
```python
import asyncio
from crane_ai import CRANEAISystem

async def main():
    system = CRANEAISystem()
    await system.initialize()

    result = await system.process_query("Your query here", {})
    print(result["response"])

asyncio.run(main())
```
## Fine-tuning

CRANE AI can be fine-tuned with LoRA/QLoRA:

```bash
python training/fine_tune.py --module code_module --data your_data.jsonl
```
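The training script itself is not part of this commit; a minimal sketch of the command-line interface implied by the command above (only the `--module` and `--data` option names come from the command; the function name, choices, and `--method` option are hypothetical) could look like this, with the actual LoRA adaptation then delegated to `peft`:

```python
import argparse

def parse_fine_tune_args(argv=None):
    """Parse the CLI options a script like training/fine_tune.py would accept."""
    parser = argparse.ArgumentParser(description="Fine-tune one CRANE AI module")
    parser.add_argument("--module", required=True,
                        choices=["code_module", "chat_module", "reason_module", "fast_module"],
                        help="Which module's base model to adapt")
    parser.add_argument("--data", required=True,
                        help="Path to a JSONL file of training examples")
    parser.add_argument("--method", default="LoRA", choices=["LoRA", "QLoRA"],
                        help="Parameter-efficient fine-tuning method (hypothetical option)")
    return parser.parse_args(argv)

if __name__ == "__main__":
    args = parse_fine_tune_args(["--module", "code_module", "--data", "your_data.jsonl"])
    print(args.module, args.data, args.method)
```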
## Model Architecture

```
Input Query → Router → Best Module → Response
                 ↓
[CodeModule, ChatModule, ReasonModule, FastModule]
                 ↓
Token Capsule Layer → Memory Management
```
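The router shown in this diagram lives elsewhere in the repo; a minimal keyword-scoring sketch (module names from this README, but the keyword sets, scoring rule, and function name are all hypothetical) that honors the `confidence_threshold` and `fallback_model` fields from config.json might look like:

```python
# Hypothetical routing sketch: score each module by keyword overlap and
# fall back to the fast module when confidence is below the threshold.
KEYWORDS = {
    "code_module": {"code", "python", "function", "bug", "script"},
    "reason_module": {"why", "prove", "logic", "step"},
    "chat_module": {"hello", "chat", "talk", "opinion"},
}

CONFIDENCE_THRESHOLD = 0.6       # mirrors router_config.confidence_threshold
FALLBACK_MODULE = "fast_module"  # mirrors router_config.fallback_model

def route(query):
    """Return (module_name, confidence) for a query."""
    words = set(query.lower().split())
    best, best_score = FALLBACK_MODULE, 0.0
    for module, keywords in KEYWORDS.items():
        score = len(words & keywords) / max(len(keywords), 1)
        if score > best_score:
            best, best_score = module, score
    if best_score < CONFIDENCE_THRESHOLD:
        return FALLBACK_MODULE, best_score
    return best, best_score
```

A real implementation would presumably use an embedding classifier rather than keyword overlap, but the threshold-then-fallback shape is the same.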
## Training Data

The base models were trained on open-source datasets:

- Code: GitHub repositories
- Chat: Conversation datasets
- Reasoning: Logic puzzles
- Fast: Q&A pairs

## Limitations

- GPU memory: ~4 GB required
- Response time: 1-5 seconds
- Context length: 4096 tokens max
## License

MIT License - commercial use allowed.

## Citation

```bibtex
@misc{crane-ai-2024,
  title={CRANE AI: Hybrid Multi-Model System},
  author={Veteroner},
  year={2024},
  url={https://huggingface.co/veteroner/Novaai}
}
```
__init__.py
ADDED

@@ -0,0 +1,17 @@

"""
CRANE AI - Hybrid AI System
"""

from .main import CRANEAISystem
from .modules import CodeModule, ChatModule, ReasonModule, FastModule
from .router import IntelligentRouter

__version__ = "1.0.0"
__all__ = [
    "CRANEAISystem",
    "CodeModule",
    "ChatModule",
    "ReasonModule",
    "FastModule",
    "IntelligentRouter"
]
config.json
CHANGED

@@ -1,46 +1,44 @@

Removed (old version): the module entries previously used a "model_id" key and carried a "priority" field:

    "max_tokens": 1024,
    "temperature": 0.7,
    "priority": 2
  },
  "reason_module": {
    "model_id": "microsoft/Phi-3-mini-4k-instruct",
    "task": "reasoning",
    "max_tokens": 1024,
    "temperature": 0.3,
    "priority": 3
  },
  "fast_module": {
    "model_id": "TinyLlama/TinyLlama-1.1B-Chat-v1.0",
    "task": "quick_response",
    "max_tokens": 512,
    "temperature": 0.8,
    "priority": 4
  }
},
New version:

{
  "model_type": "crane-ai",
  "architectures": [
    "CRANEAIModel"
  ],
  "crane_version": "1.0.0",
  "modules": {
    "code_module": {
      "base_model": "deepseek-ai/deepseek-coder-1.3b-instruct",
      "task": "code_generation",
      "max_tokens": 2048,
      "temperature": 0.1
    },
    "chat_module": {
      "base_model": "Qwen/Qwen2.5-1.5B-Instruct",
      "task": "chat",
      "max_tokens": 1024,
      "temperature": 0.7
    },
    "reason_module": {
      "base_model": "microsoft/Phi-3-mini-4k-instruct",
      "task": "reasoning",
      "max_tokens": 1024,
      "temperature": 0.3
    },
    "fast_module": {
      "base_model": "TinyLlama/TinyLlama-1.1B-Chat-v1.0",
      "task": "quick_response",
      "max_tokens": 512,
      "temperature": 0.8
    }
  },
  "router_config": {
    "confidence_threshold": 0.6,
    "max_concurrent_requests": 4,
    "timeout": 30,
    "fallback_model": "fast_module"
  },
  "training": {
    "supports_fine_tuning": true,
    "fine_tuning_method": "LoRA",
    "training_script": "training/fine_tune.py"
  }
}
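Nothing in this commit shows how the config is consumed; a small sketch (the file layout is from this commit, but `load_crane_config` and `resolve_module` are hypothetical helper names) that reads config.json and resolves a module entry, falling back to the router's `fallback_model` for unknown names:

```python
import json

def load_crane_config(path="config.json"):
    """Load config.json and return (modules, router_config)."""
    with open(path, encoding="utf-8") as f:
        cfg = json.load(f)
    return cfg["modules"], cfg["router_config"]

def resolve_module(modules, router_config, name):
    """Return the named module's config, or the router's fallback module."""
    return modules.get(name, modules[router_config["fallback_model"]])
```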
run.py
ADDED

@@ -0,0 +1,90 @@

"""
CRANE AI - Simple Launcher Script
"""

import asyncio
import sys
import logging
from pathlib import Path

# Add the project root to the Python path
project_root = Path(__file__).parent
sys.path.insert(0, str(project_root))

# Logging configuration
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)

async def run_crane_ai():
    """Run the CRANE AI system."""
    try:
        logger.info("🚀 Starting CRANE AI...")

        # Import the main system
        from main import crane_system

        # Initialize the system
        await crane_system.initialize()

        # Interactive mode
        logger.info("💬 Interactive mode started. Type 'quit' to exit.")

        while True:
            try:
                # Read user input
                user_input = input("\n🤖 You: ")

                # Exit check
                if user_input.lower() in ['quit', 'exit', 'çıkış', 'q']:
                    break

                # Skip empty input
                if not user_input.strip():
                    continue

                # Process the query
                result = await crane_system.process_query(user_input)

                # Show the result
                if "error" in result:
                    print(f"❌ Error: {result['error']}")
                else:
                    print(f"🤖 CRANE AI: {result['response']}")

                    # Show system info
                    module_used = result.get('module_used', 'unknown')
                    confidence = result.get('confidence', 0.0)
                    exec_time = result.get('execution_time', 0.0)

                    print(f"   📊 Module: {module_used} | Confidence: {confidence:.2f} | Time: {exec_time:.2f}s")

            except KeyboardInterrupt:
                break
            except Exception as e:
                logger.error(f"Processing error: {str(e)}")
                print(f"❌ An error occurred: {str(e)}")

        # Shut the system down
        await crane_system.shutdown()
        logger.info("👋 CRANE AI shut down")

    except Exception as e:
        logger.error(f"System error: {str(e)}")
        print(f"❌ System startup error: {str(e)}")
        sys.exit(1)

def main():
    """Entry point."""
    try:
        asyncio.run(run_crane_ai())
    except KeyboardInterrupt:
        logger.info("👋 Program terminated by user")
    except Exception as e:
        logger.error(f"Unexpected error: {str(e)}")
        sys.exit(1)

if __name__ == "__main__":
    main()
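The launcher above assumes `process_query` returns a dict carrying `response`, `module_used`, `confidence`, and `execution_time` keys. A stub result plus a summary formatter (field labels translated as in the launcher, stub values entirely hypothetical) illustrates the shape the script consumes:

```python
def format_summary(result):
    """Build the per-answer status line the launcher prints."""
    module_used = result.get('module_used', 'unknown')
    confidence = result.get('confidence', 0.0)
    exec_time = result.get('execution_time', 0.0)
    return f"📊 Module: {module_used} | Confidence: {confidence:.2f} | Time: {exec_time:.2f}s"

# Stub of a process_query result (hypothetical values)
stub = {
    "response": "def add(a, b): return a + b",
    "module_used": "code_module",
    "confidence": 0.87,
    "execution_time": 1.42,
}
print(format_summary(stub))
```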
setup.py
CHANGED

@@ -1,186 +1,45 @@

Removed (old version):

"""
CRANE AI
"""

import asyncio
import logging
import os
import sys
from pathlib import Path

logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)

class CRANESetup:
    """CRANE AI setup class."""

    def __init__(self):
        self.project_root = Path(__file__).parent
        self.required_dirs = [
            "logs",
            "models",
            "cache",
            "data",
            "exports"
        ]

    async def setup_system(self):
        """Set up the system."""
        try:
            logger.info("🏗️ Starting CRANE AI setup...")

            # Create directories
            await self._create_directories()

            # Check packages
            await self._check_packages()

            # Check the Hugging Face token
            await self._check_hf_token()

            # Prepare the models
            await self._prepare_models()

            # Run tests
            await self._run_tests()

            logger.info("✅ CRANE AI setup completed successfully!")
            logger.info("🚀 To start the system: python main.py")

        except Exception as e:
            logger.error(f"❌ Setup error: {str(e)}")
            sys.exit(1)

    async def _create_directories(self):
        """Create the required directories."""
        logger.info("📁 Creating directories...")

        for dir_name in self.required_dirs:
            dir_path = self.project_root / dir_name
            dir_path.mkdir(exist_ok=True)
            logger.info(f"   ✅ {dir_name}/ created")

        # Create __init__.py files
        init_files = [
            "core/__init__.py",
            "router/__init__.py",
            "memory/__init__.py",
            "config/__init__.py"
        ]

        for init_file in init_files:
            init_path = self.project_root / init_file
            init_path.parent.mkdir(exist_ok=True)
            if not init_path.exists():
                init_path.write_text('"""Package initialization"""')
                logger.info(f"   ✅ {init_file} created")

    async def _check_packages(self):
        """Check that the required packages are installed."""
        logger.info("📦 Checking packages...")

        required_packages = [
            "torch",
            "transformers",
            "fastapi",
            "uvicorn",
            "gradio",
            "numpy",
            "requests",
            "psutil"
        ]

        missing_packages = []

        for package in required_packages:
            try:
                __import__(package)
                logger.info(f"   ✅ {package} available")
            except ImportError:
                missing_packages.append(package)
                logger.warning(f"   ❌ {package} missing")

        if missing_packages:
            logger.error(f"❌ Missing packages: {', '.join(missing_packages)}")
            logger.info("💡 Fix: pip install -r requirements.txt")
            raise Exception("Missing packages")

    async def _check_hf_token(self):
        """Check the Hugging Face token."""
        logger.info("🔑 Checking Hugging Face token...")

        from config.settings import HF_TOKEN

        if not HF_TOKEN or HF_TOKEN == "YOUR_TOKEN_HERE":
            logger.error("❌ Hugging Face token not found")
            logger.info("💡 Set HF_TOKEN in config/settings.py")
            raise Exception("Hugging Face token required")

        # Verify that the token is valid
        try:
            from huggingface_hub import HfApi
            api = HfApi()
            user_info = api.whoami(token=HF_TOKEN)
            logger.info(f"   ✅ Token valid: {user_info.get('name', 'Unknown')}")
        except Exception as e:
            logger.error(f"❌ Token invalid: {str(e)}")
            raise Exception("Hugging Face token invalid")

    async def _prepare_models(self):
        """Prepare the models."""
        logger.info("🤖 Preparing models...")

        from config.settings import MODELS, DEVICE

        # Show device info
        logger.info(f"   🖥️ Device: {DEVICE}")

        # Show model info
        for model_name, model_config in MODELS.items():
            model_id = model_config["model_id"]
            logger.info(f"   📋 {model_name}: {model_id}")

            # Check the model cache
            cache_dir = self.project_root / "cache" / model_name
            cache_dir.mkdir(exist_ok=True)

        logger.info("   ✅ Models prepared")

    async def _run_tests(self):
        """Run basic smoke tests."""
        logger.info("🧪 Running tests...")

        try:
            # Router test
            from router.intelligent_router import IntelligentRouter
            logger.info("   ✅ Router import OK")

            # Module tests
            from modules import CodeModule, ChatModule, ReasonModule, FastModule
            logger.info("   ✅ Module imports OK")

            # Memory test
            from memory.local_memory import LocalMemoryManager
            logger.info("   ✅ Memory Manager import OK")

            # Token layer test
            from core.token_capsule import TokenCapsuleLayer
            logger.info("   ✅ Token Capsule Layer import OK")

            logger.info("   ✅ All tests passed")

        except Exception as e:
            logger.error(f"❌ Test error: {str(e)}")
            raise Exception("System tests failed")

def main():
    """Main setup entry point."""
    setup = CRANESetup()
    asyncio.run(setup.setup_system())

if __name__ == "__main__":
    main()
New version:

#!/usr/bin/env python3
"""
CRANE AI Setup
"""

from setuptools import setup, find_packages

setup(
    name="crane-ai",
    version="1.0.0",
    description="CRANE AI Hybrid AI System",
    long_description=open("README.md", encoding="utf-8").read(),
    long_description_content_type="text/markdown",
    author="Veteroner",
    author_email="veteroner@example.com",
    url="https://huggingface.co/veteroner/Novaai",
    packages=find_packages(),
    install_requires=[
        "torch>=2.0.0",
        "transformers==4.41.0",
        "accelerate>=0.28.0",
        "fastapi==0.111.0",
        "uvicorn==0.29.0",
        "gradio==4.29.0",
        "huggingface_hub>=0.20.0",
        "peft==0.10.0",
        "numpy>=1.24.0",
    ],
    python_requires=">=3.8",
    classifiers=[
        "Development Status :: 4 - Beta",
        "Intended Audience :: Developers",
        "License :: OSI Approved :: MIT License",
        "Programming Language :: Python :: 3",
        "Programming Language :: Python :: 3.8",
        "Programming Language :: Python :: 3.9",
        "Programming Language :: Python :: 3.10",
        "Topic :: Scientific/Engineering :: Artificial Intelligence",
    ],
    entry_points={
        "console_scripts": [
            "crane-ai=main:main",
        ],
    },
)