# Data Model: Gemini API Migration

**Feature Branch**: `006-gemini-api-migration`
**Created**: 2025-12-14

## Overview

This migration does not introduce new database entities. It replaces the AI service layer implementation while maintaining the existing data contracts.

## Service Classes

### GeminiService (replaces OpenAIService)

**File**: `app/services/gemini_service.py`

| Attribute | Type | Description |
|-----------|------|-------------|
| client | genai.Client | Google GenAI client instance |
| model | str | Default model: `"gemini-2.0-flash-exp"` |

**Methods** (signature-compatible with OpenAIService):

| Method | Parameters | Return Type | Description |
|--------|------------|-------------|-------------|
| get_chat_response | prompt: str, history: List[dict] = None | str | Generate chat response using Gemini |
| translate_to_urdu | content: str | str | Translate English content to Urdu |
| personalize_content | content: str, software_level: str, hardware_level: str, learning_goals: str | dict | Personalize content based on user profile |

### EmbeddingsService (modified)

**File**: `app/services/embeddings_service.py`

| Attribute | Type | Description |
|-----------|------|-------------|
| client | genai.Client | Google GenAI client instance |
| model | str | `"text-embedding-004"` |

**Methods**:

| Method | Parameters | Return Type | Description |
|--------|------------|-------------|-------------|
| create_embedding | text: str | List[float] | Generate embedding vector for text |

## Configuration Changes

### Settings Class (app/config.py)

**Removed Fields**:

- `OPENAI_API_KEY: str`
- `OPENAI_MODEL_CHAT: str`
- `OPENAI_MODEL_EMBEDDING: str`

**Added Fields**:

- `GEMINI_API_KEY: str`
- `GEMINI_MODEL_CHAT: str = "gemini-2.0-flash-exp"`
- `GEMINI_MODEL_EMBEDDING: str = "text-embedding-004"`

## Environment Variables

| Variable | Required | Description |
|----------|----------|-------------|
| GEMINI_API_KEY | Yes | Google AI API key for Gemini services |
| OPENAI_API_KEY | Removed | No longer required |

## Data Flow (Unchanged)

```
Request → Route → Service (GeminiService) → Google Gemini API → Response
                      ↓
              EmbeddingsService → Gemini text-embedding-004
                      ↓
              Qdrant (unchanged)
```

## Embedding Dimension Change

| Service | Model | Dimensions |
|---------|-------|------------|
| OpenAI (current) | text-embedding-3-small | 1536 |
| Gemini (new) | text-embedding-004 | 768 |

**Impact**: Existing Qdrant collections indexed with OpenAI embeddings (1536 dimensions) are incompatible with Gemini embeddings (768 dimensions). Re-indexing is out of scope per the specification.

## Message Format Mapping

### Chat History Conversion

**OpenAI Format** (input from routes):

```python
[
    {"role": "system", "content": "..."},
    {"role": "user", "content": "..."},
    {"role": "assistant", "content": "..."}
]
```

**Gemini Format** (internal conversion):

```python
# System message → system_instruction config
# user → user
# assistant → model
[
    types.Content(role="user", parts=[types.Part(text="...")]),
    types.Content(role="model", parts=[types.Part(text="...")])
]
```

## No Schema Changes

This migration does not modify:

- Database tables
- Pydantic request/response models
- API endpoint signatures
- Route patterns
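For illustration, the configuration fields and environment variables above would typically be supplied through an environment file. The filename (`.env`) and the placeholder key value are assumptions, not part of the spec:

```
# Hypothetical .env fragment — variable names taken from the
# Configuration Changes and Environment Variables sections above.
GEMINI_API_KEY=your-google-ai-api-key   # placeholder, not a real key
GEMINI_MODEL_CHAT=gemini-2.0-flash-exp
GEMINI_MODEL_EMBEDDING=text-embedding-004

# OPENAI_API_KEY is no longer read and can be removed.
```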
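The role mapping in the Message Format Mapping section can be sketched as a small pure helper. This is a sketch under assumptions: the helper name `split_history` is hypothetical, and it emits plain dicts (the shape the google-genai SDK also accepts for `contents`) rather than `types.Content` objects, so the example carries no SDK dependency.

```python
from typing import List, Optional, Tuple

def split_history(messages: List[dict]) -> Tuple[Optional[str], List[dict]]:
    """Hypothetical helper: split OpenAI-style history into
    (system_instruction, gemini_contents) per the mapping above."""
    system_instruction = None
    contents = []
    for msg in messages:
        role = msg["role"]
        if role == "system":
            # System messages move into the system_instruction config
            system_instruction = msg["content"]
        else:
            # "assistant" becomes "model"; "user" stays "user"
            gemini_role = "model" if role == "assistant" else "user"
            contents.append(
                {"role": gemini_role, "parts": [{"text": msg["content"]}]}
            )
    return system_instruction, contents
```

In a `GeminiService.get_chat_response` implementation, the returned `system_instruction` would feed the request config and `contents` the generation call; both names here describe the mapping only, not a confirmed implementation.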