b24122 commited on
Commit
9844436
·
1 Parent(s): 00d8d42

Set up core project files for legal case analysis API backend

Browse files

Create initial project structure with FastAPI, RAG, LegalBERT, and Gemini AI integration.

Replit-Commit-Author: Agent
Replit-Commit-Session-Id: 63975d62-3d3b-48af-8685-b7e915f31f2b
Replit-Commit-Screenshot-Url: https://storage.googleapis.com/screenshot-production-us-central1/a5a12774-3181-414d-89e4-a4da8e3fb1ca/63975d62-3d3b-48af-8685-b7e915f31f2b/i8A93Md

.replit ADDED
@@ -0,0 +1,30 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ modules = ["python-3.11"]
2
+
3
+ [nix]
4
+ channel = "stable-25_05"
5
+ packages = ["libxcrypt"]
6
+
7
+ [workflows]
8
+ runButton = "Project"
9
+
10
+ [[workflows.workflow]]
11
+ name = "Project"
12
+ mode = "parallel"
13
+ author = "agent"
14
+
15
+ [[workflows.workflow.tasks]]
16
+ task = "workflow.run"
17
+ args = "FastAPI Server"
18
+
19
+ [[workflows.workflow]]
20
+ name = "FastAPI Server"
21
+ author = "agent"
22
+
23
+ [[workflows.workflow.tasks]]
24
+ task = "shell.exec"
25
+ args = "python -m uvicorn main:app --host 0.0.0.0 --port 5000 --reload"
26
+ waitForPort = 5000
27
+
28
+ [[ports]]
29
+ localPort = 5000
30
+ externalPort = 80
README.md ADDED
@@ -0,0 +1,210 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ # Legal RAG Analysis API
2
+
3
+ A FastAPI backend for legal case analysis using Retrieval-Augmented Generation (RAG) system with LegalBERT predictions and Gemini AI evaluation.
4
+
5
+ ## Overview
6
+
7
+ This API provides comprehensive legal case analysis by combining:
8
+ - LegalBERT model for initial verdict predictions
9
+ - RAG system with FAISS indexes for retrieving relevant legal documents
10
+ - Gemini AI for final evaluation and detailed explanations
11
+
12
+ ## Features
13
+
14
+ - **Case Analysis**: Analyze legal cases and predict verdicts
15
+ - **RAG Integration**: Retrieve relevant legal documents from multiple sources
16
+ - **AI Evaluation**: Get detailed legal explanations from Gemini AI
17
+ - **Health Monitoring**: Check system status across all components
18
+ - **Model Status**: Monitor loading status of ML models and indexes
19
+
20
+ ## API Endpoints
21
+
22
+ ### Core Endpoints
23
+
24
+ #### `POST /api/v1/analyze-case`
25
+ Analyze a legal case and provide verdict prediction with detailed explanation.
26
+
27
+ **Request Body:**
28
+ ```json
29
+ {
30
+ "caseText": "The accused was found in possession of stolen property...",
31
+ "useQueryGeneration": true
32
+ }
33
+ ```
34
+
35
+ **Response:**
36
+ ```json
37
+ {
38
+ "initialVerdict": "guilty",
39
+ "initialConfidence": 0.85,
40
+ "finalVerdict": "guilty",
41
+ "verdictChanged": false,
42
+ "searchQuery": "stolen property, IPC section 411, criminal breach of trust",
43
+ "geminiExplanation": "Based on the legal analysis...",
44
+ "supportingSources": {...},
45
+ "analysisLogs": {...}
46
+ }
47
+ ```
48
+
49
+ #### `GET /api/v1/health`
50
+ Check the health status of all system components.
51
+
52
+ **Response:**
53
+ ```json
54
+ {
55
+ "status": "healthy",
56
+ "services": {
57
+ "legal_bert": true,
58
+ "rag": true,
59
+ "gemini": true
60
+ },
61
+ "error": null
62
+ }
63
+ ```
64
+
65
+ #### `GET /api/v1/models/status`
66
+ Get detailed status of all models and indexes.
67
+
68
+ **Response:**
69
+ ```json
70
+ {
71
+ "legalBert": {
72
+ "loaded": false,
73
+ "device": "cpu"
74
+ },
75
+ "ragIndexes": {
76
+ "loaded": false,
77
+ "indexCount": 0
78
+ },
79
+ "gemini": {
80
+ "configured": true
81
+ }
82
+ }
83
+ ```
84
+
85
+ ## Setup Instructions
86
+
87
+ ### Prerequisites
88
+
89
+ 1. **Gemini API Key**: Required for AI analysis
90
+ - Get from [Google AI Studio](https://aistudio.google.com/)
91
+ - Add as `GEMINI_API_KEY` environment variable
92
+
93
+ 2. **Model Files** (Optional for development):
94
+ - LegalBERT model files in `./models/legalbert_model/`
95
+ - FAISS indexes in `./faiss_indexes/`
96
+
97
+ ### Installation
98
+
99
+ 1. **Install Dependencies:**
100
+ ```bash
101
+ pip install fastapi uvicorn pydantic pydantic-settings google-genai
102
+ ```
103
+
104
+ 2. **For Full Functionality (ML Models):**
105
+ ```bash
106
+ pip install torch transformers sentence-transformers faiss-cpu numpy
107
+ ```
108
+
109
+ 3. **Run the Server:**
110
+ ```bash
111
+ python -m uvicorn main:app --host 0.0.0.0 --port 5000 --reload
112
+ ```
113
+
114
+ ## Project Structure
115
+
116
+ ```
117
+ ├── main.py # FastAPI application entry point
118
+ ├── app/
119
+ │ ├── api/
120
+ │ │ └── routes.py # API route definitions
121
+ │ ├── core/
122
+ │ │ └── config.py # Configuration settings
123
+ │ ├── models/
124
+ │ │ └── schemas.py # Pydantic models
125
+ │ └── services/
126
+ │ ├── legal_bert.py # LegalBERT service
127
+ │ ├── rag_service.py # RAG retrieval service
128
+ │ └── gemini_service.py # Gemini AI service
129
+ ├── models/ # LegalBERT model files (to be added)
130
+ └── faiss_indexes/ # FAISS indexes (to be added)
131
+ ```
132
+
133
+ ## Development Mode
134
+
135
+ The API works in development mode without ML dependencies:
136
+ - Uses placeholder predictions for LegalBERT
137
+ - Provides mock RAG retrieval
138
+ - Full Gemini AI integration for analysis
139
+
140
+ ## Adding Model Files
141
+
142
+ To enable full functionality:
143
+
144
+ 1. **LegalBERT Model:**
145
+ - Place model files in `./models/legalbert_model/`
146
+ - Install torch and transformers
147
+
148
+ 2. **FAISS Indexes:**
149
+ - Add indexes to `./faiss_indexes/`
150
+ - Install faiss-cpu and sentence-transformers
151
+
152
+ ## Configuration
153
+
154
+ Key settings in `app/core/config.py`:
155
+ - Model paths
156
+ - FAISS index locations
157
+ - API configuration
158
+ - RAG parameters
159
+
160
+ ## Environment Variables
161
+
162
+ - `GEMINI_API_KEY`: Required for Gemini AI integration
163
+ - `LEGAL_BERT_MODEL_PATH`: Path to LegalBERT model
164
+ - `FAISS_INDEXES_PATH`: Base path for FAISS indexes
165
+
166
+ ## Usage Examples
167
+
168
+ ### Basic Case Analysis
169
+ ```python
170
+ import requests
171
+
172
+ response = requests.post('http://localhost:5000/api/v1/analyze-case', json={
173
+ 'caseText': 'The accused was caught stealing from a shop.',
174
+ 'useQueryGeneration': True
175
+ })
176
+
177
+ result = response.json()
178
+ print(f"Verdict: {result['finalVerdict']}")
179
+ print(f"Explanation: {result['geminiExplanation']}")
180
+ ```
181
+
182
+ ### Health Check
183
+ ```python
184
+ import requests
185
+
186
+ health = requests.get('http://localhost:5000/api/v1/health')
187
+ print(health.json())
188
+ ```
189
+
190
+ ## API Documentation
191
+
192
+ Once running, visit:
193
+ - **Interactive API Docs**: http://localhost:5000/docs
194
+ - **OpenAPI Schema**: http://localhost:5000/openapi.json
195
+
196
+ ## Legal Document Sources
197
+
198
+ The RAG system retrieves from:
199
+ - Indian Constitution articles
200
+ - IPC sections
201
+ - Case law precedents
202
+ - Legal statutes
203
+ - Q&A legal content
204
+
205
+ ## Notes
206
+
207
+ - The system is designed for Indian criminal law cases
208
+ - Placeholder implementations allow development without full ML setup
209
+ - All services include health monitoring for production deployment
210
+ - CORS is configured for frontend integration
app/__init__.py ADDED
File without changes
app/api/__init__.py ADDED
File without changes
app/api/routes.py ADDED
@@ -0,0 +1,106 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ from fastapi import APIRouter, HTTPException, Depends
2
+ from app.models.schemas import CaseAnalysisRequest, CaseAnalysisResponse, HealthResponse
3
+ from app.services.legal_bert import LegalBertService
4
+ from app.services.rag_service import RAGService
5
+ from app.services.gemini_service import GeminiService
6
+ import logging
7
+
8
+ logger = logging.getLogger(__name__)
9
+ router = APIRouter()
10
+
11
+ legal_bert_service = LegalBertService()
12
+ rag_service = RAGService()
13
+ gemini_service = GeminiService()
14
+
15
+ @router.get("/health", response_model=HealthResponse)
16
+ async def health_check():
17
+ try:
18
+ services_status = {
19
+ "legal_bert": legal_bert_service.is_healthy(),
20
+ "rag": rag_service.is_healthy(),
21
+ "gemini": gemini_service.is_healthy()
22
+ }
23
+
24
+ all_healthy = all(services_status.values())
25
+
26
+ return HealthResponse(
27
+ status="healthy" if all_healthy else "degraded",
28
+ services=services_status,
29
+ error=None
30
+ )
31
+ except Exception as e:
32
+ logger.error(f"Health check failed: {str(e)}")
33
+ return HealthResponse(
34
+ status="unhealthy",
35
+ services={},
36
+ error=str(e)
37
+ )
38
+
39
+ @router.post("/analyze-case", response_model=CaseAnalysisResponse)
40
+ async def analyze_case(request: CaseAnalysisRequest):
41
+ try:
42
+ logger.info(f"Analyzing case with text length: {len(request.caseText)}")
43
+
44
+ # Step 1: Get initial verdict from LegalBERT
45
+ initial_verdict = legal_bert_service.predict_verdict(request.caseText)
46
+ confidence = legal_bert_service.getConfidence(request.caseText)
47
+
48
+ logger.info(f"Initial verdict: {initial_verdict}, confidence: {confidence}")
49
+
50
+ # Step 2: Retrieve supporting legal documents using RAG
51
+ if request.useQueryGeneration:
52
+ support_chunks, search_query = rag_service.retrieveDualSupportChunks(
53
+ request.caseText, gemini_service
54
+ )
55
+ else:
56
+ support_chunks, logs = rag_service.retrieveSupportChunksParallel(request.caseText)
57
+ search_query = request.caseText
58
+
59
+ logger.info(f"Retrieved support chunks from {len(support_chunks)} sources")
60
+
61
+ # Step 3: Evaluate with Gemini AI
62
+ evaluation_result = gemini_service.evaluateCaseWithGemini(
63
+ inputText=request.caseText,
64
+ modelVerdict=initial_verdict,
65
+ confidence=confidence,
66
+ support=support_chunks,
67
+ searchQuery=search_query
68
+ )
69
+
70
+ logger.info(f"Gemini evaluation completed. Final verdict: {evaluation_result.get('finalVerdictByGemini')}")
71
+
72
+ return CaseAnalysisResponse(
73
+ initialVerdict=initial_verdict,
74
+ initialConfidence=confidence,
75
+ finalVerdict=evaluation_result.get("finalVerdictByGemini"),
76
+ verdictChanged=evaluation_result.get("verdictChanged") == "changed",
77
+ searchQuery=search_query,
78
+ geminiExplanation=evaluation_result.get("geminiOutput"),
79
+ supportingSources=support_chunks,
80
+ analysisLogs=evaluation_result
81
+ )
82
+
83
+ except Exception as e:
84
+ logger.error(f"Error analyzing case: {str(e)}")
85
+ raise HTTPException(status_code=500, detail=f"Analysis failed: {str(e)}")
86
+
87
+ @router.get("/models/status")
88
+ async def get_models_status():
89
+ try:
90
+ status = {
91
+ "legalBert": {
92
+ "loaded": legal_bert_service.is_model_loaded(),
93
+ "device": legal_bert_service.get_device()
94
+ },
95
+ "ragIndexes": {
96
+ "loaded": rag_service.areIndexesLoaded(),
97
+ "indexCount": len(rag_service.getLoadedIndexes())
98
+ },
99
+ "gemini": {
100
+ "configured": gemini_service.is_configured()
101
+ }
102
+ }
103
+ return status
104
+ except Exception as e:
105
+ logger.error(f"Error getting models status: {str(e)}")
106
+ raise HTTPException(status_code=500, detail=f"Failed to get models status: {str(e)}")
app/core/__init__.py ADDED
File without changes
app/core/config.py ADDED
@@ -0,0 +1,52 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ import os
2
+ from typing import Optional
3
+ from pydantic_settings import BaseSettings
4
+
5
+ class Settings(BaseSettings):
6
+ # API Configuration
7
+ api_title: str = "Legal RAG Analysis API"
8
+ api_version: str = "1.0.0"
9
+ debug: bool = False
10
+
11
+ # Gemini AI Configuration
12
+ gemini_api_key: str = os.getenv("GEMINI_API_KEY", "")
13
+ gemini_model: str = "gemini-2.5-flash"
14
+
15
+ # Model Paths (to be set when models are added)
16
+ legal_bert_model_path: str = os.getenv("LEGAL_BERT_MODEL_PATH", "./models/legalbert_model")
17
+
18
+ # FAISS Index Paths
19
+ faiss_indexes_base_path: str = os.getenv("FAISS_INDEXES_PATH", "./faiss_indexes")
20
+
21
+ # Index file paths
22
+ constitution_index_path: str = f"{faiss_indexes_base_path}/constitution_bgeLarge.index"
23
+ constitution_chunks_path: str = f"{faiss_indexes_base_path}/constitution_chunks.json"
24
+
25
+ ipc_index_path: str = f"{faiss_indexes_base_path}/ipc_bgeLarge.index"
26
+ ipc_chunks_path: str = f"{faiss_indexes_base_path}/ipc_chunks.json"
27
+
28
+ ipc_case_index_path: str = f"{faiss_indexes_base_path}/ipc_case_flat.index"
29
+ ipc_case_chunks_path: str = f"{faiss_indexes_base_path}/ipc_case_chunks.json"
30
+
31
+ statute_index_path: str = f"{faiss_indexes_base_path}/statute_index.faiss"
32
+ statute_chunks_path: str = f"{faiss_indexes_base_path}/statute_chunks.pkl"
33
+
34
+ qa_index_path: str = f"{faiss_indexes_base_path}/qa_faiss_index.idx"
35
+ qa_chunks_path: str = f"{faiss_indexes_base_path}/qa_text_chunks.json"
36
+
37
+ case_law_index_path: str = f"{faiss_indexes_base_path}/case_faiss.index"
38
+ case_law_chunks_path: str = f"{faiss_indexes_base_path}/case_chunks.pkl"
39
+
40
+ # Sentence Transformer Model
41
+ sentence_transformer_model: str = "BAAI/bge-large-en-v1.5"
42
+
43
+ # RAG Configuration
44
+ top_k_results: int = 5
45
+ max_unique_chunks: int = 10
46
+ confidence_threshold: float = 0.6
47
+
48
+ class Config:
49
+ env_file = ".env"
50
+ case_sensitive = False
51
+
52
+ settings = Settings()
app/models/__init__.py ADDED
File without changes
app/models/schemas.py ADDED
@@ -0,0 +1,43 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ from pydantic import BaseModel, Field
2
+ from typing import Dict, List, Any, Optional
3
+
4
+ class CaseAnalysisRequest(BaseModel):
5
+ caseText: str = Field(..., description="The legal case text to analyze", min_length=10)
6
+ useQueryGeneration: bool = Field(default=True, description="Whether to use Gemini for query generation in RAG")
7
+
8
+ class CaseAnalysisResponse(BaseModel):
9
+ initialVerdict: str = Field(..., description="Initial verdict from LegalBERT model")
10
+ initialConfidence: float = Field(..., description="Confidence score of initial verdict")
11
+ finalVerdict: Optional[str] = Field(None, description="Final verdict after Gemini evaluation")
12
+ verdictChanged: bool = Field(default=False, description="Whether the verdict was changed by Gemini")
13
+ searchQuery: str = Field(..., description="Query used for RAG retrieval")
14
+ geminiExplanation: Optional[str] = Field(None, description="Detailed explanation from Gemini AI")
15
+ supportingSources: Dict[str, List[Any]] = Field(default_factory=dict, description="Retrieved supporting legal documents")
16
+ analysisLogs: Dict[str, Any] = Field(default_factory=dict, description="Detailed analysis logs")
17
+
18
+ class HealthResponse(BaseModel):
19
+ status: str = Field(..., description="Overall health status")
20
+ services: Dict[str, bool] = Field(default_factory=dict, description="Status of individual services")
21
+ error: Optional[str] = Field(None, description="Error message if unhealthy")
22
+
23
+ class VerdictPrediction(BaseModel):
24
+ verdict: str = Field(..., description="Predicted verdict (guilty/not guilty)")
25
+ confidence: float = Field(..., description="Confidence score between 0 and 1")
26
+
27
+ class RAGRetrievalResult(BaseModel):
28
+ query: str = Field(..., description="Query used for retrieval")
29
+ supportChunks: Dict[str, List[Any]] = Field(..., description="Retrieved chunks by category")
30
+ logs: Dict[str, Any] = Field(default_factory=dict, description="Retrieval logs")
31
+
32
+ class GeminiEvaluationRequest(BaseModel):
33
+ inputText: str = Field(..., description="Original case text")
34
+ modelVerdict: str = Field(..., description="Initial model verdict")
35
+ confidence: float = Field(..., description="Confidence of initial verdict")
36
+ support: Dict[str, List[Any]] = Field(..., description="Supporting legal documents")
37
+ searchQuery: Optional[str] = Field(None, description="Search query used")
38
+
39
+ class GeminiEvaluationResponse(BaseModel):
40
+ finalVerdict: Optional[str] = Field(None, description="Final verdict from Gemini")
41
+ verdictChanged: str = Field(..., description="Whether verdict was changed")
42
+ explanation: str = Field(..., description="Detailed legal explanation")
43
+ relevantLaws: List[str] = Field(default_factory=list, description="Relevant laws identified")
app/services/__init__.py ADDED
File without changes
app/services/gemini_service.py ADDED
@@ -0,0 +1,199 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ import re
2
+ from typing import Dict, List, Any, Optional
3
+ from google import genai
4
+ from google.genai import types
5
+ from app.core.config import settings
6
+ import logging
7
+
8
+ logger = logging.getLogger(__name__)
9
+
10
+ class GeminiService:
11
+ def __init__(self):
12
+ self.client = None
13
+ self._initialize_client()
14
+
15
+ def _initialize_client(self):
16
+ try:
17
+ if settings.gemini_api_key:
18
+ self.client = genai.Client(api_key=settings.gemini_api_key)
19
+ logger.info("Gemini client initialized successfully")
20
+ else:
21
+ logger.warning("Gemini API key not provided")
22
+ except Exception as e:
23
+ logger.error(f"Failed to initialize Gemini client: {str(e)}")
24
+
25
+ def generateSearchQueryFromCase(self, caseFacts: str, verbose: bool = False) -> str:
26
+ if not self.client:
27
+ raise ValueError("Gemini client not initialized")
28
+
29
+ prompt = f"""
30
+ You are a legal assistant for a retrieval system based on Indian criminal law.
31
+
32
+ Given the case facts below, generate a **concise and focused search query** with **only the most relevant legal keywords**. These should include:
33
+
34
+ - Specific **IPC sections**
35
+ - Core **legal concepts** (e.g., "right of private defence", "criminal breach of trust")
36
+ - **Crime type** (e.g., "assault", "corruption")
37
+ - Any relevant **procedural issue** (e.g., "absence of intent", "lack of evidence")
38
+
39
+ Do **not** include:
40
+ - Full sentences
41
+ - Personal names
42
+ - Generic or vague words (e.g., "man", "incident", "case", "situation")
43
+
44
+ Keep the query under **20 words**. Separate terms by commas if needed. Optimize for legal document search.
45
+
46
+ Case Facts:
47
+ \"\"\"{caseFacts}\"\"\"
48
+
49
+ Return only the search query, no explanation or prefix:
50
+ """
51
+
52
+ try:
53
+ response = self.client.models.generate_content(
54
+ model=settings.gemini_model,
55
+ contents=prompt
56
+ )
57
+
58
+ if response.text:
59
+ query = response.text.replace("Search Query:", "").strip().strip('"').replace("\n", "")
60
+ else:
61
+ query = caseFacts[:50] # Fallback to first 50 chars
62
+
63
+ if verbose:
64
+ logger.info(f"Generated RAG Query: {query}")
65
+
66
+ return query
67
+ except Exception as e:
68
+ logger.error(f"Error generating search query: {str(e)}")
69
+ raise ValueError(f"Search query generation failed: {str(e)}")
70
+
71
+ def _build_gemini_prompt(self, input_text: str, model_verdict: str, confidence: float,
72
+ support: Dict[str, List], query: Optional[str] = None) -> str:
73
+ verdict_outcome = "a loss for the person" if model_verdict.lower() == "guilty" else "in favor of the person"
74
+
75
+ prompt = f"""You are a judge evaluating a legal dispute under Indian law.
76
+
77
+ ### Case Facts:
78
+ {input_text}
79
+
80
+ ### Initial Model Verdict:
81
+ {model_verdict.upper()} (Confidence: {confidence * 100:.2f}%)
82
+ This verdict is interpreted as {verdict_outcome}.
83
+ """
84
+
85
+ if query:
86
+ prompt += f"\n### Legal Query Used:\n{query}\n"
87
+
88
+ prompt += "\n---\n\n### Legal References Retrieved:\n\n#### Constitution Articles (Top 5):\n"
89
+ for i, art in enumerate(support.get("constitution", [])):
90
+ prompt += f"- {i+1}. {str(art)}\n"
91
+
92
+ prompt += "\n#### IPC Sections (Top 5):\n"
93
+ for i, sec in enumerate(support.get("ipcSections", [])):
94
+ prompt += f"- {i+1}. {str(sec)}\n"
95
+
96
+ prompt += "\n#### IPC Case Law (Top 5):\n"
97
+ for i, case in enumerate(support.get("ipcCase", [])):
98
+ prompt += f"- {i+1}. {str(case)}\n"
99
+
100
+ prompt += "\n#### Statutes (Top 5):\n"
101
+ for i, stat in enumerate(support.get("statutes", [])):
102
+ prompt += f"- {i+1}. {str(stat)}\n"
103
+
104
+ prompt += "\n#### QA Texts (Top 5):\n"
105
+ for i, qa in enumerate(support.get("qaTexts", [])):
106
+ prompt += f"- {i+1}. {str(qa)}\n"
107
+
108
+ prompt += "\n#### General Case Law (Top 5):\n"
109
+ for i, gcase in enumerate(support.get("caseLaw", [])):
110
+ prompt += f"- {i+1}. {str(gcase)}\n"
111
+
112
+ prompt += f"""
113
+
114
+ ---
115
+
116
+ ### Instructions to the Judge (You):
117
+
118
+ 1. Review the legal materials provided:
119
+ - Identify which Constitution articles, IPC sections, statutes, and case laws are relevant to the facts.
120
+ - Also note and explain which retrieved references are **not applicable** or irrelevant.
121
+
122
+ 2. If relevant past cases appear in the retrieved materials, summarize them and analyze whether they support or contradict the model's verdict.
123
+
124
+ 3. Using the above, assess the model's prediction:
125
+ - If confidence is below {settings.confidence_threshold * 100}%, you may revise or retain it.
126
+ - If confidence is {settings.confidence_threshold * 100}% or higher, retain unless clear legal grounds exist to challenge it.
127
+
128
+ 4. Provide a thorough and formal legal explanation that:
129
+ - Justifies the final decision using legal logic
130
+ - Cites relevant IPCs, constitutional provisions, statutes, and precedents
131
+ - Explains any reasoning for overriding the model's prediction, if applicable
132
+
133
+ 5. Conclude with the following lines, formatted as shown:
134
+
135
+ Final Verdict: Guilty or Not Guilty
136
+ Verdict Changed: Yes or No
137
+
138
+ Respond in the tone of a formal Indian judge. Your explanation should reflect reasoning, neutrality, and respect for legal procedure.
139
+ """
140
+ return prompt
141
+
142
+ def _extract_final_verdict(self, gemini_output: str) -> tuple[Optional[str], str]:
143
+ verdict_match = re.search(r"final verdict\s*[:\-]\s*(guilty|not guilty)", gemini_output, re.IGNORECASE)
144
+ changed_match = re.search(r"verdict changed\s*[:\-]\s*(yes|no)", gemini_output, re.IGNORECASE)
145
+
146
+ final_verdict = verdict_match.group(1).lower() if verdict_match else None
147
+ verdict_changed = "changed" if changed_match and changed_match.group(1).lower() == "yes" else "not changed"
148
+
149
+ return final_verdict, verdict_changed
150
+
151
+ def evaluateCaseWithGemini(self, inputText: str, modelVerdict: str, confidence: float,
152
+ support: Dict[str, List], searchQuery: str) -> Dict[str, Any]:
153
+ if not self.client:
154
+ raise ValueError("Gemini client not initialized")
155
+
156
+ try:
157
+ prompt = self._build_gemini_prompt(inputText, modelVerdict, confidence, support, searchQuery)
158
+
159
+ response = self.client.models.generate_content(
160
+ model=settings.gemini_model,
161
+ contents=prompt
162
+ )
163
+
164
+ geminiOutput = response.text if response.text else "No response from Gemini"
165
+ finalVerdict, verdictChanged = self._extract_final_verdict(geminiOutput)
166
+
167
+ logs = {
168
+ "inputText": inputText,
169
+ "modelVerdict": modelVerdict,
170
+ "confidence": confidence,
171
+ "support": support,
172
+ "promptToGemini": prompt,
173
+ "geminiOutput": geminiOutput,
174
+ "finalVerdictByGemini": finalVerdict,
175
+ "verdictChanged": verdictChanged,
176
+ "ragSearchQuery": searchQuery
177
+ }
178
+
179
+ return logs
180
+ except Exception as e:
181
+ logger.error(f"Error in Gemini evaluation: {str(e)}")
182
+ return {
183
+ "error": str(e),
184
+ "inputText": inputText,
185
+ "modelVerdict": modelVerdict,
186
+ "confidence": confidence,
187
+ "ragSearchQuery": searchQuery,
188
+ "support": None,
189
+ "promptToGemini": None,
190
+ "geminiOutput": None,
191
+ "finalVerdictByGemini": None,
192
+ "verdictChanged": None
193
+ }
194
+
195
+ def is_configured(self) -> bool:
196
+ return self.client is not None
197
+
198
+ def is_healthy(self) -> bool:
199
+ return self.is_configured()
app/services/legal_bert.py ADDED
@@ -0,0 +1,55 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ from app.core.config import settings
2
+ import logging
3
+ import os
4
+
5
+ logger = logging.getLogger(__name__)
6
+
7
+ class LegalBertService:
8
+ def __init__(self):
9
+ self.device = "cpu"
10
+ self.tokenizer = None
11
+ self.model = None
12
+ self._load_model()
13
+
14
+ def _load_model(self):
15
+ try:
16
+ if os.path.exists(settings.legal_bert_model_path):
17
+ logger.info(f"LegalBERT model path found: {settings.legal_bert_model_path}")
18
+ # TODO: Load actual model when torch/transformers are available
19
+ logger.info("Model loading placeholder - install torch and transformers to enable")
20
+ else:
21
+ logger.warning(f"LegalBERT model path does not exist: {settings.legal_bert_model_path}")
22
+ logger.info("Model will be loaded when files are available")
23
+ except Exception as e:
24
+ logger.error(f"Failed to load LegalBERT model: {str(e)}")
25
+
26
+ def predict_verdict(self, inputText: str) -> str:
27
+ if not self.is_model_loaded():
28
+ # Return placeholder prediction for development
29
+ logger.info("Using placeholder verdict prediction")
30
+ import hashlib
31
+ text_hash = int(hashlib.md5(inputText.encode()).hexdigest(), 16)
32
+ return "guilty" if text_hash % 2 == 1 else "not guilty"
33
+
34
+ # TODO: Implement actual prediction when model is loaded
35
+ return "not guilty"
36
+
37
+ def getConfidence(self, inputText: str) -> float:
38
+ if not self.is_model_loaded():
39
+ # Return placeholder confidence for development
40
+ logger.info("Using placeholder confidence score")
41
+ import hashlib
42
+ text_hash = int(hashlib.md5(inputText.encode()).hexdigest(), 16)
43
+ return 0.5 + (text_hash % 100) / 200.0 # Returns 0.5-0.99
44
+
45
+ # TODO: Implement actual confidence when model is loaded
46
+ return 0.75
47
+
48
+ def is_model_loaded(self) -> bool:
49
+ return False # Always False until actual model is loaded
50
+
51
+ def get_device(self) -> str:
52
+ return str(self.device)
53
+
54
+ def is_healthy(self) -> bool:
55
+ return True # Always healthy for placeholder implementation
app/services/rag_service.py ADDED
@@ -0,0 +1,108 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ import json
2
+ import os
3
+ from concurrent.futures import ThreadPoolExecutor
4
+ from typing import Dict, List, Any, Tuple
5
+ from app.core.config import settings
6
+ import logging
7
+
8
+ logger = logging.getLogger(__name__)
9
+
10
+ class RAGService:
11
+ def __init__(self):
12
+ self.encoder = None
13
+ self.preloadedIndexes = {}
14
+ self._initialize_encoder()
15
+ self._load_indexes()
16
+
17
+ def _initialize_encoder(self):
18
+ try:
19
+ logger.info(f"Sentence transformer placeholder initialized")
20
+ # TODO: Initialize actual sentence transformer when dependencies are available
21
+ self.encoder = "placeholder"
22
+ except Exception as e:
23
+ logger.error(f"Failed to initialize encoder: {str(e)}")
24
+
25
+ def _load_faiss_index_and_chunks(self, indexPath: str, chunkPath: str) -> Tuple[Any, List]:
26
+ try:
27
+ if not os.path.exists(indexPath) or not os.path.exists(chunkPath):
28
+ logger.warning(f"Missing files: {indexPath} or {chunkPath}")
29
+ return None, []
30
+
31
+ # TODO: Load actual FAISS index when faiss-cpu is available
32
+
33
+ if chunkPath.endswith('.pkl'):
34
+ logger.info(f"Placeholder for pickle file: {chunkPath}")
35
+ chunks = []
36
+ else:
37
+ try:
38
+ with open(chunkPath, 'r', encoding='utf-8') as f:
39
+ chunks = json.load(f)
40
+ except:
41
+ chunks = []
42
+
43
+ logger.info(f"Loaded index placeholder from {indexPath} with {len(chunks)} chunks")
44
+ return "placeholder_index", chunks
45
+ except Exception as e:
46
+ logger.error(f"Failed to load index {indexPath}: {str(e)}")
47
+ return None, []
48
+
49
+ def _load_indexes(self):
50
+ indexConfigs = {
51
+ "constitution": (settings.constitution_index_path, settings.constitution_chunks_path),
52
+ "ipcSections": (settings.ipc_index_path, settings.ipc_chunks_path),
53
+ "ipcCase": (settings.ipc_case_index_path, settings.ipc_case_chunks_path),
54
+ "statutes": (settings.statute_index_path, settings.statute_chunks_path),
55
+ "qaTexts": (settings.qa_index_path, settings.qa_chunks_path),
56
+ "caseLaw": (settings.case_law_index_path, settings.case_law_chunks_path)
57
+ }
58
+
59
+ for name, (indexPath, chunkPath) in indexConfigs.items():
60
+ indexData = self._load_faiss_index_and_chunks(indexPath, chunkPath)
61
+ if indexData[0] is not None:
62
+ self.preloadedIndexes[name] = indexData
63
+ logger.info(f"Successfully loaded {name} index placeholder")
64
+ else:
65
+ logger.warning(f"Failed to load {name} index")
66
+
67
+ def retrieveSupportChunksParallel(self, inputText: str) -> Tuple[Dict[str, List], Dict]:
68
+ logger.info("Using placeholder RAG retrieval")
69
+
70
+ logs = {"query": inputText}
71
+
72
+ # Return placeholder support chunks
73
+ support = {}
74
+ for name in ["constitution", "ipcSections", "ipcCase", "statutes", "qaTexts", "caseLaw"]:
75
+ if name in self.preloadedIndexes:
76
+ _, chunks = self.preloadedIndexes[name]
77
+ support[name] = chunks[:5] if chunks else []
78
+ else:
79
+ support[name] = []
80
+
81
+ logs["supportChunksUsed"] = str(support)
82
+ return support, logs
83
+
84
+ def retrieveDualSupportChunks(self, inputText: str, geminiService) -> Tuple[Dict[str, List], str]:
85
+ try:
86
+ # Generate search query using Gemini
87
+ geminiQuery = None
88
+ try:
89
+ geminiQuery = geminiService.generateSearchQueryFromCase(inputText)
90
+ except Exception as e:
91
+ logger.warning(f"Failed to generate Gemini query: {str(e)}")
92
+
93
+ # Use placeholder retrieval
94
+ support, _ = self.retrieveSupportChunksParallel(inputText)
95
+
96
+ return support, geminiQuery or inputText
97
+ except Exception as e:
98
+ logger.error(f"Error in dual support retrieval: {str(e)}")
99
+ raise ValueError(f"Dual support retrieval failed: {str(e)}")
100
+
101
+ def areIndexesLoaded(self) -> bool:
102
+ return len(self.preloadedIndexes) > 0
103
+
104
+ def getLoadedIndexes(self) -> List[str]:
105
+ return list(self.preloadedIndexes.keys())
106
+
107
+ def is_healthy(self) -> bool:
108
+ return self.encoder is not None
faiss_indexes/README.md ADDED
@@ -0,0 +1,48 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ # FAISS Indexes Directory
2
+
3
+ ## Required Index Files
4
+
5
+ Add the following FAISS indexes and their corresponding chunk files:
6
+
7
+ ### Constitution
8
+ - `constitution_bgeLarge.index` - FAISS index for constitution articles
9
+ - `constitution_chunks.json` - Text chunks for constitution articles
10
+
11
+ ### IPC Sections
12
+ - `ipc_bgeLarge.index` - FAISS index for IPC sections
13
+ - `ipc_chunks.json` - Text chunks for IPC sections
14
+
15
+ ### IPC Case Law
16
+ - `ipc_case_flat.index` - FAISS index for IPC case law
17
+ - `ipc_case_chunks.json` - Text chunks for IPC cases
18
+
19
+ ### Statutes
20
+ - `statute_index.faiss` - FAISS index for legal statutes
21
+ - `statute_chunks.pkl` - Pickled chunks for statutes
22
+
23
+ ### Q&A Texts
24
+ - `qa_faiss_index.idx` - FAISS index for legal Q&A
25
+ - `qa_text_chunks.json` - Text chunks for Q&A content
26
+
27
+ ### General Case Law
28
+ - `case_faiss.index` - FAISS index for general case law
29
+ - `case_chunks.pkl` - Pickled chunks for case law
30
+
31
+ ## Installation
32
+
33
+ Once you have the index files:
34
+
35
+ 1. Install required dependencies:
36
+ ```bash
37
+ pip install faiss-cpu sentence-transformers numpy
38
+ ```
39
+
40
+ 2. The RAGService will automatically detect and load all available indexes when the server starts.
41
+
42
+ ## Index Requirements
43
+
44
+ - Built using sentence transformer embeddings (BAAI/bge-large-en-v1.5)
45
+ - Compatible with FAISS CPU implementation
46
+ - Chunk files should contain legal text snippets
47
+ - JSON files should contain arrays of text chunks
48
+ - PKL files should contain pickled chunk data
main.py ADDED
@@ -0,0 +1,41 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ import uvicorn
2
+ from fastapi import FastAPI
3
+ from fastapi.middleware.cors import CORSMiddleware
4
+ from app.api.routes import router
5
+ from app.core.config import settings
6
+ import logging
7
+
8
+ logging.basicConfig(level=logging.INFO)
9
+ logger = logging.getLogger(__name__)
10
+
11
+ app = FastAPI(
12
+ title="Legal RAG Analysis API",
13
+ description="FastAPI backend for legal case analysis using RAG system with LegalBERT predictions and Gemini AI evaluation",
14
+ version="1.0.0"
15
+ )
16
+
17
+ app.add_middleware(
18
+ CORSMiddleware,
19
+ allow_origins=["*"],
20
+ allow_credentials=True,
21
+ allow_methods=["*"],
22
+ allow_headers=["*"],
23
+ )
24
+
25
+ app.include_router(router, prefix="/api/v1")
26
+
27
+ @app.get("/")
28
+ async def root():
29
+ return {"message": "Legal RAG Analysis API", "version": "1.0.0"}
30
+
31
+ @app.get("/health")
32
+ async def health_check():
33
+ return {"status": "healthy", "message": "API is running"}
34
+
35
+ if __name__ == "__main__":
36
+ uvicorn.run(
37
+ "main:app",
38
+ host="0.0.0.0",
39
+ port=5000,
40
+ reload=True
41
+ )
models/README.md ADDED
@@ -0,0 +1,35 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ # Models Directory
2
+
3
+ ## LegalBERT Model
4
+
5
+ Place your LegalBERT model files in the `legalbert_model/` subdirectory:
6
+
7
+ ```
8
+ models/
9
+ └── legalbert_model/
10
+ ├── config.json
11
+ ├── pytorch_model.bin
12
+ ├── tokenizer_config.json
13
+ ├── tokenizer.json
14
+ └── vocab.txt
15
+ ```
16
+
17
+ The model should be compatible with Hugging Face transformers library and fine-tuned for legal text classification.
18
+
19
+ ## Installation
20
+
21
+ Once you have the model files:
22
+
23
+ 1. Install required dependencies:
24
+ ```bash
25
+ pip install torch transformers
26
+ ```
27
+
28
+ 2. The LegalBertService will automatically detect and load the model when the server starts.
29
+
30
+ ## Model Requirements
31
+
32
+ - Should output binary classification (guilty/not guilty)
33
+ - Compatible with AutoModelForSequenceClassification
34
+ - Supports text truncation and padding
35
+ - Returns logits that can be converted to probabilities
pyproject.toml ADDED
@@ -0,0 +1,1134 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ [project]
2
+ name = "repl-nix-workspace"
3
+ version = "0.1.0"
4
+ description = "Add your description here"
5
+ requires-python = ">=3.11"
6
+ dependencies = [
7
+ "fastapi>=0.116.1",
8
+ "google-genai>=1.27.0",
9
+ "pydantic>=2.11.7",
10
+ "pydantic-settings>=2.10.1",
11
+ "uvicorn>=0.35.0",
12
+ ]
13
+
14
+ [[tool.uv.index]]
15
+ explicit = true
16
+ name = "pytorch-cpu"
17
+ url = "https://download.pytorch.org/whl/cpu"
18
+
19
+ [tool.uv.sources]
20
+ AA-module = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
21
+ ABlooper = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
22
+ AnalysisG = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
23
+ AutoRAG = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
24
+ BERTeam = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
25
+ BxTorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
26
+ Byaldi = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
27
+ CALM-Pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
28
+ COPEX-high-rate-compression-quality-metrics = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
29
+ CityLearn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
30
+ CoCa-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
31
+ CoLT5-attention = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
32
+ ComfyUI-EasyNodes = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
33
+ Crawl4AI = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
34
+ DALL-E = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
35
+ DI-toolkit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
36
+ DatasetRising = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
37
+ DeepCache = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
38
+ DeepMatter = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
39
+ Draugr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
40
+ ESRNN = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
41
+ En-transformer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
42
+ ExpoSeq = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
43
+ FLAML = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
44
+ FSRS-Optimizer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
45
+ GANDLF = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
46
+ GQLAlchemy = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
47
+ GhostScan = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
48
+ GraKeL = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
49
+ HEBO = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
50
+ IOPaint = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
51
+ ISLP = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
52
+ InvokeAI = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
53
+ JAEN = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
54
+ KapoorLabs-Lightning = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
55
+ LightAutoML = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
56
+ LingerGRN = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
57
+ MMEdu = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
58
+ MRzeroCore = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
59
+ Modeva = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
60
+ NeuralFoil = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
61
+ NiMARE = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
62
+ NinjaTools = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
63
+ OpenHosta = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
64
+ OpenNMT-py = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
65
+ POT = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
66
+ PVNet = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
67
+ PaLM-rlhf-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
68
+ PepperPepper = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
69
+ PiML = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
70
+ Poutyne = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
71
+ QNCP = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
72
+ RAGatouille = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
73
+ RareGO = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
74
+ RealtimeSTT = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
75
+ RelevanceAI-Workflows-Core = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
76
+ Resemblyzer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
77
+ ScandEval = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
78
+ Simba-UW-tf-dev = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
79
+ SwissArmyTransformer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
80
+ TPOT = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
81
+ TTS = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
82
+ TorchCRF = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
83
+ TotalSegmentator = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
84
+ UtilsRL = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
85
+ WhisperSpeech = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
86
+ XAISuite = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
87
+ a-unet = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
88
+ a5dev = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
89
+ accelerate = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
90
+ accelerated-scan = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
91
+ accern-xyme = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
92
+ achatbot = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
93
+ acids-rave = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
94
+ actorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
95
+ acvl-utils = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
96
+ adabelief-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
97
+ adam-atan2-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
98
+ adan-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
99
+ adapters = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
100
+ admin-torch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
101
+ adtoolbox = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
102
+ adversarial-robustness-toolbox = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
103
+ aeiou = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
104
+ aeon = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
105
+ africanwhisper = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
106
+ ag-llama-api = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
107
+ agentdojo = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
108
+ agilerl = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
109
+ ai-edge-torch-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
110
+ ai-parrot = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
111
+ ai-python = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
112
+ ai-transform = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
113
+ ai2-olmo = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
114
+ ai2-olmo-core = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
115
+ ai2-tango = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
116
+ aicmder = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
117
+ aider-chat = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
118
+ aider-chat-x = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
119
+ aif360 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
120
+ aihwkit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
121
+ aimodelshare = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
122
+ airllm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
123
+ airtestProject = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
124
+ airunner = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
125
+ aisak = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
126
+ aislib = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
127
+ aisquared = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
128
+ aistore = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
129
+ aithree = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
130
+ akasha-terminal = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
131
+ alibi = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
132
+ alibi-detect = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
133
+ alignn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
134
+ all-clip = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
135
+ allennlp = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
136
+ allennlp-models = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
137
+ allennlp-pvt-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
138
+ allophant = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
139
+ allosaurus = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
140
+ aloy = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
141
+ alpaca-eval = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
142
+ alphafold2-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
143
+ alphafold3-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
144
+ alphamed-federated = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
145
+ alphawave = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
146
+ amazon-braket-pennylane-plugin = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
147
+ amazon-photos = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
148
+ anemoi-graphs = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
149
+ anemoi-models = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
150
+ anomalib = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
151
+ apache-beam = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
152
+ apache-tvm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
153
+ aperturedb = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
154
+ aphrodite-engine = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
155
+ aqlm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
156
+ arcAGI2024 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
157
+ archisound = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
158
+ argbind = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
159
+ arize = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
160
+ arm-pytorch-utilities = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
161
+ array-api-compat = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
162
+ arus = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
163
+ assert-llm-tools = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
164
+ asteroid = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
165
+ asteroid-filterbanks = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
166
+ astra-llm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
167
+ astrovision = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
168
+ atomate2 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
169
+ attacut = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
170
+ audio-diffusion-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
171
+ audio-encoders-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
172
+ audio-separator = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
173
+ audiocraft = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
174
+ audiolm-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
175
+ auralis = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
176
+ auraloss = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
177
+ auto-gptq = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
178
+ autoawq = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
179
+ autoawq-kernels = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
180
+ "autogluon.multimodal" = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
181
+ "autogluon.tabular" = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
182
+ "autogluon.timeseries" = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
183
+ autotrain-advanced = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
184
+ avdeepfake1m = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
185
+ aws-fortuna = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
186
+ ax-platform = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
187
+ azureml-automl-dnn-vision = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
188
+ azureml-contrib-automl-dnn-forecasting = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
189
+ azureml-evaluate-mlflow = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
190
+ azureml-metrics = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
191
+ azureml-train-automl = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
192
+ b2bTools = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
193
+ backpack-for-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
194
+ balrog-nle = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
195
+ batch-face = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
196
+ batchalign = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
197
+ batchgeneratorsv2 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
198
+ batchtensor = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
199
+ bbrl = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
200
+ benchpots = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
201
+ bent = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
202
+ bert-score = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
203
+ bertopic = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
204
+ bertviz = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
205
+ bestOf = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
206
+ betty-ml = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
207
+ big-sleep = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
208
+ bigdl-core-cpp = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
209
+ bigdl-core-npu = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
210
+ bigdl-llm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
211
+ bigdl-nano = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
212
+ "bioimageio.core" = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
213
+ bitfount = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
214
+ bitsandbytes = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
215
+ bittensor = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
216
+ bittensor-cli = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
217
+ blackboxopt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
218
+ blanc = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
219
+ blindai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
220
+ bm25-pt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
221
+ boltz = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
222
+ botorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
223
+ boxmot = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
224
+ brainchain = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
225
+ braindecode = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
226
+ brevitas = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
227
+ briton = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
228
+ browsergym-visualwebarena = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
229
+ buzz-captions = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
230
+ byotrack = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
231
+ byzerllm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
232
+ c4v-py = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
233
+ calflops = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
234
+ came-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
235
+ camel-ai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
236
+ camel-tools = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
237
+ cannai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
238
+ captum = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
239
+ carte-ai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
240
+ carvekit-colab = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
241
+ catalyst = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
242
+ causalml = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
243
+ causalnex = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
244
+ causy = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
245
+ cbrkit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
246
+ cca-zoo = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
247
+ cdp-backend = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
248
+ cellacdc = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
249
+ cellfinder = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
250
+ cellpose = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
251
+ cellxgene-census = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
252
+ chattts = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
253
+ chemprop = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
254
+ chgnet = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
255
+ chitra = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
256
+ circuitsvis = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
257
+ cjm-yolox-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
258
+ clarinpl-embeddings = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
259
+ class-resolver = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
260
+ classifier-free-guidance-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
261
+ classiq = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
262
+ classy-core = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
263
+ clean-fid = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
264
+ cleanvision = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
265
+ clip-anytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
266
+ clip-benchmark = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
267
+ clip-by-openai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
268
+ clip-interrogator = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
269
+ clip-retrieval = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
270
+ cltk = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
271
+ clu = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
272
+ clusterops = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
273
+ cnocr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
274
+ cnstd = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
275
+ coba = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
276
+ cofi = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
277
+ colbert-ai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
278
+ colpali-engine = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
279
+ compel = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
280
+ composabl-ray = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
281
+ composabl-ray-dev = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
282
+ composabl-train = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
283
+ composabl-train-dev = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
284
+ composer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
285
+ compressai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
286
+ compressed-tensors = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
287
+ compressed-tensors-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
288
+ concrete-python = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
289
+ confit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
290
+ conformer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
291
+ contextualSpellCheck = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
292
+ continual-inference = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
293
+ controlnet-aux = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
294
+ convokit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
295
+ coola = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
296
+ coqui-tts = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
297
+ coqui-tts-trainer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
298
+ craft-text-detector = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
299
+ creme = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
300
+ crocodile = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
301
+ crowd-kit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
302
+ cryoSPHERE = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
303
+ csle-common = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
304
+ csle-system-identification = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
305
+ ctgan = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
306
+ curated-transformers = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
307
+ cut-cross-entropy = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
308
+ cvat-sdk = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
309
+ cybertask = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
310
+ d3rlpy = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
311
+ dalle-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
312
+ dalle2-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
313
+ danila-lib = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
314
+ danling = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
315
+ darts = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
316
+ darwin-py = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
317
+ data-gradients = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
318
+ datachain = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
319
+ dataclass-array = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
320
+ dataeval = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
321
+ datarobot-drum = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
322
+ datarobotx = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
323
+ datasets = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
324
+ datumaro = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
325
+ dctorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
326
+ deep-utils = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
327
+ deepchecks = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
328
+ deepchem = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
329
+ deepctr-torch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
330
+ deepecho = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
331
+ deepepochs = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
332
+ deepforest = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
333
+ deeplabcut = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
334
+ deepmd-kit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
335
+ deepmultilingualpunctuation = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
336
+ deepparse = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
337
+ deeprobust = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
338
+ deepsparse = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
339
+ deepsparse-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
340
+ deepspeed = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
341
+ denoising-diffusion-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
342
+ descript-audio-codec = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
343
+ descript-audiotools = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
344
+ detecto = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
345
+ detoxify = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
346
+ dgenerate = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
347
+ dghs-imgutils = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
348
+ dgl = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
349
+ dialogy = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
350
+ dice-ml = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
351
+ diffgram = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
352
+ diffq = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
353
+ diffusers = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
354
+ distilabel = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
355
+ distrifuser = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
356
+ dnikit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
357
+ docarray = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
358
+ doclayout-yolo = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
359
+ docling-ibm-models = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
360
+ docquery = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
361
+ domino-code-assist = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
362
+ dreamsim = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
363
+ dropblock = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
364
+ druida = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
365
+ dvclive = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
366
+ e2-tts-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
367
+ e2cnn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
368
+ e3nn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
369
+ easyocr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
370
+ ebtorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
371
+ ecallisto-ng = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
372
+ edsnlp = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
373
+ effdet = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
374
+ einx = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
375
+ eir-dl = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
376
+ eis1600 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
377
+ eland = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
378
+ ema-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
379
+ embedchain = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
380
+ enformer-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
381
+ entmax = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
382
+ esm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
383
+ espaloma-charge = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
384
+ espnet = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
385
+ etils = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
386
+ etna = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
387
+ evadb = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
388
+ evalscope = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
389
+ evaluate = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
390
+ exllamav2 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
391
+ extractable = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
392
+ face-alignment = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
393
+ facenet-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
394
+ facexlib = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
395
+ fair-esm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
396
+ fairseq = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
397
+ fairseq2 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
398
+ fairseq2n = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
399
+ faker-file = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
400
+ farm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
401
+ fast-bert = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
402
+ fast-pytorch-kmeans = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
403
+ fastai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
404
+ fastcore = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
405
+ fastestimator-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
406
+ fasttreeshap = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
407
+ fedml = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
408
+ felupe = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
409
+ femr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
410
+ fer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
411
+ fft-conv-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
412
+ fickling = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
413
+ fireworks-ai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
414
+ flair = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
415
+ flashrag-dev = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
416
+ flax = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
417
+ flexgen = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
418
+ flgo = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
419
+ flopth = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
420
+ flowcept = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
421
+ flytekitplugins-kfpytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
422
+ flytekitplugins-onnxpytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
423
+ fmbench = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
424
+ focal-frequency-loss = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
425
+ foldedtensor = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
426
+ fractal-tasks-core = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
427
+ freegenius = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
428
+ freqtrade = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
429
+ fschat = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
430
+ funasr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
431
+ functorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
432
+ funlbm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
433
+ funsor = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
434
+ galore-torch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
435
+ garak = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
436
+ garf = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
437
+ gateloop-transformer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
438
+ geffnet = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
439
+ genutility = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
440
+ gfpgan = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
441
+ gigagan-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
442
+ gin-config = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
443
+ glasflow = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
444
+ gliner = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
445
+ gluonts = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
446
+ gmft = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
447
+ google-cloud-aiplatform = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
448
+ gpforecaster = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
449
+ gpt3discord = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
450
+ gpytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
451
+ grad-cam = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
452
+ graph-weather = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
453
+ graphistry = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
454
+ gravitorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
455
+ gretel-synthetics = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
456
+ gsplat = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
457
+ guardrails-ai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
458
+ guidance = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
459
+ gymnasium = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
460
+ hanlp = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
461
+ happytransformer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
462
+ hbutils = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
463
+ heavyball = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
464
+ hezar = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
465
+ hf-deepali = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
466
+ hf-doc-builder = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
467
+ higher = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
468
+ hjxdl = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
469
+ hkkang-utils = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
470
+ hordelib = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
471
+ hpsv2 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
472
+ huggingface-hub = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
473
+ hummingbird-ml = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
474
+ hvae-backbone = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
475
+ hya = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
476
+ hypothesis-torch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
477
+ ibm-metrics-plugin = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
478
+ ibm-watson-machine-learning = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
479
+ ibm-watsonx-ai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
480
+ icetk = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
481
+ icevision = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
482
+ iden = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
483
+ idvpackage = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
484
+ iglovikov-helper-functions = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
485
+ imagededup = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
486
+ imagen-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
487
+ imaginAIry = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
488
+ img2vec-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
489
+ incendio = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
490
+ inference = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
491
+ inference-gpu = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
492
+ infinity-emb = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
493
+ info-nce-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
494
+ infoapps-mlops-sdk = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
495
+ instructlab = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
496
+ instructlab-dolomite = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
497
+ instructlab-eval = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
498
+ instructlab-sdg = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
499
+ instructlab-training = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
500
+ invisible-watermark = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
501
+ iobm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
502
+ ipex-llm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
503
+ iree-turbine = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
504
+ irisml = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
505
+ irisml-tasks-azure-openai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
506
+ irisml-tasks-torchvision = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
507
+ irisml-tasks-training = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
508
+ item-matching = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
509
+ ivadomed = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
510
+ jaqpotpy = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
511
+ jina = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
512
+ judo = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
513
+ junky = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
514
+ k-diffusion = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
515
+ k1lib = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
516
+ k2 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
517
+ kappadata = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
518
+ kappamodules = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
519
+ karbonn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
520
+ kats = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
521
+ kbnf = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
522
+ kedro-datasets = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
523
+ keybert = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
524
+ keytotext = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
525
+ khoj = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
526
+ kiui = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
527
+ konfuzio-sdk = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
528
+ kornia = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
529
+ kornia-moons = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
530
+ kraken = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
531
+ kwarray = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
532
+ kwimage = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
533
+ labml-nn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
534
+ lagent = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
535
+ laion-clap = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
536
+ lale = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
537
+ lama-cleaner = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
538
+ lancedb = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
539
+ langcheck = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
540
+ langkit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
541
+ langroid = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
542
+ langtest = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
543
+ layoutparser = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
544
+ ldp = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
545
+ leafmap = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
546
+ leap-ie = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
547
+ leibniz = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
548
+ leptonai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
549
+ letmedoit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
550
+ lhotse = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
551
+ lib310 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
552
+ libpecos = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
553
+ librec-auto = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
554
+ libretranslate = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
555
+ liger-kernel = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
556
+ liger-kernel-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
557
+ lightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
558
+ lightning = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
559
+ lightning-bolts = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
560
+ lightning-fabric = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
561
+ lightning-habana = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
562
+ lightning-lite = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
563
+ lightrag = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
564
+ lightweight-gan = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
565
+ lightwood = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
566
+ linear-attention-transformer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
567
+ linear-operator = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
568
+ linformer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
569
+ linformer-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
570
+ liom-toolkit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
571
+ lion-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
572
+ lit-nlp = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
573
+ litdata = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
574
+ litelama = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
575
+ litgpt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
576
+ llama-index-embeddings-adapter = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
577
+ llama-index-embeddings-clip = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
578
+ llama-index-embeddings-instructor = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
579
+ llama-index-llms-huggingface = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
580
+ llama-index-postprocessor-colbert-rerank = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
581
+ llm-blender = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
582
+ llm-foundry = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
583
+ llm-guard = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
584
+ llm-rs = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
585
+ llm2vec = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
586
+ llmcompressor = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
587
+ llmlingua = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
588
+ llmvm-cli = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
589
+ lm-eval = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
590
+ lmdeploy = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
591
+ lmms-eval = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
592
+ local-attention = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
593
+ lovely-tensors = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
594
+ lpips = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
595
+ lycoris-lora = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
596
+ mace-torch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
597
+ magic-pdf = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
598
+ magicsoup = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
599
+ magvit2-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
600
+ maite = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
601
+ manga-ocr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
602
+ manifest-ml = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
603
+ manipulation = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
604
+ marker-pdf = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
605
+ matgl = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
606
+ med-imagetools = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
607
+ medaka = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
608
+ medcat = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
609
+ medmnist = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
610
+ megablocks = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
611
+ megatron-energon = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
612
+ memos = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
613
+ meshgpt-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
614
+ metatensor-torch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
615
+ mflux = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
616
+ mia-vgg = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
617
+ miditok = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
618
+ minari = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
619
+ minicons = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
620
+ ml2rt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
621
+ mlagents = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
622
+ mlbench-core = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
623
+ mlcroissant = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
624
+ mlpfile = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
625
+ mlx = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
626
+ mlx-whisper = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
627
+ mmaction2 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
628
+ mmengine = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
629
+ mmengine-lite = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
630
+ mmocr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
631
+ mmpose = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
632
+ mmsegmentation = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
633
+ modeci-mdf = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
634
+ model2vec = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
635
+ modelscope = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
636
+ modelspec = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
637
+ monai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
638
+ monai-weekly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
639
+ monotonic-alignment-search = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
640
+ monty = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
641
+ mosaicml = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
642
+ mosaicml-streaming = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
643
+ moshi = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
644
+ mteb = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
645
+ mtmtrain = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
646
+ multi-quantization = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
647
+ myhand = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
648
+ nGPT-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
649
+ naeural-core = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
650
+ napari = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
651
+ napatrackmater = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
652
+ nara-wpe = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
653
+ natten = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
654
+ nbeats-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
655
+ nebulae = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
656
+ nemo-toolkit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
657
+ neptune = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
658
+ neptune-client = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
659
+ nerfacc = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
660
+ nerfstudio = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
661
+ nessai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
662
+ netcal = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
663
+ neural-rag = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
664
+ neuralforecast = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
665
+ neuralnets = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
666
+ neuralprophet = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
667
+ neuspell = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
668
+ nevergrad = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
669
+ nexfort = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
670
+ nimblephysics = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
671
+ nirtorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
672
+ nkululeko = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
673
+ nlp = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
674
+ nlptooltest = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
675
+ nnAudio = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
676
+ nnodely = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
677
+ nnsight = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
678
+ nnunetv2 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
679
+ noisereduce = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
680
+ nonebot-plugin-nailongremove = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
681
+ nowcasting-dataloader = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
682
+ nowcasting-forecast = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
683
+ nshtrainer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
684
+ nuwa-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
685
+ nvflare = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
686
+ nvidia-modelopt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
687
+ ocf-datapipes = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
688
+ ocnn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
689
+ ogb = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
690
+ ohmeow-blurr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
691
+ olive-ai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
692
+ omlt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
693
+ ommlx = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
694
+ onediff = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
695
+ onediffx = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
696
+ onnx2pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
697
+ onnx2torch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
698
+ opacus = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
699
+ open-clip-torch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
700
+ open-flamingo = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
701
+ open-interpreter = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
702
+ openbb-terminal-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
703
+ openmim = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
704
+ openparse = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
705
+ openunmix = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
706
+ openvino-dev = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
707
+ openvino-tokenizers = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
708
+ openvino-xai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
709
+ openwakeword = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
710
+ opt-einsum-fx = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
711
+ optimum = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
712
+ optimum-habana = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
713
+ optimum-intel = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
714
+ optimum-neuron = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
715
+ optimum-quanto = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
716
+ optree = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
717
+ optuna = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
718
+ optuna-dashboard = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
719
+ optuna-integration = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
720
+ oracle-ads = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
721
+ orbit-ml = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
722
+ otx = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
723
+ outetts = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
724
+ outlines = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
725
+ outlines-core = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
726
+ paddlenlp = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
727
+ pai-easycv = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
728
+ pandasai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
729
+ panns-inference = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
730
+ patchwork-cli = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
731
+ peft = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
732
+ pegasuspy = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
733
+ pelutils = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
734
+ penn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
735
+ perforatedai-freemium = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
736
+ performer-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
737
+ petastorm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
738
+ pfio = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
739
+ pgmpy = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
740
+ phenolrs = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
741
+ phobos = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
742
+ pi-zero-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
743
+ pinecone-text = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
744
+ piq = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
745
+ pix2tex = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
746
+ pix2text = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
747
+ pnnx = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
748
+ policyengine-us-data = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
749
+ polyfuzz = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
750
+ pomegranate = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
751
+ positional-encodings = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
752
+ prefigure = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
753
+ product-key-memory = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
754
+ ptflops = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
755
+ ptwt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
756
+ pulser-core = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
757
+ punctuators = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
758
+ py2ls = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
759
+ pyabsa = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
760
+ "pyannote.audio" = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
761
+ pyawd = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
762
+ pyclarity = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
763
+ pycox = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
764
+ pyfemtet = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
765
+ pyg-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
766
+ pygrinder = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
767
+ pyhealth = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
768
+ pyhf = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
769
+ pyiqa = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
770
+ pykeen = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
771
+ pykeops = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
772
+ pylance = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
773
+ pylineaGT = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
774
+ pymanopt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
775
+ pymde = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
776
+ pypots = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
777
+ pyqlib = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
778
+ pyqtorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
779
+ pyro-ppl = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
780
+ pysentimiento = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
781
+ pyserini = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
782
+ pysr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
783
+ pythainlp = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
784
+ python-doctr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
785
+ pytorch-fid = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
786
+ pytorch-forecasting = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
787
+ pytorch-ignite = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
788
+ pytorch-kinematics = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
789
+ pytorch-lightning = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
790
+ pytorch-lightning-bolts = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
791
+ pytorch-metric-learning = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
792
+ pytorch-model-summary = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
793
+ pytorch-msssim = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
794
+ pytorch-pfn-extras = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
795
+ pytorch-pretrained-bert = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
796
+ pytorch-ranger = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
797
+ pytorch-seed = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
798
+ pytorch-tabnet = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
799
+ pytorch-tabular = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
800
+ pytorch-toolbelt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
801
+ pytorch-transformers = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
802
+ pytorch-transformers-pvt-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
803
+ pytorch-triton-rocm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
804
+ pytorch-warmup = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
805
+ pytorch-wavelets = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
806
+ pytorch_optimizer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
807
+ pytorch_revgrad = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
808
+ pytorchcv = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
809
+ pytorchltr2 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
810
+ pyvene = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
811
+ pyvespa = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
812
+ qianfan = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
813
+ qibo = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
814
+ qiskit-machine-learning = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
815
+ qtorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
816
+ quanto = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
817
+ quick-anomaly-detector = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
818
+ rastervision = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
819
+ rastervision-pytorch-backend = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
820
+ rastervision-pytorch-learner = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
821
+ ray-lightning = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
822
+ rclip = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
823
+ realesrgan = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
824
+ recbole = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
825
+ recommenders = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
826
+ redcat = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
827
+ reformer-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
828
+ regex-sampler = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
829
+ replay-rec = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
830
+ rerankers = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
831
+ research-framework = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
832
+ resemble-enhance = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
833
+ resnest = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
834
+ rf-clip = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
835
+ rf-groundingdino = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
836
+ rfconv = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
837
+ rich-logger = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
838
+ ring-attention-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
839
+ rltrade-test = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
840
+ rotary-embedding-torch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
841
+ rsp-ml = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
842
+ rust-circuit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
843
+ s2fft = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
844
+ s3prl = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
845
+ s3torchconnector = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
846
+ saferx = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
847
+ safetensors = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
848
+ sagemaker-huggingface-inference-toolkit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
849
+ sagemaker-ssh-helper = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
850
+ salesforce-lavis = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
851
+ salesforce-merlion = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
852
+ samv2 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
853
+ scib = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
854
+ scib-metrics = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
855
+ scvi-tools = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
856
+ sdmetrics = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
857
+ secretflow = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
858
+ segment-anything-hq = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
859
+ segment-anything-py = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
860
+ segmentation-models-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
861
+ self-rewarding-lm-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
862
+ semantic-kernel = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
863
+ semantic-router = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
864
+ senselab = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
865
+ sent2vec = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
866
+ sentence-transformers = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
867
+ sequence-model-train = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
868
+ serotiny = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
869
+ sevenn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
870
+ sglang = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
871
+ shap = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
872
+ silero-api-server = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
873
+ silero-vad = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
874
+ silicondiff-npu = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
875
+ simclr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
876
+ simple-lama-inpainting = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
877
+ sinabs = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
878
+ sixdrepnet = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
879
+ skforecast = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
880
+ skorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
881
+ skrl = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
882
+ skt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
883
+ sktime = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
884
+ sktmls = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
885
+ slangtorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
886
+ smartnoise-synth = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
887
+ smashed = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
888
+ smplx = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
889
+ smqtk-descriptors = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
890
+ smqtk-detection = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
891
+ snntorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
892
+ snorkel = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
893
+ snowflake-ml-python = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
894
+ so-vits-svc-fork = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
895
+ sonusai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
896
+ sony-custom-layers = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
897
+ sotopia = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
898
+ spacr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
899
+ spacy-curated-transformers = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
900
+ spacy-experimental = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
901
+ spacy-huggingface-pipelines = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
902
+ spacy-llm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
903
+ spacy-transformers = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
904
+ span-marker = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
905
+ spandrel = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
906
+ spandrel-extra-arches = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
907
+ sparrow-python = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
908
+ spatialdata = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
909
+ speechbrain = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
910
+ speechtokenizer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
911
+ spikeinterface = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
912
+ spikingjelly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
913
+ spotiflow = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
914
+ spotpython = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
915
+ spotriver = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
916
+ squirrel-core = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
917
+ stable-baselines3 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
918
+ stable-diffusion-sdkit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
919
+ stable-ts = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
920
+ stanford-stk = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
921
+ stanfordnlp = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
922
+ stanza = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
923
+ startorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
924
+ streamtasks = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
925
+ struct-eqtable = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
926
+ stylegan2-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
927
+ supar = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
928
+ super-gradients = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
929
+ super-image = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
930
+ superlinked = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
931
+ supervisely = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
932
+ surya-ocr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
933
+ svdiff-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
934
+ swarm-models = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
935
+ swarmauri = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
936
+ swarms-memory = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
937
+ swebench = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
938
+ syft = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
939
+ sympytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
940
+ syne-tune = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
941
+ synthcity = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
942
+ t5 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
943
+ tab-transformer-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
944
+ tabpfn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
945
+ taming-transformers = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
946
+ taming-transformers-rom1504 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
947
+ taskwiz = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
948
+ tbparse = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
949
+ tecton = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
950
+ tensor-parallel = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
951
+ tensorcircuit-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
952
+ tensordict = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
953
+ tensordict-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
954
+ tensorizer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
955
+ tensorrt-llm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
956
+ texify = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
957
+ text2text = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
958
+ textattack = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
959
+ tfkit = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
960
+ thepipe-api = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
961
+ thinc = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
962
+ thingsvision = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
963
+ thirdai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
964
+ thop = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
965
+ tianshou = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
966
+ tidy3d = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
967
+ timesfm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
968
+ timm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
969
+ tipo-kgen = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
970
+ tmnt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
971
+ toad = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
972
+ tomesd = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
973
+ top2vec = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
974
+ torch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
975
+ torch-audiomentations = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
976
+ torch-dct = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
977
+ torch-delaunay = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
978
+ torch-directml = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
979
+ torch-ema = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
980
+ torch-encoding = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
981
+ torch-fidelity = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
982
+ torch-geometric = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
983
+ torch-geopooling = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
984
+ torch-harmonics = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
985
+ torch-kmeans = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
986
+ torch-lr-finder = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
987
+ torch-max-mem = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
988
+ torch-npu = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
989
+ torch-optimi = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
990
+ torch-optimizer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
991
+ torch-ort = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
992
+ torch-pitch-shift = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
993
+ torch-ppr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
994
+ torch-pruning = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
995
+ torch-snippets = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
996
+ torch-stoi = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
997
+ torch-struct = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
998
+ torch-tensorrt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
999
+ torchani = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1000
+ torchattacks = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1001
+ torchaudio = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1002
+ torchbiggraph = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1003
+ torchcam = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1004
+ torchcde = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1005
+ torchcfm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1006
+ torchcrepe = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1007
+ torchdata = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1008
+ torchdatasets-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1009
+ torchdiffeq = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1010
+ torchdyn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1011
+ torchestra = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1012
+ torcheval = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1013
+ torcheval-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1014
+ torchextractor = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1015
+ torchfcpe = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1016
+ torchfun = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1017
+ torchfunc-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1018
+ torchgeo = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1019
+ torchgeometry = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1020
+ torchio = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1021
+ torchjpeg = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1022
+ torchlayers-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1023
+ torchmeta = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1024
+ torchmetrics = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1025
+ torchmocks = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1026
+ torchpack = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1027
+ torchpippy = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1028
+ torchpq = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1029
+ torchprofile = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1030
+ torchquantlib = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1031
+ torchrec = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1032
+ torchrec-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1033
+ torchrec-nightly-cpu = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1034
+ torchrl = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1035
+ torchrl-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1036
+ torchscale = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1037
+ torchsde = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1038
+ torchseg = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1039
+ torchserve = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1040
+ torchserve-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1041
+ torchsnapshot-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1042
+ torchsr = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1043
+ torchstain = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1044
+ torchsummaryX = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1045
+ torchtext = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1046
+ torchtnt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1047
+ torchtnt-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1048
+ torchtyping = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1049
+ torchutil = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1050
+ torchvinecopulib = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1051
+ torchvision = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1052
+ torchviz = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1053
+ torchx = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1054
+ torchx-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1055
+ torchxrayvision = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1056
+ totalspineseg = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1057
+ tracebloc-package-dev = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1058
+ trainer = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1059
+ transformer-engine = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1060
+ transformer-lens = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1061
+ transformer-smaller-training-vocab = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1062
+ transformers = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1063
+ transformers-domain-adaptation = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1064
+ transfusion-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1065
+ transparent-background = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1066
+ treescope = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1067
+ trolo = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1068
+ tsai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1069
+ tslearn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1070
+ ttspod = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1071
+ txtai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1072
+ tyro = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1073
+ u8darts = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1074
+ uhg = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1075
+ uitestrunner-syberos = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1076
+ ultimate-rvc = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1077
+ ultralytics = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1078
+ ultralytics-thop = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1079
+ unav = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1080
+ unbabel-comet = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1081
+ underthesea = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1082
+ unfoldNd = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1083
+ unimernet = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1084
+ unitorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1085
+ unitxt = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1086
+ unsloth = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1087
+ unsloth-zoo = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1088
+ unstructured = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1089
+ unstructured-inference = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1090
+ utilsd = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1091
+ v-diffusion-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1092
+ vIQA = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1093
+ vectice = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1094
+ vector-quantize-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1095
+ vectorhub-nightly = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1096
+ versatile-audio-upscaler = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1097
+ vertexai = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1098
+ vesin = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1099
+ vgg-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1100
+ video-representations-extractor = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1101
+ viser = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1102
+ vision-datasets = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1103
+ visionmetrics = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1104
+ visu3d = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1105
+ vit-pytorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1106
+ viturka-nn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1107
+ vllm = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1108
+ vllm-flash-attn = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1109
+ vocos = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1110
+ vollseg = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1111
+ vtorch = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1112
+ wavmark = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1113
+ wdoc = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1114
+ whisper-live = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1115
+ whisper-timestamped = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1116
+ whisperx = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1117
+ wilds = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1118
+ wordllama = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1119
+ worker-automate-hub = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1120
+ wxbtool = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1121
+ x-clip = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1122
+ x-transformers = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1123
+ xaitk_saliency = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1124
+ xformers = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1125
+ xgrammar = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1126
+ xinference = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1127
+ xtts-api-server = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1128
+ yolo-poser = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1129
+ yolov5 = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1130
+ yolov7-package = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1131
+ yta-general-utils = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1132
+ zensvi = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1133
+ zetascale = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
1134
+ zuko = [{ index = "pytorch-cpu", marker = "platform_system == 'Linux'" }]
replit.md ADDED
@@ -0,0 +1,151 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ # Legal RAG Analysis API
2
+
3
+ ## Overview
4
+
5
+ This is a FastAPI-based legal case analysis system that combines multiple AI technologies to provide comprehensive legal verdict predictions. The system uses a Retrieval-Augmented Generation (RAG) approach with LegalBERT for initial predictions and Gemini AI for final evaluation and explanation.
6
+
7
+ ## User Preferences
8
+
9
+ Preferred communication style: Simple, everyday language.
10
+ Coding style: camelCase for function names, clear variable names, efficiency over boilerplate, no comments unless asked.
11
+ Prefers to add model files and dependencies later after basic structure is ready.
12
+
13
+ ## System Architecture
14
+
15
+ The application follows a microservices-inspired architecture with clear separation of concerns:
16
+
17
+ ### Backend Framework
18
+ - **FastAPI** - Chosen for its high performance, automatic API documentation, and excellent type hinting support
19
+ - **Python 3.x** - Primary language for ML/AI integration and legal domain processing
20
+ - **Uvicorn** - ASGI server for production-ready deployment
21
+
22
+ ### AI/ML Pipeline Architecture
23
+ The system implements a three-stage analysis pipeline:
24
+ 1. **Initial Prediction** - LegalBERT model for binary classification (guilty/not guilty)
25
+ 2. **Knowledge Retrieval** - RAG system using FAISS for retrieving relevant legal documents
26
+ 3. **Final Evaluation** - Gemini AI for contextual analysis and explanation generation
27
+
28
+ ## Key Components
29
+
30
+ ### 1. LegalBERT Service (`app/services/legal_bert.py`)
31
+ - **Purpose**: Provides initial verdict predictions using a fine-tuned BERT model for legal texts
32
+ - **Technology**: Transformers library with PyTorch backend
33
+ - **Input**: Raw case text
34
+ - **Output**: Binary verdict (guilty/not guilty) with confidence scores
35
+
36
+ ### 2. RAG Service (`app/services/rag_service.py`)
37
+ - **Purpose**: Retrieves relevant legal documents to support case analysis
38
+ - **Technology**:
39
+ - FAISS for vector similarity search
40
+ - Sentence-BERT (BGE-Large) for text embeddings
41
+ - Multiple legal document indexes (Constitution, IPC, case law, statutes)
42
+ - **Features**: Parallel index querying, chunk deduplication, relevance filtering
43
+
44
+ ### 3. Gemini Service (`app/services/gemini_service.py`)
45
+ - **Purpose**: Generates search queries and provides final legal analysis
46
+ - **Technology**: Google Gemini AI API
47
+ - **Functions**:
48
+ - Query generation from case facts
49
+ - Final verdict evaluation with legal reasoning
50
+ - Explanation generation in natural language
51
+
52
+ ### 4. API Layer (`app/api/routes.py`)
53
+ - **Endpoints**:
54
+ - `POST /api/v1/analyze-case` - Main analysis endpoint
55
+ - `GET /api/v1/health` - Service health monitoring
56
+ - **Features**: Error handling, logging, service orchestration
57
+
58
+ ## Data Flow
59
+
60
+ 1. **Request Processing**: Case text received via POST request
61
+ 2. **Initial Analysis**: LegalBERT processes text and returns preliminary verdict
62
+ 3. **Query Generation**: Gemini generates optimized search query from case facts
63
+ 4. **Knowledge Retrieval**: RAG system searches multiple legal document indexes
64
+ 5. **Final Evaluation**: Gemini analyzes initial verdict against retrieved legal context
65
+ 6. **Response Assembly**: Combined results with explanations returned to client
66
+
67
+ ## External Dependencies
68
+
69
+ ### AI/ML Models
70
+ - **LegalBERT Model**: Custom fine-tuned model for legal verdict prediction
71
+ - **Sentence Transformer**: BAAI/bge-large-en-v1.5 for text embeddings
72
+ - **Gemini AI**: Google's generative AI for natural language processing
73
+
74
+ ### Vector Databases
75
+ - **FAISS Indexes**: Multiple pre-built indexes for different legal document types:
76
+ - Constitution documents
77
+ - Indian Penal Code (IPC)
78
+ - Case law precedents
79
+ - Statutes and regulations
80
+ - Q&A legal content
81
+
82
+ ### Python Libraries
83
+ - FastAPI, Uvicorn (web framework)
84
+ - Transformers, PyTorch (ML models)
85
+ - Sentence-Transformers (embeddings)
86
+ - FAISS (vector search)
87
+ - Google GenAI (external API)
88
+
89
+ ## Deployment Strategy
90
+
91
+ ### Development Setup
92
+ - Local development with hot reload enabled
93
+ - Model files and indexes loaded from configurable paths
94
+ - Environment-based configuration management
95
+
96
+ ### Configuration Management
97
+ - Centralized settings in `app/core/config.py`
98
+ - Environment variable support for sensitive data (API keys)
99
+ - Flexible path configuration for model and index files
100
+
101
+ ### Health Monitoring
102
+ - Service-level health checks for all components
103
+ - Graceful degradation when external services are unavailable
104
+ - Comprehensive logging for debugging and monitoring
105
+
106
+ ### CORS Configuration
107
+ - Permissive CORS setup for development
108
+ - Can be restricted for production deployment
109
+
110
+ ## Current Development Status (January 2025)
111
+
112
+ ### ✅ Completed
113
+ - FastAPI backend structure with proper routing and middleware
114
+ - Placeholder implementations for all services (LegalBERT, RAG, Gemini)
115
+ - Full Gemini AI integration for query generation and case evaluation
116
+ - Health monitoring endpoints for all components
117
+ - CORS configuration for frontend integration
118
+ - API documentation with comprehensive endpoints
119
+ - Camel case naming conventions as per user preference
120
+
121
+ ### 🔄 Ready for Model Integration
122
+ - LegalBERT service structure ready for torch/transformers integration
123
+ - RAG service prepared for FAISS indexes and sentence-transformers
124
+ - Configuration paths set for model files and indexes
125
+ - Graceful degradation when model files are missing
126
+
127
+ ### 📁 Directory Structure for Model Files
128
+ - `./models/legalbert_model/` - LegalBERT model files (to be added)
129
+ - `./faiss_indexes/` - FAISS vector indexes and chunks (to be added)
130
+
131
+ ### 🔗 API Endpoints Working
132
+ - `POST /api/v1/analyze-case` - Full case analysis with Gemini evaluation
133
+ - `GET /api/v1/health` - Service health monitoring
134
+ - `GET /api/v1/models/status` - Model loading status
135
+ - `GET /` - Basic API info
136
+
137
+ ## Next Steps for Full Functionality
138
+
139
+ 1. Add LegalBERT model files to `./models/legalbert_model/`
140
+ 2. Install ML dependencies: `torch`, `transformers`, `sentence-transformers`, `faiss-cpu`
141
+ 3. Add FAISS indexes and chunk files to `./faiss_indexes/`
142
+ 4. All placeholder implementations will automatically switch to real ML models
143
+
144
+ ## Notes
145
+
146
+ - The system gracefully handles missing model files during development
147
+ - All services include health check mechanisms for monitoring
148
+ - The RAG system supports parallel querying of multiple legal document indexes
149
+ - Query generation is optimized for Indian criminal law terminology
150
+ - The architecture supports easy addition of new legal document indexes
151
+ - API follows camelCase conventions and clean code principles as requested
uv.lock ADDED
@@ -0,0 +1,493 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ version = 1
2
+ requires-python = ">=3.11"
3
+
4
+ [[package]]
5
+ name = "annotated-types"
6
+ version = "0.7.0"
7
+ source = { registry = "https://pypi.org/simple" }
8
+ sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081 }
9
+ wheels = [
10
+ { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643 },
11
+ ]
12
+
13
+ [[package]]
14
+ name = "anyio"
15
+ version = "4.9.0"
16
+ source = { registry = "https://pypi.org/simple" }
17
+ dependencies = [
18
+ { name = "idna" },
19
+ { name = "sniffio" },
20
+ { name = "typing-extensions", marker = "python_full_version < '3.13'" },
21
+ ]
22
+ sdist = { url = "https://files.pythonhosted.org/packages/95/7d/4c1bd541d4dffa1b52bd83fb8527089e097a106fc90b467a7313b105f840/anyio-4.9.0.tar.gz", hash = "sha256:673c0c244e15788651a4ff38710fea9675823028a6f08a5eda409e0c9840a028", size = 190949 }
23
+ wheels = [
24
+ { url = "https://files.pythonhosted.org/packages/a1/ee/48ca1a7c89ffec8b6a0c5d02b89c305671d5ffd8d3c94acf8b8c408575bb/anyio-4.9.0-py3-none-any.whl", hash = "sha256:9f76d541cad6e36af7beb62e978876f3b41e3e04f2c1fbf0884604c0a9c4d93c", size = 100916 },
25
+ ]
26
+
27
+ [[package]]
28
+ name = "cachetools"
29
+ version = "5.5.2"
30
+ source = { registry = "https://pypi.org/simple" }
31
+ sdist = { url = "https://files.pythonhosted.org/packages/6c/81/3747dad6b14fa2cf53fcf10548cf5aea6913e96fab41a3c198676f8948a5/cachetools-5.5.2.tar.gz", hash = "sha256:1a661caa9175d26759571b2e19580f9d6393969e5dfca11fdb1f947a23e640d4", size = 28380 }
32
+ wheels = [
33
+ { url = "https://files.pythonhosted.org/packages/72/76/20fa66124dbe6be5cafeb312ece67de6b61dd91a0247d1ea13db4ebb33c2/cachetools-5.5.2-py3-none-any.whl", hash = "sha256:d26a22bcc62eb95c3beabd9f1ee5e820d3d2704fe2967cbe350e20c8ffcd3f0a", size = 10080 },
34
+ ]
35
+
36
+ [[package]]
37
+ name = "certifi"
38
+ version = "2025.7.14"
39
+ source = { registry = "https://pypi.org/simple" }
40
+ sdist = { url = "https://files.pythonhosted.org/packages/b3/76/52c535bcebe74590f296d6c77c86dabf761c41980e1347a2422e4aa2ae41/certifi-2025.7.14.tar.gz", hash = "sha256:8ea99dbdfaaf2ba2f9bac77b9249ef62ec5218e7c2b2e903378ed5fccf765995", size = 163981 }
41
+ wheels = [
42
+ { url = "https://files.pythonhosted.org/packages/4f/52/34c6cf5bb9285074dc3531c437b3919e825d976fde097a7a73f79e726d03/certifi-2025.7.14-py3-none-any.whl", hash = "sha256:6b31f564a415d79ee77df69d757bb49a5bb53bd9f756cbbe24394ffd6fc1f4b2", size = 162722 },
43
+ ]
44
+
45
+ [[package]]
46
+ name = "charset-normalizer"
47
+ version = "3.4.2"
48
+ source = { registry = "https://pypi.org/simple" }
49
+ sdist = { url = "https://files.pythonhosted.org/packages/e4/33/89c2ced2b67d1c2a61c19c6751aa8902d46ce3dacb23600a283619f5a12d/charset_normalizer-3.4.2.tar.gz", hash = "sha256:5baececa9ecba31eff645232d59845c07aa030f0c81ee70184a90d35099a0e63", size = 126367 }
50
+ wheels = [
51
+ { url = "https://files.pythonhosted.org/packages/05/85/4c40d00dcc6284a1c1ad5de5e0996b06f39d8232f1031cd23c2f5c07ee86/charset_normalizer-3.4.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:be1e352acbe3c78727a16a455126d9ff83ea2dfdcbc83148d2982305a04714c2", size = 198794 },
52
+ { url = "https://files.pythonhosted.org/packages/41/d9/7a6c0b9db952598e97e93cbdfcb91bacd89b9b88c7c983250a77c008703c/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aa88ca0b1932e93f2d961bf3addbb2db902198dca337d88c89e1559e066e7645", size = 142846 },
53
+ { url = "https://files.pythonhosted.org/packages/66/82/a37989cda2ace7e37f36c1a8ed16c58cf48965a79c2142713244bf945c89/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d524ba3f1581b35c03cb42beebab4a13e6cdad7b36246bd22541fa585a56cccd", size = 153350 },
54
+ { url = "https://files.pythonhosted.org/packages/df/68/a576b31b694d07b53807269d05ec3f6f1093e9545e8607121995ba7a8313/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28a1005facc94196e1fb3e82a3d442a9d9110b8434fc1ded7a24a2983c9888d8", size = 145657 },
55
+ { url = "https://files.pythonhosted.org/packages/92/9b/ad67f03d74554bed3aefd56fe836e1623a50780f7c998d00ca128924a499/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fdb20a30fe1175ecabed17cbf7812f7b804b8a315a25f24678bcdf120a90077f", size = 147260 },
56
+ { url = "https://files.pythonhosted.org/packages/a6/e6/8aebae25e328160b20e31a7e9929b1578bbdc7f42e66f46595a432f8539e/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0f5d9ed7f254402c9e7d35d2f5972c9bbea9040e99cd2861bd77dc68263277c7", size = 149164 },
57
+ { url = "https://files.pythonhosted.org/packages/8b/f2/b3c2f07dbcc248805f10e67a0262c93308cfa149a4cd3d1fe01f593e5fd2/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:efd387a49825780ff861998cd959767800d54f8308936b21025326de4b5a42b9", size = 144571 },
58
+ { url = "https://files.pythonhosted.org/packages/60/5b/c3f3a94bc345bc211622ea59b4bed9ae63c00920e2e8f11824aa5708e8b7/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:f0aa37f3c979cf2546b73e8222bbfa3dc07a641585340179d768068e3455e544", size = 151952 },
59
+ { url = "https://files.pythonhosted.org/packages/e2/4d/ff460c8b474122334c2fa394a3f99a04cf11c646da895f81402ae54f5c42/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e70e990b2137b29dc5564715de1e12701815dacc1d056308e2b17e9095372a82", size = 155959 },
60
+ { url = "https://files.pythonhosted.org/packages/a2/2b/b964c6a2fda88611a1fe3d4c400d39c66a42d6c169c924818c848f922415/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:0c8c57f84ccfc871a48a47321cfa49ae1df56cd1d965a09abe84066f6853b9c0", size = 153030 },
61
+ { url = "https://files.pythonhosted.org/packages/59/2e/d3b9811db26a5ebf444bc0fa4f4be5aa6d76fc6e1c0fd537b16c14e849b6/charset_normalizer-3.4.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:6b66f92b17849b85cad91259efc341dce9c1af48e2173bf38a85c6329f1033e5", size = 148015 },
62
+ { url = "https://files.pythonhosted.org/packages/90/07/c5fd7c11eafd561bb51220d600a788f1c8d77c5eef37ee49454cc5c35575/charset_normalizer-3.4.2-cp311-cp311-win32.whl", hash = "sha256:daac4765328a919a805fa5e2720f3e94767abd632ae410a9062dff5412bae65a", size = 98106 },
63
+ { url = "https://files.pythonhosted.org/packages/a8/05/5e33dbef7e2f773d672b6d79f10ec633d4a71cd96db6673625838a4fd532/charset_normalizer-3.4.2-cp311-cp311-win_amd64.whl", hash = "sha256:e53efc7c7cee4c1e70661e2e112ca46a575f90ed9ae3fef200f2a25e954f4b28", size = 105402 },
64
+ { url = "https://files.pythonhosted.org/packages/d7/a4/37f4d6035c89cac7930395a35cc0f1b872e652eaafb76a6075943754f095/charset_normalizer-3.4.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0c29de6a1a95f24b9a1aa7aefd27d2487263f00dfd55a77719b530788f75cff7", size = 199936 },
65
+ { url = "https://files.pythonhosted.org/packages/ee/8a/1a5e33b73e0d9287274f899d967907cd0bf9c343e651755d9307e0dbf2b3/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cddf7bd982eaa998934a91f69d182aec997c6c468898efe6679af88283b498d3", size = 143790 },
66
+ { url = "https://files.pythonhosted.org/packages/66/52/59521f1d8e6ab1482164fa21409c5ef44da3e9f653c13ba71becdd98dec3/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fcbe676a55d7445b22c10967bceaaf0ee69407fbe0ece4d032b6eb8d4565982a", size = 153924 },
67
+ { url = "https://files.pythonhosted.org/packages/86/2d/fb55fdf41964ec782febbf33cb64be480a6b8f16ded2dbe8db27a405c09f/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d41c4d287cfc69060fa91cae9683eacffad989f1a10811995fa309df656ec214", size = 146626 },
68
+ { url = "https://files.pythonhosted.org/packages/8c/73/6ede2ec59bce19b3edf4209d70004253ec5f4e319f9a2e3f2f15601ed5f7/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4e594135de17ab3866138f496755f302b72157d115086d100c3f19370839dd3a", size = 148567 },
69
+ { url = "https://files.pythonhosted.org/packages/09/14/957d03c6dc343c04904530b6bef4e5efae5ec7d7990a7cbb868e4595ee30/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cf713fe9a71ef6fd5adf7a79670135081cd4431c2943864757f0fa3a65b1fafd", size = 150957 },
70
+ { url = "https://files.pythonhosted.org/packages/0d/c8/8174d0e5c10ccebdcb1b53cc959591c4c722a3ad92461a273e86b9f5a302/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a370b3e078e418187da8c3674eddb9d983ec09445c99a3a263c2011993522981", size = 145408 },
71
+ { url = "https://files.pythonhosted.org/packages/58/aa/8904b84bc8084ac19dc52feb4f5952c6df03ffb460a887b42615ee1382e8/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a955b438e62efdf7e0b7b52a64dc5c3396e2634baa62471768a64bc2adb73d5c", size = 153399 },
72
+ { url = "https://files.pythonhosted.org/packages/c2/26/89ee1f0e264d201cb65cf054aca6038c03b1a0c6b4ae998070392a3ce605/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:7222ffd5e4de8e57e03ce2cef95a4c43c98fcb72ad86909abdfc2c17d227fc1b", size = 156815 },
73
+ { url = "https://files.pythonhosted.org/packages/fd/07/68e95b4b345bad3dbbd3a8681737b4338ff2c9df29856a6d6d23ac4c73cb/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:bee093bf902e1d8fc0ac143c88902c3dfc8941f7ea1d6a8dd2bcb786d33db03d", size = 154537 },
74
+ { url = "https://files.pythonhosted.org/packages/77/1a/5eefc0ce04affb98af07bc05f3bac9094513c0e23b0562d64af46a06aae4/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:dedb8adb91d11846ee08bec4c8236c8549ac721c245678282dcb06b221aab59f", size = 149565 },
75
+ { url = "https://files.pythonhosted.org/packages/37/a0/2410e5e6032a174c95e0806b1a6585eb21e12f445ebe239fac441995226a/charset_normalizer-3.4.2-cp312-cp312-win32.whl", hash = "sha256:db4c7bf0e07fc3b7d89ac2a5880a6a8062056801b83ff56d8464b70f65482b6c", size = 98357 },
76
+ { url = "https://files.pythonhosted.org/packages/6c/4f/c02d5c493967af3eda9c771ad4d2bbc8df6f99ddbeb37ceea6e8716a32bc/charset_normalizer-3.4.2-cp312-cp312-win_amd64.whl", hash = "sha256:5a9979887252a82fefd3d3ed2a8e3b937a7a809f65dcb1e068b090e165bbe99e", size = 105776 },
77
+ { url = "https://files.pythonhosted.org/packages/ea/12/a93df3366ed32db1d907d7593a94f1fe6293903e3e92967bebd6950ed12c/charset_normalizer-3.4.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:926ca93accd5d36ccdabd803392ddc3e03e6d4cd1cf17deff3b989ab8e9dbcf0", size = 199622 },
78
+ { url = "https://files.pythonhosted.org/packages/04/93/bf204e6f344c39d9937d3c13c8cd5bbfc266472e51fc8c07cb7f64fcd2de/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eba9904b0f38a143592d9fc0e19e2df0fa2e41c3c3745554761c5f6447eedabf", size = 143435 },
79
+ { url = "https://files.pythonhosted.org/packages/22/2a/ea8a2095b0bafa6c5b5a55ffdc2f924455233ee7b91c69b7edfcc9e02284/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3fddb7e2c84ac87ac3a947cb4e66d143ca5863ef48e4a5ecb83bd48619e4634e", size = 153653 },
80
+ { url = "https://files.pythonhosted.org/packages/b6/57/1b090ff183d13cef485dfbe272e2fe57622a76694061353c59da52c9a659/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98f862da73774290f251b9df8d11161b6cf25b599a66baf087c1ffe340e9bfd1", size = 146231 },
81
+ { url = "https://files.pythonhosted.org/packages/e2/28/ffc026b26f441fc67bd21ab7f03b313ab3fe46714a14b516f931abe1a2d8/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c9379d65defcab82d07b2a9dfbfc2e95bc8fe0ebb1b176a3190230a3ef0e07c", size = 148243 },
82
+ { url = "https://files.pythonhosted.org/packages/c0/0f/9abe9bd191629c33e69e47c6ef45ef99773320e9ad8e9cb08b8ab4a8d4cb/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e635b87f01ebc977342e2697d05b56632f5f879a4f15955dfe8cef2448b51691", size = 150442 },
83
+ { url = "https://files.pythonhosted.org/packages/67/7c/a123bbcedca91d5916c056407f89a7f5e8fdfce12ba825d7d6b9954a1a3c/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:1c95a1e2902a8b722868587c0e1184ad5c55631de5afc0eb96bc4b0d738092c0", size = 145147 },
84
+ { url = "https://files.pythonhosted.org/packages/ec/fe/1ac556fa4899d967b83e9893788e86b6af4d83e4726511eaaad035e36595/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ef8de666d6179b009dce7bcb2ad4c4a779f113f12caf8dc77f0162c29d20490b", size = 153057 },
85
+ { url = "https://files.pythonhosted.org/packages/2b/ff/acfc0b0a70b19e3e54febdd5301a98b72fa07635e56f24f60502e954c461/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:32fc0341d72e0f73f80acb0a2c94216bd704f4f0bce10aedea38f30502b271ff", size = 156454 },
86
+ { url = "https://files.pythonhosted.org/packages/92/08/95b458ce9c740d0645feb0e96cea1f5ec946ea9c580a94adfe0b617f3573/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:289200a18fa698949d2b39c671c2cc7a24d44096784e76614899a7ccf2574b7b", size = 154174 },
87
+ { url = "https://files.pythonhosted.org/packages/78/be/8392efc43487ac051eee6c36d5fbd63032d78f7728cb37aebcc98191f1ff/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4a476b06fbcf359ad25d34a057b7219281286ae2477cc5ff5e3f70a246971148", size = 149166 },
88
+ { url = "https://files.pythonhosted.org/packages/44/96/392abd49b094d30b91d9fbda6a69519e95802250b777841cf3bda8fe136c/charset_normalizer-3.4.2-cp313-cp313-win32.whl", hash = "sha256:aaeeb6a479c7667fbe1099af9617c83aaca22182d6cf8c53966491a0f1b7ffb7", size = 98064 },
89
+ { url = "https://files.pythonhosted.org/packages/e9/b0/0200da600134e001d91851ddc797809e2fe0ea72de90e09bec5a2fbdaccb/charset_normalizer-3.4.2-cp313-cp313-win_amd64.whl", hash = "sha256:aa6af9e7d59f9c12b33ae4e9450619cf2488e2bbe9b44030905877f0b2324980", size = 105641 },
90
+ { url = "https://files.pythonhosted.org/packages/20/94/c5790835a017658cbfabd07f3bfb549140c3ac458cfc196323996b10095a/charset_normalizer-3.4.2-py3-none-any.whl", hash = "sha256:7f56930ab0abd1c45cd15be65cc741c28b1c9a34876ce8c17a2fa107810c0af0", size = 52626 },
91
+ ]
92
+
93
+ [[package]]
94
+ name = "click"
95
+ version = "8.2.1"
96
+ source = { registry = "https://pypi.org/simple" }
97
+ dependencies = [
98
+ { name = "colorama", marker = "sys_platform == 'win32'" },
99
+ ]
100
+ sdist = { url = "https://files.pythonhosted.org/packages/60/6c/8ca2efa64cf75a977a0d7fac081354553ebe483345c734fb6b6515d96bbc/click-8.2.1.tar.gz", hash = "sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202", size = 286342 }
101
+ wheels = [
102
+ { url = "https://files.pythonhosted.org/packages/85/32/10bb5764d90a8eee674e9dc6f4db6a0ab47c8c4d0d83c27f7c39ac415a4d/click-8.2.1-py3-none-any.whl", hash = "sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b", size = 102215 },
103
+ ]
104
+
105
+ [[package]]
106
+ name = "colorama"
107
+ version = "0.4.6"
108
+ source = { registry = "https://pypi.org/simple" }
109
+ sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697 }
110
+ wheels = [
111
+ { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335 },
112
+ ]
113
+
114
+ [[package]]
115
+ name = "fastapi"
116
+ version = "0.116.1"
117
+ source = { registry = "https://pypi.org/simple" }
118
+ dependencies = [
119
+ { name = "pydantic" },
120
+ { name = "starlette" },
121
+ { name = "typing-extensions" },
122
+ ]
123
+ sdist = { url = "https://files.pythonhosted.org/packages/78/d7/6c8b3bfe33eeffa208183ec037fee0cce9f7f024089ab1c5d12ef04bd27c/fastapi-0.116.1.tar.gz", hash = "sha256:ed52cbf946abfd70c5a0dccb24673f0670deeb517a88b3544d03c2a6bf283143", size = 296485 }
124
+ wheels = [
125
+ { url = "https://files.pythonhosted.org/packages/e5/47/d63c60f59a59467fda0f93f46335c9d18526d7071f025cb5b89d5353ea42/fastapi-0.116.1-py3-none-any.whl", hash = "sha256:c46ac7c312df840f0c9e220f7964bada936781bc4e2e6eb71f1c4d7553786565", size = 95631 },
126
+ ]
127
+
128
+ [[package]]
129
+ name = "google-auth"
130
+ version = "2.40.3"
131
+ source = { registry = "https://pypi.org/simple" }
132
+ dependencies = [
133
+ { name = "cachetools" },
134
+ { name = "pyasn1-modules" },
135
+ { name = "rsa" },
136
+ ]
137
+ sdist = { url = "https://files.pythonhosted.org/packages/9e/9b/e92ef23b84fa10a64ce4831390b7a4c2e53c0132568d99d4ae61d04c8855/google_auth-2.40.3.tar.gz", hash = "sha256:500c3a29adedeb36ea9cf24b8d10858e152f2412e3ca37829b3fa18e33d63b77", size = 281029 }
138
+ wheels = [
139
+ { url = "https://files.pythonhosted.org/packages/17/63/b19553b658a1692443c62bd07e5868adaa0ad746a0751ba62c59568cd45b/google_auth-2.40.3-py2.py3-none-any.whl", hash = "sha256:1370d4593e86213563547f97a92752fc658456fe4514c809544f330fed45a7ca", size = 216137 },
140
+ ]
141
+
142
+ [[package]]
143
+ name = "google-genai"
144
+ version = "1.27.0"
145
+ source = { registry = "https://pypi.org/simple" }
146
+ dependencies = [
147
+ { name = "anyio" },
148
+ { name = "google-auth" },
149
+ { name = "httpx" },
150
+ { name = "pydantic" },
151
+ { name = "requests" },
152
+ { name = "tenacity" },
153
+ { name = "typing-extensions" },
154
+ { name = "websockets" },
155
+ ]
156
+ sdist = { url = "https://files.pythonhosted.org/packages/9a/37/6c0ececc3a7a629029b5beed2ceb9f28f73292236eb96272355636769b0d/google_genai-1.27.0.tar.gz", hash = "sha256:15a13ffe7b3938da50b9ab77204664d82122617256f55b5ce403d593848ef635", size = 220099 }
157
+ wheels = [
158
+ { url = "https://files.pythonhosted.org/packages/5a/12/279afe7357af73f9737a3412b6f0bc1482075b896340eb46a2f9cb0fd791/google_genai-1.27.0-py3-none-any.whl", hash = "sha256:afd6b4efaf8ec1d20a6e6657d768b68d998d60007c6e220e9024e23c913c1833", size = 218489 },
159
+ ]
160
+
161
+ [[package]]
162
+ name = "h11"
163
+ version = "0.16.0"
164
+ source = { registry = "https://pypi.org/simple" }
165
+ sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250 }
166
+ wheels = [
167
+ { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515 },
168
+ ]
169
+
170
+ [[package]]
171
+ name = "httpcore"
172
+ version = "1.0.9"
173
+ source = { registry = "https://pypi.org/simple" }
174
+ dependencies = [
175
+ { name = "certifi" },
176
+ { name = "h11" },
177
+ ]
178
+ sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484 }
179
+ wheels = [
180
+ { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784 },
181
+ ]
182
+
183
+ [[package]]
184
+ name = "httpx"
185
+ version = "0.28.1"
186
+ source = { registry = "https://pypi.org/simple" }
187
+ dependencies = [
188
+ { name = "anyio" },
189
+ { name = "certifi" },
190
+ { name = "httpcore" },
191
+ { name = "idna" },
192
+ ]
193
+ sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406 }
194
+ wheels = [
195
+ { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517 },
196
+ ]
197
+
198
+ [[package]]
199
+ name = "idna"
200
+ version = "3.10"
201
+ source = { registry = "https://pypi.org/simple" }
202
+ sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490 }
203
+ wheels = [
204
+ { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442 },
205
+ ]
206
+
207
+ [[package]]
208
+ name = "pyasn1"
209
+ version = "0.6.1"
210
+ source = { registry = "https://pypi.org/simple" }
211
+ sdist = { url = "https://files.pythonhosted.org/packages/ba/e9/01f1a64245b89f039897cb0130016d79f77d52669aae6ee7b159a6c4c018/pyasn1-0.6.1.tar.gz", hash = "sha256:6f580d2bdd84365380830acf45550f2511469f673cb4a5ae3857a3170128b034", size = 145322 }
212
+ wheels = [
213
+ { url = "https://files.pythonhosted.org/packages/c8/f1/d6a797abb14f6283c0ddff96bbdd46937f64122b8c925cab503dd37f8214/pyasn1-0.6.1-py3-none-any.whl", hash = "sha256:0d632f46f2ba09143da3a8afe9e33fb6f92fa2320ab7e886e2d0f7672af84629", size = 83135 },
214
+ ]
215
+
216
+ [[package]]
217
+ name = "pyasn1-modules"
218
+ version = "0.4.2"
219
+ source = { registry = "https://pypi.org/simple" }
220
+ dependencies = [
221
+ { name = "pyasn1" },
222
+ ]
223
+ sdist = { url = "https://files.pythonhosted.org/packages/e9/e6/78ebbb10a8c8e4b61a59249394a4a594c1a7af95593dc933a349c8d00964/pyasn1_modules-0.4.2.tar.gz", hash = "sha256:677091de870a80aae844b1ca6134f54652fa2c8c5a52aa396440ac3106e941e6", size = 307892 }
224
+ wheels = [
225
+ { url = "https://files.pythonhosted.org/packages/47/8d/d529b5d697919ba8c11ad626e835d4039be708a35b0d22de83a269a6682c/pyasn1_modules-0.4.2-py3-none-any.whl", hash = "sha256:29253a9207ce32b64c3ac6600edc75368f98473906e8fd1043bd6b5b1de2c14a", size = 181259 },
226
+ ]
227
+
228
+ [[package]]
229
+ name = "pydantic"
230
+ version = "2.11.7"
231
+ source = { registry = "https://pypi.org/simple" }
232
+ dependencies = [
233
+ { name = "annotated-types" },
234
+ { name = "pydantic-core" },
235
+ { name = "typing-extensions" },
236
+ { name = "typing-inspection" },
237
+ ]
238
+ sdist = { url = "https://files.pythonhosted.org/packages/00/dd/4325abf92c39ba8623b5af936ddb36ffcfe0beae70405d456ab1fb2f5b8c/pydantic-2.11.7.tar.gz", hash = "sha256:d989c3c6cb79469287b1569f7447a17848c998458d49ebe294e975b9baf0f0db", size = 788350 }
239
+ wheels = [
240
+ { url = "https://files.pythonhosted.org/packages/6a/c0/ec2b1c8712ca690e5d61979dee872603e92b8a32f94cc1b72d53beab008a/pydantic-2.11.7-py3-none-any.whl", hash = "sha256:dde5df002701f6de26248661f6835bbe296a47bf73990135c7d07ce741b9623b", size = 444782 },
241
+ ]
242
+
243
+ [[package]]
244
+ name = "pydantic-core"
245
+ version = "2.33.2"
246
+ source = { registry = "https://pypi.org/simple" }
247
+ dependencies = [
248
+ { name = "typing-extensions" },
249
+ ]
250
+ sdist = { url = "https://files.pythonhosted.org/packages/ad/88/5f2260bdfae97aabf98f1778d43f69574390ad787afb646292a638c923d4/pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc", size = 435195 }
251
+ wheels = [
252
+ { url = "https://files.pythonhosted.org/packages/3f/8d/71db63483d518cbbf290261a1fc2839d17ff89fce7089e08cad07ccfce67/pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7", size = 2028584 },
253
+ { url = "https://files.pythonhosted.org/packages/24/2f/3cfa7244ae292dd850989f328722d2aef313f74ffc471184dc509e1e4e5a/pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246", size = 1855071 },
254
+ { url = "https://files.pythonhosted.org/packages/b3/d3/4ae42d33f5e3f50dd467761304be2fa0a9417fbf09735bc2cce003480f2a/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f", size = 1897823 },
255
+ { url = "https://files.pythonhosted.org/packages/f4/f3/aa5976e8352b7695ff808599794b1fba2a9ae2ee954a3426855935799488/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc", size = 1983792 },
256
+ { url = "https://files.pythonhosted.org/packages/d5/7a/cda9b5a23c552037717f2b2a5257e9b2bfe45e687386df9591eff7b46d28/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de", size = 2136338 },
257
+ { url = "https://files.pythonhosted.org/packages/2b/9f/b8f9ec8dd1417eb9da784e91e1667d58a2a4a7b7b34cf4af765ef663a7e5/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a", size = 2730998 },
258
+ { url = "https://files.pythonhosted.org/packages/47/bc/cd720e078576bdb8255d5032c5d63ee5c0bf4b7173dd955185a1d658c456/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef", size = 2003200 },
259
+ { url = "https://files.pythonhosted.org/packages/ca/22/3602b895ee2cd29d11a2b349372446ae9727c32e78a94b3d588a40fdf187/pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e", size = 2113890 },
260
+ { url = "https://files.pythonhosted.org/packages/ff/e6/e3c5908c03cf00d629eb38393a98fccc38ee0ce8ecce32f69fc7d7b558a7/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d", size = 2073359 },
261
+ { url = "https://files.pythonhosted.org/packages/12/e7/6a36a07c59ebefc8777d1ffdaf5ae71b06b21952582e4b07eba88a421c79/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30", size = 2245883 },
262
+ { url = "https://files.pythonhosted.org/packages/16/3f/59b3187aaa6cc0c1e6616e8045b284de2b6a87b027cce2ffcea073adf1d2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf", size = 2241074 },
263
+ { url = "https://files.pythonhosted.org/packages/e0/ed/55532bb88f674d5d8f67ab121a2a13c385df382de2a1677f30ad385f7438/pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51", size = 1910538 },
264
+ { url = "https://files.pythonhosted.org/packages/fe/1b/25b7cccd4519c0b23c2dd636ad39d381abf113085ce4f7bec2b0dc755eb1/pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab", size = 1952909 },
265
+ { url = "https://files.pythonhosted.org/packages/49/a9/d809358e49126438055884c4366a1f6227f0f84f635a9014e2deb9b9de54/pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65", size = 1897786 },
266
+ { url = "https://files.pythonhosted.org/packages/18/8a/2b41c97f554ec8c71f2a8a5f85cb56a8b0956addfe8b0efb5b3d77e8bdc3/pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc", size = 2009000 },
267
+ { url = "https://files.pythonhosted.org/packages/a1/02/6224312aacb3c8ecbaa959897af57181fb6cf3a3d7917fd44d0f2917e6f2/pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7", size = 1847996 },
268
+ { url = "https://files.pythonhosted.org/packages/d6/46/6dcdf084a523dbe0a0be59d054734b86a981726f221f4562aed313dbcb49/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025", size = 1880957 },
269
+ { url = "https://files.pythonhosted.org/packages/ec/6b/1ec2c03837ac00886ba8160ce041ce4e325b41d06a034adbef11339ae422/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011", size = 1964199 },
270
+ { url = "https://files.pythonhosted.org/packages/2d/1d/6bf34d6adb9debd9136bd197ca72642203ce9aaaa85cfcbfcf20f9696e83/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f", size = 2120296 },
271
+ { url = "https://files.pythonhosted.org/packages/e0/94/2bd0aaf5a591e974b32a9f7123f16637776c304471a0ab33cf263cf5591a/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88", size = 2676109 },
272
+ { url = "https://files.pythonhosted.org/packages/f9/41/4b043778cf9c4285d59742281a769eac371b9e47e35f98ad321349cc5d61/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1", size = 2002028 },
273
+ { url = "https://files.pythonhosted.org/packages/cb/d5/7bb781bf2748ce3d03af04d5c969fa1308880e1dca35a9bd94e1a96a922e/pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b", size = 2100044 },
274
+ { url = "https://files.pythonhosted.org/packages/fe/36/def5e53e1eb0ad896785702a5bbfd25eed546cdcf4087ad285021a90ed53/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1", size = 2058881 },
275
+ { url = "https://files.pythonhosted.org/packages/01/6c/57f8d70b2ee57fc3dc8b9610315949837fa8c11d86927b9bb044f8705419/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6", size = 2227034 },
276
+ { url = "https://files.pythonhosted.org/packages/27/b9/9c17f0396a82b3d5cbea4c24d742083422639e7bb1d5bf600e12cb176a13/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea", size = 2234187 },
277
+ { url = "https://files.pythonhosted.org/packages/b0/6a/adf5734ffd52bf86d865093ad70b2ce543415e0e356f6cacabbc0d9ad910/pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290", size = 1892628 },
278
+ { url = "https://files.pythonhosted.org/packages/43/e4/5479fecb3606c1368d496a825d8411e126133c41224c1e7238be58b87d7e/pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2", size = 1955866 },
279
+ { url = "https://files.pythonhosted.org/packages/0d/24/8b11e8b3e2be9dd82df4b11408a67c61bb4dc4f8e11b5b0fc888b38118b5/pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab", size = 1888894 },
280
+ { url = "https://files.pythonhosted.org/packages/46/8c/99040727b41f56616573a28771b1bfa08a3d3fe74d3d513f01251f79f172/pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f", size = 2015688 },
281
+ { url = "https://files.pythonhosted.org/packages/3a/cc/5999d1eb705a6cefc31f0b4a90e9f7fc400539b1a1030529700cc1b51838/pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6", size = 1844808 },
282
+ { url = "https://files.pythonhosted.org/packages/6f/5e/a0a7b8885c98889a18b6e376f344da1ef323d270b44edf8174d6bce4d622/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef", size = 1885580 },
283
+ { url = "https://files.pythonhosted.org/packages/3b/2a/953581f343c7d11a304581156618c3f592435523dd9d79865903272c256a/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a", size = 1973859 },
284
+ { url = "https://files.pythonhosted.org/packages/e6/55/f1a813904771c03a3f97f676c62cca0c0a4138654107c1b61f19c644868b/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916", size = 2120810 },
285
+ { url = "https://files.pythonhosted.org/packages/aa/c3/053389835a996e18853ba107a63caae0b9deb4a276c6b472931ea9ae6e48/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a", size = 2676498 },
286
+ { url = "https://files.pythonhosted.org/packages/eb/3c/f4abd740877a35abade05e437245b192f9d0ffb48bbbbd708df33d3cda37/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d", size = 2000611 },
287
+ { url = "https://files.pythonhosted.org/packages/59/a7/63ef2fed1837d1121a894d0ce88439fe3e3b3e48c7543b2a4479eb99c2bd/pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56", size = 2107924 },
288
+ { url = "https://files.pythonhosted.org/packages/04/8f/2551964ef045669801675f1cfc3b0d74147f4901c3ffa42be2ddb1f0efc4/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5", size = 2063196 },
289
+ { url = "https://files.pythonhosted.org/packages/26/bd/d9602777e77fc6dbb0c7db9ad356e9a985825547dce5ad1d30ee04903918/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e", size = 2236389 },
290
+ { url = "https://files.pythonhosted.org/packages/42/db/0e950daa7e2230423ab342ae918a794964b053bec24ba8af013fc7c94846/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162", size = 2239223 },
291
+ { url = "https://files.pythonhosted.org/packages/58/4d/4f937099c545a8a17eb52cb67fe0447fd9a373b348ccfa9a87f141eeb00f/pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849", size = 1900473 },
292
+ { url = "https://files.pythonhosted.org/packages/a0/75/4a0a9bac998d78d889def5e4ef2b065acba8cae8c93696906c3a91f310ca/pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9", size = 1955269 },
293
+ { url = "https://files.pythonhosted.org/packages/f9/86/1beda0576969592f1497b4ce8e7bc8cbdf614c352426271b1b10d5f0aa64/pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9", size = 1893921 },
294
+ { url = "https://files.pythonhosted.org/packages/a4/7d/e09391c2eebeab681df2b74bfe6c43422fffede8dc74187b2b0bf6fd7571/pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac", size = 1806162 },
295
+ { url = "https://files.pythonhosted.org/packages/f1/3d/847b6b1fed9f8ed3bb95a9ad04fbd0b212e832d4f0f50ff4d9ee5a9f15cf/pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5", size = 1981560 },
296
+ { url = "https://files.pythonhosted.org/packages/6f/9a/e73262f6c6656262b5fdd723ad90f518f579b7bc8622e43a942eec53c938/pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9", size = 1935777 },
297
+ { url = "https://files.pythonhosted.org/packages/7b/27/d4ae6487d73948d6f20dddcd94be4ea43e74349b56eba82e9bdee2d7494c/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8", size = 2025200 },
298
+ { url = "https://files.pythonhosted.org/packages/f1/b8/b3cb95375f05d33801024079b9392a5ab45267a63400bf1866e7ce0f0de4/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593", size = 1859123 },
299
+ { url = "https://files.pythonhosted.org/packages/05/bc/0d0b5adeda59a261cd30a1235a445bf55c7e46ae44aea28f7bd6ed46e091/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612", size = 1892852 },
300
+ { url = "https://files.pythonhosted.org/packages/3e/11/d37bdebbda2e449cb3f519f6ce950927b56d62f0b84fd9cb9e372a26a3d5/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7", size = 2067484 },
301
+ { url = "https://files.pythonhosted.org/packages/8c/55/1f95f0a05ce72ecb02a8a8a1c3be0579bbc29b1d5ab68f1378b7bebc5057/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e", size = 2108896 },
302
+ { url = "https://files.pythonhosted.org/packages/53/89/2b2de6c81fa131f423246a9109d7b2a375e83968ad0800d6e57d0574629b/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8", size = 2069475 },
303
+ { url = "https://files.pythonhosted.org/packages/b8/e9/1f7efbe20d0b2b10f6718944b5d8ece9152390904f29a78e68d4e7961159/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf", size = 2239013 },
304
+ { url = "https://files.pythonhosted.org/packages/3c/b2/5309c905a93811524a49b4e031e9851a6b00ff0fb668794472ea7746b448/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb", size = 2238715 },
305
+ { url = "https://files.pythonhosted.org/packages/32/56/8a7ca5d2cd2cda1d245d34b1c9a942920a718082ae8e54e5f3e5a58b7add/pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1", size = 2066757 },
306
+ ]
307
+
308
+ [[package]]
309
+ name = "pydantic-settings"
310
+ version = "2.10.1"
311
+ source = { registry = "https://pypi.org/simple" }
312
+ dependencies = [
313
+ { name = "pydantic" },
314
+ { name = "python-dotenv" },
315
+ { name = "typing-inspection" },
316
+ ]
317
+ sdist = { url = "https://files.pythonhosted.org/packages/68/85/1ea668bbab3c50071ca613c6ab30047fb36ab0da1b92fa8f17bbc38fd36c/pydantic_settings-2.10.1.tar.gz", hash = "sha256:06f0062169818d0f5524420a360d632d5857b83cffd4d42fe29597807a1614ee", size = 172583 }
318
+ wheels = [
319
+ { url = "https://files.pythonhosted.org/packages/58/f0/427018098906416f580e3cf1366d3b1abfb408a0652e9f31600c24a1903c/pydantic_settings-2.10.1-py3-none-any.whl", hash = "sha256:a60952460b99cf661dc25c29c0ef171721f98bfcb52ef8d9ea4c943d7c8cc796", size = 45235 },
320
+ ]
321
+
322
+ [[package]]
323
+ name = "python-dotenv"
324
+ version = "1.1.1"
325
+ source = { registry = "https://pypi.org/simple" }
326
+ sdist = { url = "https://files.pythonhosted.org/packages/f6/b0/4bc07ccd3572a2f9df7e6782f52b0c6c90dcbb803ac4a167702d7d0dfe1e/python_dotenv-1.1.1.tar.gz", hash = "sha256:a8a6399716257f45be6a007360200409fce5cda2661e3dec71d23dc15f6189ab", size = 41978 }
327
+ wheels = [
328
+ { url = "https://files.pythonhosted.org/packages/5f/ed/539768cf28c661b5b068d66d96a2f155c4971a5d55684a514c1a0e0dec2f/python_dotenv-1.1.1-py3-none-any.whl", hash = "sha256:31f23644fe2602f88ff55e1f5c79ba497e01224ee7737937930c448e4d0e24dc", size = 20556 },
329
+ ]
330
+
331
+ [[package]]
332
+ name = "repl-nix-workspace"
333
+ version = "0.1.0"
334
+ source = { virtual = "." }
335
+ dependencies = [
336
+ { name = "fastapi" },
337
+ { name = "google-genai" },
338
+ { name = "pydantic" },
339
+ { name = "pydantic-settings" },
340
+ { name = "uvicorn" },
341
+ ]
342
+
343
+ [package.metadata]
344
+ requires-dist = [
345
+ { name = "fastapi", specifier = ">=0.116.1" },
346
+ { name = "google-genai", specifier = ">=1.27.0" },
347
+ { name = "pydantic", specifier = ">=2.11.7" },
348
+ { name = "pydantic-settings", specifier = ">=2.10.1" },
349
+ { name = "uvicorn", specifier = ">=0.35.0" },
350
+ ]
351
+
352
+ [[package]]
353
+ name = "requests"
354
+ version = "2.32.4"
355
+ source = { registry = "https://pypi.org/simple" }
356
+ dependencies = [
357
+ { name = "certifi" },
358
+ { name = "charset-normalizer" },
359
+ { name = "idna" },
360
+ { name = "urllib3" },
361
+ ]
362
+ sdist = { url = "https://files.pythonhosted.org/packages/e1/0a/929373653770d8a0d7ea76c37de6e41f11eb07559b103b1c02cafb3f7cf8/requests-2.32.4.tar.gz", hash = "sha256:27d0316682c8a29834d3264820024b62a36942083d52caf2f14c0591336d3422", size = 135258 }
363
+ wheels = [
364
+ { url = "https://files.pythonhosted.org/packages/7c/e4/56027c4a6b4ae70ca9de302488c5ca95ad4a39e190093d6c1a8ace08341b/requests-2.32.4-py3-none-any.whl", hash = "sha256:27babd3cda2a6d50b30443204ee89830707d396671944c998b5975b031ac2b2c", size = 64847 },
365
+ ]
366
+
367
+ [[package]]
368
+ name = "rsa"
369
+ version = "4.9.1"
370
+ source = { registry = "https://pypi.org/simple" }
371
+ dependencies = [
372
+ { name = "pyasn1" },
373
+ ]
374
+ sdist = { url = "https://files.pythonhosted.org/packages/da/8a/22b7beea3ee0d44b1916c0c1cb0ee3af23b700b6da9f04991899d0c555d4/rsa-4.9.1.tar.gz", hash = "sha256:e7bdbfdb5497da4c07dfd35530e1a902659db6ff241e39d9953cad06ebd0ae75", size = 29034 }
375
+ wheels = [
376
+ { url = "https://files.pythonhosted.org/packages/64/8d/0133e4eb4beed9e425d9a98ed6e081a55d195481b7632472be1af08d2f6b/rsa-4.9.1-py3-none-any.whl", hash = "sha256:68635866661c6836b8d39430f97a996acbd61bfa49406748ea243539fe239762", size = 34696 },
377
+ ]
378
+
379
+ [[package]]
380
+ name = "sniffio"
381
+ version = "1.3.1"
382
+ source = { registry = "https://pypi.org/simple" }
383
+ sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372 }
384
+ wheels = [
385
+ { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235 },
386
+ ]
387
+
388
+ [[package]]
389
+ name = "starlette"
390
+ version = "0.47.2"
391
+ source = { registry = "https://pypi.org/simple" }
392
+ dependencies = [
393
+ { name = "anyio" },
394
+ { name = "typing-extensions", marker = "python_full_version < '3.13'" },
395
+ ]
396
+ sdist = { url = "https://files.pythonhosted.org/packages/04/57/d062573f391d062710d4088fa1369428c38d51460ab6fedff920efef932e/starlette-0.47.2.tar.gz", hash = "sha256:6ae9aa5db235e4846decc1e7b79c4f346adf41e9777aebeb49dfd09bbd7023d8", size = 2583948 }
397
+ wheels = [
398
+ { url = "https://files.pythonhosted.org/packages/f7/1f/b876b1f83aef204198a42dc101613fefccb32258e5428b5f9259677864b4/starlette-0.47.2-py3-none-any.whl", hash = "sha256:c5847e96134e5c5371ee9fac6fdf1a67336d5815e09eb2a01fdb57a351ef915b", size = 72984 },
399
+ ]
400
+
401
+ [[package]]
402
+ name = "tenacity"
403
+ version = "8.5.0"
404
+ source = { registry = "https://pypi.org/simple" }
405
+ sdist = { url = "https://files.pythonhosted.org/packages/a3/4d/6a19536c50b849338fcbe9290d562b52cbdcf30d8963d3588a68a4107df1/tenacity-8.5.0.tar.gz", hash = "sha256:8bc6c0c8a09b31e6cad13c47afbed1a567518250a9a171418582ed8d9c20ca78", size = 47309 }
406
+ wheels = [
407
+ { url = "https://files.pythonhosted.org/packages/d2/3f/8ba87d9e287b9d385a02a7114ddcef61b26f86411e121c9003eb509a1773/tenacity-8.5.0-py3-none-any.whl", hash = "sha256:b594c2a5945830c267ce6b79a166228323ed52718f30302c1359836112346687", size = 28165 },
408
+ ]
409
+
410
+ [[package]]
411
+ name = "typing-extensions"
412
+ version = "4.14.1"
413
+ source = { registry = "https://pypi.org/simple" }
414
+ sdist = { url = "https://files.pythonhosted.org/packages/98/5a/da40306b885cc8c09109dc2e1abd358d5684b1425678151cdaed4731c822/typing_extensions-4.14.1.tar.gz", hash = "sha256:38b39f4aeeab64884ce9f74c94263ef78f3c22467c8724005483154c26648d36", size = 107673 }
415
+ wheels = [
416
+ { url = "https://files.pythonhosted.org/packages/b5/00/d631e67a838026495268c2f6884f3711a15a9a2a96cd244fdaea53b823fb/typing_extensions-4.14.1-py3-none-any.whl", hash = "sha256:d1e1e3b58374dc93031d6eda2420a48ea44a36c2b4766a4fdeb3710755731d76", size = 43906 },
417
+ ]
418
+
419
+ [[package]]
420
+ name = "typing-inspection"
421
+ version = "0.4.1"
422
+ source = { registry = "https://pypi.org/simple" }
423
+ dependencies = [
424
+ { name = "typing-extensions" },
425
+ ]
426
+ sdist = { url = "https://files.pythonhosted.org/packages/f8/b1/0c11f5058406b3af7609f121aaa6b609744687f1d158b3c3a5bf4cc94238/typing_inspection-0.4.1.tar.gz", hash = "sha256:6ae134cc0203c33377d43188d4064e9b357dba58cff3185f22924610e70a9d28", size = 75726 }
427
+ wheels = [
428
+ { url = "https://files.pythonhosted.org/packages/17/69/cd203477f944c353c31bade965f880aa1061fd6bf05ded0726ca845b6ff7/typing_inspection-0.4.1-py3-none-any.whl", hash = "sha256:389055682238f53b04f7badcb49b989835495a96700ced5dab2d8feae4b26f51", size = 14552 },
429
+ ]
430
+
431
+ [[package]]
432
+ name = "urllib3"
433
+ version = "2.5.0"
434
+ source = { registry = "https://pypi.org/simple" }
435
+ sdist = { url = "https://files.pythonhosted.org/packages/15/22/9ee70a2574a4f4599c47dd506532914ce044817c7752a79b6a51286319bc/urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760", size = 393185 }
436
+ wheels = [
437
+ { url = "https://files.pythonhosted.org/packages/a7/c2/fe1e52489ae3122415c51f387e221dd0773709bad6c6cdaa599e8a2c5185/urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc", size = 129795 },
438
+ ]
439
+
440
+ [[package]]
441
+ name = "uvicorn"
442
+ version = "0.35.0"
443
+ source = { registry = "https://pypi.org/simple" }
444
+ dependencies = [
445
+ { name = "click" },
446
+ { name = "h11" },
447
+ ]
448
+ sdist = { url = "https://files.pythonhosted.org/packages/5e/42/e0e305207bb88c6b8d3061399c6a961ffe5fbb7e2aa63c9234df7259e9cd/uvicorn-0.35.0.tar.gz", hash = "sha256:bc662f087f7cf2ce11a1d7fd70b90c9f98ef2e2831556dd078d131b96cc94a01", size = 78473 }
449
+ wheels = [
450
+ { url = "https://files.pythonhosted.org/packages/d2/e2/dc81b1bd1dcfe91735810265e9d26bc8ec5da45b4c0f6237e286819194c3/uvicorn-0.35.0-py3-none-any.whl", hash = "sha256:197535216b25ff9b785e29a0b79199f55222193d47f820816e7da751e9bc8d4a", size = 66406 },
451
+ ]
452
+
453
+ [[package]]
454
+ name = "websockets"
455
+ version = "15.0.1"
456
+ source = { registry = "https://pypi.org/simple" }
457
+ sdist = { url = "https://files.pythonhosted.org/packages/21/e6/26d09fab466b7ca9c7737474c52be4f76a40301b08362eb2dbc19dcc16c1/websockets-15.0.1.tar.gz", hash = "sha256:82544de02076bafba038ce055ee6412d68da13ab47f0c60cab827346de828dee", size = 177016 }
458
+ wheels = [
459
+ { url = "https://files.pythonhosted.org/packages/9f/32/18fcd5919c293a398db67443acd33fde142f283853076049824fc58e6f75/websockets-15.0.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:823c248b690b2fd9303ba00c4f66cd5e2d8c3ba4aa968b2779be9532a4dad431", size = 175423 },
460
+ { url = "https://files.pythonhosted.org/packages/76/70/ba1ad96b07869275ef42e2ce21f07a5b0148936688c2baf7e4a1f60d5058/websockets-15.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678999709e68425ae2593acf2e3ebcbcf2e69885a5ee78f9eb80e6e371f1bf57", size = 173082 },
461
+ { url = "https://files.pythonhosted.org/packages/86/f2/10b55821dd40eb696ce4704a87d57774696f9451108cff0d2824c97e0f97/websockets-15.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d50fd1ee42388dcfb2b3676132c78116490976f1300da28eb629272d5d93e905", size = 173330 },
462
+ { url = "https://files.pythonhosted.org/packages/a5/90/1c37ae8b8a113d3daf1065222b6af61cc44102da95388ac0018fcb7d93d9/websockets-15.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d99e5546bf73dbad5bf3547174cd6cb8ba7273062a23808ffea025ecb1cf8562", size = 182878 },
463
+ { url = "https://files.pythonhosted.org/packages/8e/8d/96e8e288b2a41dffafb78e8904ea7367ee4f891dafc2ab8d87e2124cb3d3/websockets-15.0.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:66dd88c918e3287efc22409d426c8f729688d89a0c587c88971a0faa2c2f3792", size = 181883 },
464
+ { url = "https://files.pythonhosted.org/packages/93/1f/5d6dbf551766308f6f50f8baf8e9860be6182911e8106da7a7f73785f4c4/websockets-15.0.1-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8dd8327c795b3e3f219760fa603dcae1dcc148172290a8ab15158cf85a953413", size = 182252 },
465
+ { url = "https://files.pythonhosted.org/packages/d4/78/2d4fed9123e6620cbf1706c0de8a1632e1a28e7774d94346d7de1bba2ca3/websockets-15.0.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8fdc51055e6ff4adeb88d58a11042ec9a5eae317a0a53d12c062c8a8865909e8", size = 182521 },
466
+ { url = "https://files.pythonhosted.org/packages/e7/3b/66d4c1b444dd1a9823c4a81f50231b921bab54eee2f69e70319b4e21f1ca/websockets-15.0.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:693f0192126df6c2327cce3baa7c06f2a117575e32ab2308f7f8216c29d9e2e3", size = 181958 },
467
+ { url = "https://files.pythonhosted.org/packages/08/ff/e9eed2ee5fed6f76fdd6032ca5cd38c57ca9661430bb3d5fb2872dc8703c/websockets-15.0.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:54479983bd5fb469c38f2f5c7e3a24f9a4e70594cd68cd1fa6b9340dadaff7cf", size = 181918 },
468
+ { url = "https://files.pythonhosted.org/packages/d8/75/994634a49b7e12532be6a42103597b71098fd25900f7437d6055ed39930a/websockets-15.0.1-cp311-cp311-win32.whl", hash = "sha256:16b6c1b3e57799b9d38427dda63edcbe4926352c47cf88588c0be4ace18dac85", size = 176388 },
469
+ { url = "https://files.pythonhosted.org/packages/98/93/e36c73f78400a65f5e236cd376713c34182e6663f6889cd45a4a04d8f203/websockets-15.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:27ccee0071a0e75d22cb35849b1db43f2ecd3e161041ac1ee9d2352ddf72f065", size = 176828 },
470
+ { url = "https://files.pythonhosted.org/packages/51/6b/4545a0d843594f5d0771e86463606a3988b5a09ca5123136f8a76580dd63/websockets-15.0.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:3e90baa811a5d73f3ca0bcbf32064d663ed81318ab225ee4f427ad4e26e5aff3", size = 175437 },
471
+ { url = "https://files.pythonhosted.org/packages/f4/71/809a0f5f6a06522af902e0f2ea2757f71ead94610010cf570ab5c98e99ed/websockets-15.0.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:592f1a9fe869c778694f0aa806ba0374e97648ab57936f092fd9d87f8bc03665", size = 173096 },
472
+ { url = "https://files.pythonhosted.org/packages/3d/69/1a681dd6f02180916f116894181eab8b2e25b31e484c5d0eae637ec01f7c/websockets-15.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0701bc3cfcb9164d04a14b149fd74be7347a530ad3bbf15ab2c678a2cd3dd9a2", size = 173332 },
473
+ { url = "https://files.pythonhosted.org/packages/a6/02/0073b3952f5bce97eafbb35757f8d0d54812b6174ed8dd952aa08429bcc3/websockets-15.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e8b56bdcdb4505c8078cb6c7157d9811a85790f2f2b3632c7d1462ab5783d215", size = 183152 },
474
+ { url = "https://files.pythonhosted.org/packages/74/45/c205c8480eafd114b428284840da0b1be9ffd0e4f87338dc95dc6ff961a1/websockets-15.0.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0af68c55afbd5f07986df82831c7bff04846928ea8d1fd7f30052638788bc9b5", size = 182096 },
475
+ { url = "https://files.pythonhosted.org/packages/14/8f/aa61f528fba38578ec553c145857a181384c72b98156f858ca5c8e82d9d3/websockets-15.0.1-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64dee438fed052b52e4f98f76c5790513235efaa1ef7f3f2192c392cd7c91b65", size = 182523 },
476
+ { url = "https://files.pythonhosted.org/packages/ec/6d/0267396610add5bc0d0d3e77f546d4cd287200804fe02323797de77dbce9/websockets-15.0.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d5f6b181bb38171a8ad1d6aa58a67a6aa9d4b38d0f8c5f496b9e42561dfc62fe", size = 182790 },
477
+ { url = "https://files.pythonhosted.org/packages/02/05/c68c5adbf679cf610ae2f74a9b871ae84564462955d991178f95a1ddb7dd/websockets-15.0.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5d54b09eba2bada6011aea5375542a157637b91029687eb4fdb2dab11059c1b4", size = 182165 },
478
+ { url = "https://files.pythonhosted.org/packages/29/93/bb672df7b2f5faac89761cb5fa34f5cec45a4026c383a4b5761c6cea5c16/websockets-15.0.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3be571a8b5afed347da347bfcf27ba12b069d9d7f42cb8c7028b5e98bbb12597", size = 182160 },
479
+ { url = "https://files.pythonhosted.org/packages/ff/83/de1f7709376dc3ca9b7eeb4b9a07b4526b14876b6d372a4dc62312bebee0/websockets-15.0.1-cp312-cp312-win32.whl", hash = "sha256:c338ffa0520bdb12fbc527265235639fb76e7bc7faafbb93f6ba80d9c06578a9", size = 176395 },
480
+ { url = "https://files.pythonhosted.org/packages/7d/71/abf2ebc3bbfa40f391ce1428c7168fb20582d0ff57019b69ea20fa698043/websockets-15.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:fcd5cf9e305d7b8338754470cf69cf81f420459dbae8a3b40cee57417f4614a7", size = 176841 },
481
+ { url = "https://files.pythonhosted.org/packages/cb/9f/51f0cf64471a9d2b4d0fc6c534f323b664e7095640c34562f5182e5a7195/websockets-15.0.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ee443ef070bb3b6ed74514f5efaa37a252af57c90eb33b956d35c8e9c10a1931", size = 175440 },
482
+ { url = "https://files.pythonhosted.org/packages/8a/05/aa116ec9943c718905997412c5989f7ed671bc0188ee2ba89520e8765d7b/websockets-15.0.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5a939de6b7b4e18ca683218320fc67ea886038265fd1ed30173f5ce3f8e85675", size = 173098 },
483
+ { url = "https://files.pythonhosted.org/packages/ff/0b/33cef55ff24f2d92924923c99926dcce78e7bd922d649467f0eda8368923/websockets-15.0.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:746ee8dba912cd6fc889a8147168991d50ed70447bf18bcda7039f7d2e3d9151", size = 173329 },
484
+ { url = "https://files.pythonhosted.org/packages/31/1d/063b25dcc01faa8fada1469bdf769de3768b7044eac9d41f734fd7b6ad6d/websockets-15.0.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:595b6c3969023ecf9041b2936ac3827e4623bfa3ccf007575f04c5a6aa318c22", size = 183111 },
485
+ { url = "https://files.pythonhosted.org/packages/93/53/9a87ee494a51bf63e4ec9241c1ccc4f7c2f45fff85d5bde2ff74fcb68b9e/websockets-15.0.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c714d2fc58b5ca3e285461a4cc0c9a66bd0e24c5da9911e30158286c9b5be7f", size = 182054 },
486
+ { url = "https://files.pythonhosted.org/packages/ff/b2/83a6ddf56cdcbad4e3d841fcc55d6ba7d19aeb89c50f24dd7e859ec0805f/websockets-15.0.1-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f3c1e2ab208db911594ae5b4f79addeb3501604a165019dd221c0bdcabe4db8", size = 182496 },
487
+ { url = "https://files.pythonhosted.org/packages/98/41/e7038944ed0abf34c45aa4635ba28136f06052e08fc2168520bb8b25149f/websockets-15.0.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:229cf1d3ca6c1804400b0a9790dc66528e08a6a1feec0d5040e8b9eb14422375", size = 182829 },
488
+ { url = "https://files.pythonhosted.org/packages/e0/17/de15b6158680c7623c6ef0db361da965ab25d813ae54fcfeae2e5b9ef910/websockets-15.0.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:756c56e867a90fb00177d530dca4b097dd753cde348448a1012ed6c5131f8b7d", size = 182217 },
489
+ { url = "https://files.pythonhosted.org/packages/33/2b/1f168cb6041853eef0362fb9554c3824367c5560cbdaad89ac40f8c2edfc/websockets-15.0.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:558d023b3df0bffe50a04e710bc87742de35060580a293c2a984299ed83bc4e4", size = 182195 },
490
+ { url = "https://files.pythonhosted.org/packages/86/eb/20b6cdf273913d0ad05a6a14aed4b9a85591c18a987a3d47f20fa13dcc47/websockets-15.0.1-cp313-cp313-win32.whl", hash = "sha256:ba9e56e8ceeeedb2e080147ba85ffcd5cd0711b89576b83784d8605a7df455fa", size = 176393 },
491
+ { url = "https://files.pythonhosted.org/packages/1b/6c/c65773d6cab416a64d191d6ee8a8b1c68a09970ea6909d16965d26bfed1e/websockets-15.0.1-cp313-cp313-win_amd64.whl", hash = "sha256:e09473f095a819042ecb2ab9465aee615bd9c2028e4ef7d933600a8401c79561", size = 176837 },
492
+ { url = "https://files.pythonhosted.org/packages/fa/a8/5b41e0da817d64113292ab1f8247140aac61cbf6cfd085d6a0fa77f4984f/websockets-15.0.1-py3-none-any.whl", hash = "sha256:f7a866fbc1e97b5c617ee4116daaa09b722101d4a3c170c787450ba409f9736f", size = 169743 },
493
+ ]