diff --git a/README.md b/README.md
index 1340df18e3c1ca8d7fa1a264bbe4b1b4330238b2..bedbf462c80b265fbb6b61567a945e9000c53383 100644
--- a/README.md
+++ b/README.md
@@ -1,86 +1,24 @@
-# RightCodes Architecture Scaffold
+# Minimal Server + Web Starter
-This repository is the starting scaffold for the code and standards migration assistant. The layout mirrors the planned AI-driven workflow (ingestion -> extraction -> mapping -> rewrite -> validation -> export) and keeps agent orchestration, backend services, and UI concerns separated.
+This repository is a stripped-down template with:
+- `server/` - FastAPI backend with a single `/health` endpoint.
+- `frontend/` - Vite + React app that renders a black full-screen page.
-## Top-Level Directories
-
-- `frontend/` - React application for uploads, diff review, validation checklists, and export actions.
-- `server/` - FastAPI backend that manages sessions, orchestrates agents, and exposes REST/WebSocket APIs.
-- `agents/` - OpenAI agent wrappers plus prompt assets for each processing stage.
-- `workers/` - File ingestion, document parsing, and pipeline jobs executed off the main API thread.
-- `storage/` - Versioned document blobs, manifests, caches, and export artefacts.
-- `docs/` - Architecture notes, pipeline diagrams, agent specs, and UI flows.
-- `scripts/` - Developer utilities, operational scripts, and local tooling hooks.
-- `data/` - Sample inputs, canonical standards metadata, and fixture mappings.
-- `infra/` - DevOps assets (containers, CI pipelines, observability).
-- `common/` - Shared domain models, schemas, and event definitions.
-
-## Getting Started
-
-### Prerequisites
-
-- Python 3.10+ (3.11+ recommended; the agent code uses `X | Y` unions and `zip(strict=True)`)
-- Node.js 18+ and npm (ships with Node.js)
-- Internet connectivity on the first launch so pip/npm can download dependencies
-
-### Launcher Quick Start
-
-The repository ships with `start-rightcodes.ps1` (PowerShell) and `start-rightcodes.bat` (Command Prompt) which check for Python and Node.js, create the virtual environment, install dependencies, and open the launcher UI.
+## Quick Start
+### Server
```powershell
-.\start-rightcodes.ps1
-```
-
-```bat
-start-rightcodes.bat
-```
-
-The scripts will guide you to the official Python and Node.js installers if either prerequisite is missing. After a successful first run, the cached `.venv/` and `frontend/node_modules/` folders allow the launcher to work offline.
-
-### Backend (FastAPI)
-
-```bash
-cd server
python -m venv .venv
-.venv\Scripts\activate # Windows
-pip install -r requirements.txt
-# copy the sample env and add your OpenAI key
-copy .env.example .env # or use New-Item -Path .env -ItemType File
+.venv\Scripts\activate
+pip install -r server/requirements.txt
+uvicorn server.app.main:app --reload --port 8000
```
-Edit `.env` and set `RIGHTCODES_OPENAI_API_KEY=sk-your-key`.
-
-```bash
-uvicorn app.main:app --reload --port 8000
-```
-
-The API is available at `http://localhost:8000/api` and Swagger UI at `http://localhost:8000/api/docs`.
-
-### Frontend (Vite + React)
-
-```bash
+### Web
+```powershell
cd frontend
npm install
npm run dev
```
-Navigate to `http://localhost:5173` to access the UI. Configure a custom API base URL by setting `VITE_API_BASE_URL` in `frontend/.env`.
-
-## Usage
-
-- Each conversion session requires the original report (`.docx`) plus one or more destination standards packs (`.pdf`). Upload all relevant standards PDFs so the AI pipeline can align existing references with the new context.
-- The web UI now shows live progress bars and an activity log so you can monitor each stage of the pipeline while it runs.
-- Backend agent calls expect an OpenAI API key in `RIGHTCODES_OPENAI_API_KEY`. On Windows you can set it persistently with `setx RIGHTCODES_OPENAI_API_KEY "sk-..."` (reopen your terminal afterward), or for the current PowerShell session only with `$env:RIGHTCODES_OPENAI_API_KEY = "sk-..."`.
-- Optional: override the default OpenAI models by setting `RIGHTCODES_OPENAI_MODEL_EXTRACT`, `RIGHTCODES_OPENAI_MODEL_MAPPING`, `RIGHTCODES_OPENAI_MODEL_REWRITE`, `RIGHTCODES_OPENAI_MODEL_VALIDATE`, and `RIGHTCODES_OPENAI_MODEL_EMBED`. The pipeline stores standards embeddings under `storage/embeddings/`, enabling retrieval-augmented mapping, and the export stage now writes a converted DOCX to `storage/exports/` with a download button once the pipeline completes.
-
-## Offline Distribution Options
-
-- **Zip bundle:** Run `python tools/build_offline_package.py` after a successful online launch. The script creates `dist/rightcodes-offline.zip` containing the repository, the prepared `.venv/`, and `frontend/node_modules/`. Pass `--python-runtime` and `--node-runtime` to embed portable runtimes if you have them (for example, a locally extracted Python embeddable zip and Node.js binary folder). Extract the archive on another machine (same OS/architecture) and use the launcher scripts without needing internet access.
-- **Docker image:** Build a pre-baked container with `docker build -t rightcodes-launcher -f docker/Dockerfile .`. Supply `RIGHTCODES_OPENAI_API_KEY` (and optionally `RIGHTCODES_OPENAI_API_KEY_SOURCE`) at runtime: `docker run --rm -p 8765:8765 -p 8000:8000 -p 5173:5173 -e RIGHTCODES_OPENAI_API_KEY=sk-xxx rightcodes-launcher`.
-- **First-run reminder:** Even with these assets, the *initial* bundle build still requires internet so dependencies can be fetched once before packaging.
-
-## Next Steps
-
-1. Flesh out agent logic in `agents/` and integrate with orchestration hooks.
-2. Replace the in-memory session store with a durable persistence layer.
-3. Wire the worker queue (`workers/queue/`) to execute long-running stages asynchronously.
+Open `http://localhost:5173`.
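
The new README promises a single `/health` endpoint served by `server.app.main:app` (per the uvicorn command above). A minimal sketch of what `server/app/main.py` might contain under that assumption; the route's response payload is illustrative, not taken from the repo:

```python
# Hypothetical server/app/main.py matching the README's uvicorn target.
from fastapi import FastAPI

app = FastAPI()


@app.get("/health")
def health() -> dict[str, str]:
    # Trivial liveness probe; the payload shape is an assumption.
    return {"status": "ok"}
```

With the server running, `curl http://localhost:8000/health` would return that JSON document.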
diff --git a/agents/README.md b/agents/README.md
deleted file mode 100644
index 2cf821d2df2ea0b0a3ba2f240369024aaf711c58..0000000000000000000000000000000000000000
--- a/agents/README.md
+++ /dev/null
@@ -1,13 +0,0 @@
-# Agent Bundles
-
-Encapsulates prompts, tool definitions, and orchestration glue for each OpenAI Agent engaged in the pipeline.
-
-## Structure
-
-- `orchestrator/` - Coordinator agent spec, high-level playbooks, and session controller logic.
-- `extraction/` - Document clause extraction prompts, tool configs for parsers, and output schemas.
-- `standards_mapping/` - Normalization/mapping prompts, ontology helpers, and reference datasets.
-- `rewrite/` - Minimal-change rewrite prompts, safety rules, and diff formatting helpers.
-- `validation/` - Post-rewrite review prompts, calculation sanity checks, and reporting templates.
-- `export/` - Finalization prompts, merge instructions, and docx regeneration aids.
-- `shared/` - Common prompt fragments, JSON schema definitions, and evaluation heuristics.
diff --git a/agents/__init__.py b/agents/__init__.py
deleted file mode 100644
index 694dc3993b8a2dac9f085478bf7c64517dadc4ba..0000000000000000000000000000000000000000
--- a/agents/__init__.py
+++ /dev/null
@@ -1,15 +0,0 @@
-from .orchestrator.agent import OrchestratorAgent
-from .extraction.agent import ExtractionAgent
-from .standards_mapping.agent import StandardsMappingAgent
-from .rewrite.agent import RewriteAgent
-from .validation.agent import ValidationAgent
-from .export.agent import ExportAgent
-
-__all__ = [
- "OrchestratorAgent",
- "ExtractionAgent",
- "StandardsMappingAgent",
- "RewriteAgent",
- "ValidationAgent",
- "ExportAgent",
-]
diff --git a/agents/export/__init__.py b/agents/export/__init__.py
deleted file mode 100644
index 9f97147915126e71b4f5c060df5400a9f7995542..0000000000000000000000000000000000000000
--- a/agents/export/__init__.py
+++ /dev/null
@@ -1,3 +0,0 @@
-from .agent import ExportAgent
-
-__all__ = ["ExportAgent"]
diff --git a/agents/export/agent.py b/agents/export/agent.py
deleted file mode 100644
index 06ca7a8a26db8b5595c6976b8d52f2c1ab8c7ed0..0000000000000000000000000000000000000000
--- a/agents/export/agent.py
+++ /dev/null
@@ -1,112 +0,0 @@
-from __future__ import annotations
-
-from datetime import datetime
-from pathlib import Path
-from typing import Any, Dict, List
-
-from docx import Document
-
-from server.app.services.diagnostics_service import get_diagnostics_service
-
-from ..shared.base import AgentContext, BaseAgent
-
-
-def _apply_table_replacements(document: Document, replacements: List[Dict[str, Any]]) -> int:
- tables = document.tables
- applied = 0
- for item in replacements:
- index = item.get("table_index")
- updated_rows = item.get("updated_rows")
- if not isinstance(index, int) or index < 0 or index >= len(tables):
- continue
- if not isinstance(updated_rows, list):
- continue
- table = tables[index]
- for row_idx, row_values in enumerate(updated_rows):
- if row_idx < len(table.rows):
- row = table.rows[row_idx]
- else:
- row = table.add_row()
- for col_idx, value in enumerate(row_values):
- if col_idx < len(row.cells):
- row.cells[col_idx].text = str(value) if value is not None else ""
- else:
- break
- applied += 1
- return applied
-
-
-def _apply_replacements(document: Document, replacements: List[Dict[str, Any]]) -> int:
- applied = 0
- paragraphs = document.paragraphs
- for item in replacements:
- try:
- index = int(item.get("paragraph_index"))
- except (TypeError, ValueError):
- continue
- if index < 0 or index >= len(paragraphs):
- continue
- updated_text = item.get("updated_text")
- if not isinstance(updated_text, str):
- continue
- paragraphs[index].text = updated_text
- applied += 1
- return applied
-
-
-class ExportAgent(BaseAgent):
- name = "export-agent"
-
- async def run(self, context: AgentContext) -> Dict[str, Any]:
- await self.emit_debug("Exporting updated document to DOCX.")
-
- rewrite_plan = context.payload.get("rewrite_plan") or {}
- replacements = rewrite_plan.get("replacements") or []
- table_replacements = rewrite_plan.get("table_replacements") or []
- source_path = context.payload.get("original_path")
- if not source_path or not Path(source_path).exists():
- raise RuntimeError("Original document path not supplied to export agent.")
-
- document = Document(source_path)
- applied_paragraphs = _apply_replacements(document, replacements)
- applied_tables = _apply_table_replacements(document, table_replacements)
-
- storage_root = _resolve_storage_root()
- export_dir = Path(storage_root) / "exports"
- export_dir.mkdir(parents=True, exist_ok=True)
- export_path = export_dir / f"{context.session_id}-converted.docx"
- document.save(export_path)
- diagnostics = get_diagnostics_service()
- diagnostics.record_event(
- node_id="exports",
- event_type="export.generated",
- message=f"Generated export for session `{context.session_id}`",
- metadata={
- "session_id": context.session_id,
- "path": str(export_path),
- "paragraph_replacements": applied_paragraphs,
- "table_updates": applied_tables,
- },
- )
-
- if applied_paragraphs or applied_tables:
- note = "Converted document generated using rewrite plan."
- else:
- note = "Export completed, but no replacements were applied."
-
- return {
- "export_path": str(export_path),
- "notes": note,
- "replacement_count": applied_paragraphs,
- "table_replacement_count": applied_tables,
- "generated_at": datetime.utcnow().isoformat(),
- }
-
-
-def _resolve_storage_root() -> Path:
- try:
- from server.app.core.config import get_settings # local import avoids circular dependency
-
- return get_settings().storage_dir
- except Exception: # noqa: BLE001
- return (Path(__file__).resolve().parents[2] / "storage").resolve()
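
The deleted `ExportAgent` above applies its rewrite plan by assigning to `paragraph.text` via python-docx. A standalone sketch of the same pattern, with hypothetical file names:

```python
# Sketch of the paragraph-replacement pattern from the deleted ExportAgent.
from docx import Document

document = Document("original.docx")  # hypothetical input
replacements = [{"paragraph_index": 0, "updated_text": "Updated clause text."}]

for item in replacements:
    index = item["paragraph_index"]
    if 0 <= index < len(document.paragraphs):
        # Assigning .text collapses the paragraph's runs into one run, so
        # this swaps wording but drops run-level formatting inside it.
        document.paragraphs[index].text = item["updated_text"]

document.save("converted.docx")
```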
diff --git a/agents/extraction/__init__.py b/agents/extraction/__init__.py
deleted file mode 100644
index ab6f9a789cc83fb5bcbf57d38eb9c94ae9bf8af9..0000000000000000000000000000000000000000
--- a/agents/extraction/__init__.py
+++ /dev/null
@@ -1,3 +0,0 @@
-from .agent import ExtractionAgent
-
-__all__ = ["ExtractionAgent"]
diff --git a/agents/extraction/agent.py b/agents/extraction/agent.py
deleted file mode 100644
index d2394bdecfadb4cd4ea4d97f8f59aac9b2494438..0000000000000000000000000000000000000000
--- a/agents/extraction/agent.py
+++ /dev/null
@@ -1,239 +0,0 @@
-from __future__ import annotations
-
-from dataclasses import asdict, is_dataclass
-import json
-from typing import Any, Dict, Iterable, List
-
-from ..shared.base import AgentContext, BaseAgent
-
-
-class ExtractionAgent(BaseAgent):
- name = "extraction-agent"
- paragraph_chunk_size = 40
- table_chunk_size = 15
- max_text_chars = 1500
-
- async def run(self, context: AgentContext) -> Dict[str, Any]:
- raw_paragraphs = context.payload.get("paragraphs", [])
- raw_tables = context.payload.get("tables", [])
- paragraphs = [_prepare_paragraph(item, self.max_text_chars) for item in _normalise_items(raw_paragraphs)]
- tables = [_prepare_table(item, self.max_text_chars) for item in _normalise_items(raw_tables)]
- metadata = context.payload.get("metadata", {})
-
- if not paragraphs and not tables:
- await self.emit_debug("No document content supplied to extraction agent.")
- return {
- "document_summary": "",
- "sections": [],
- "tables": [],
- "references": [],
- "notes": "Skipped: no document content provided.",
- }
-
- schema = {
- "name": "ExtractionResult",
- "schema": {
- "type": "object",
- "properties": {
- "document_summary": {"type": "string"},
- "sections": {
- "type": "array",
- "items": {
- "type": "object",
- "properties": {
- "paragraph_index": {"type": "integer"},
- "text": {"type": "string"},
- "references": {
- "type": "array",
- "items": {"type": "string"},
- "default": [],
- },
- },
- "required": ["paragraph_index", "text"],
- "additionalProperties": False,
- },
- "default": [],
- },
- "tables": {
- "type": "array",
- "items": {
- "type": "object",
- "properties": {
- "table_index": {"type": "integer"},
- "summary": {"type": "string"},
- "references": {
- "type": "array",
- "items": {"type": "string"},
- "default": [],
- },
- },
- "required": ["table_index", "summary"],
- "additionalProperties": False,
- },
- "default": [],
- },
- "references": {
- "type": "array",
- "items": {"type": "string"},
- "default": [],
- },
- "notes": {"type": "string"},
- },
- "required": ["document_summary", "sections", "tables", "references"],
- "additionalProperties": False,
- },
- }
-
- model = self.settings.openai_model_extract
- aggregated_sections: List[Dict[str, Any]] = []
- aggregated_tables: List[Dict[str, Any]] = []
- aggregated_references: set[str] = set()
- summaries: List[str] = []
- notes: List[str] = []
-
- for chunk in _chunk_list(paragraphs, self.paragraph_chunk_size):
- batch = await self._process_batch(
- context=context,
- model=model,
- schema=schema,
- paragraphs=chunk,
- tables=[],
- metadata=metadata,
- )
- if batch:
- summaries.append(batch.get("document_summary", ""))
- aggregated_sections.extend(batch.get("sections", []))
- aggregated_tables.extend(batch.get("tables", []))
- aggregated_references.update(batch.get("references", []))
- if batch.get("notes"):
- notes.append(batch["notes"])
-
- for chunk in _chunk_list(tables, self.table_chunk_size):
- batch = await self._process_batch(
- context=context,
- model=model,
- schema=schema,
- paragraphs=[],
- tables=chunk,
- metadata=metadata,
- )
- if batch:
- aggregated_tables.extend(batch.get("tables", []))
- aggregated_references.update(batch.get("references", []))
- if batch.get("notes"):
- notes.append(batch["notes"])
-
- summary = " ".join(filter(None, summaries)).strip()
-
- for item in paragraphs:
- aggregated_references.update(item.get("references", []))
- for item in tables:
- aggregated_references.update(item.get("references", []))
-
- return {
- "document_summary": summary,
- "sections": aggregated_sections,
- "tables": aggregated_tables,
- "references": sorted(aggregated_references),
- "notes": " ".join(notes).strip(),
- }
-
- async def _process_batch(
- self,
- *,
- context: AgentContext,
- model: str,
- schema: Dict[str, Any],
- paragraphs: List[Dict[str, Any]],
- tables: List[Dict[str, Any]],
- metadata: Dict[str, Any],
- ) -> Dict[str, Any]:
- if not paragraphs and not tables:
- return {}
-
- payload = {
- "paragraphs": paragraphs,
- "tables": tables,
- "metadata": metadata,
- }
-
- messages = [
- {
- "role": "system",
- "content": (
- "You are an engineering standards analyst. "
- "Analyse the supplied report content, identify normative references, "
- "and return structured data following the JSON schema."
- ),
- },
- {
- "role": "user",
- "content": (
- f"Session ID: {context.session_id}\n"
- f"Payload: {json.dumps(payload, ensure_ascii=False)}"
- ),
- },
- ]
-
- try:
- result = await self.call_openai_json(model=model, messages=messages, schema=schema)
- return result
- except Exception as exc: # noqa: BLE001
- await self.emit_debug(f"Extraction chunk failed: {exc}")
- return {
- "document_summary": "",
- "sections": [],
- "tables": [],
- "references": [],
- "notes": f"Chunk failed: {exc}",
- }
-
-
-def _normalise_items(items: List[Any]) -> List[Dict[str, Any]]:
- normalised: List[Dict[str, Any]] = []
- for item in items:
- if is_dataclass(item):
- normalised.append(asdict(item))
- elif isinstance(item, dict):
- normalised.append(item)
- else:
- normalised.append({"value": str(item)})
- return normalised
-
-
-def _prepare_paragraph(item: Dict[str, Any], max_chars: int) -> Dict[str, Any]:
- text = item.get("text", "")
- if len(text) > max_chars:
- text = text[:max_chars] + "...(trimmed)"
- return {
- "index": item.get("index"),
- "text": text,
- "style": item.get("style"),
- "heading_level": item.get("heading_level"),
- "references": item.get("references", []),
- }
-
-
-def _prepare_table(item: Dict[str, Any], max_chars: int) -> Dict[str, Any]:
- rows = item.get("rows", [])
- preview_rows = []
- for row in rows:
- preview_row = []
- for cell in row:
- cell_text = str(cell)
- if len(cell_text) > max_chars:
- cell_text = cell_text[:max_chars] + "...(trimmed)"
- preview_row.append(cell_text)
- preview_rows.append(preview_row)
- return {
- "index": item.get("index"),
- "rows": preview_rows,
- "references": item.get("references", []),
- }
-
-
-def _chunk_list(items: List[Dict[str, Any]], size: int) -> Iterable[List[Dict[str, Any]]]:
- if size <= 0:
- size = len(items) or 1
- for idx in range(0, len(items), size):
- yield items[idx : idx + size]
diff --git a/agents/orchestrator/__init__.py b/agents/orchestrator/__init__.py
deleted file mode 100644
index 1e4be55f7405d1b789f8d3acc0c5d9c1a0ca16e1..0000000000000000000000000000000000000000
--- a/agents/orchestrator/__init__.py
+++ /dev/null
@@ -1,3 +0,0 @@
-from .agent import OrchestratorAgent
-
-__all__ = ["OrchestratorAgent"]
diff --git a/agents/orchestrator/agent.py b/agents/orchestrator/agent.py
deleted file mode 100644
index dc3fc263f1c05244beee600126b4d2393f38c0e6..0000000000000000000000000000000000000000
--- a/agents/orchestrator/agent.py
+++ /dev/null
@@ -1,18 +0,0 @@
-from __future__ import annotations
-
-from typing import Any, Dict
-
-from ..shared.base import AgentContext, BaseAgent
-
-
-class OrchestratorAgent(BaseAgent):
- name = "orchestrator-agent"
-
- async def run(self, context: AgentContext) -> Dict[str, Any]:
- await self.emit_debug(f"Received session {context.session_id}")
- # In a future iteration this agent will orchestrate sub-agent calls.
- return {
- "next_stage": "ingest",
- "notes": "Placeholder orchestrator response.",
- "input_payload": context.payload,
- }
diff --git a/agents/rewrite/__init__.py b/agents/rewrite/__init__.py
deleted file mode 100644
index 971a30bc1c9db81943c8ba5d16c5d319c3738b9c..0000000000000000000000000000000000000000
--- a/agents/rewrite/__init__.py
+++ /dev/null
@@ -1,3 +0,0 @@
-from .agent import RewriteAgent
-
-__all__ = ["RewriteAgent"]
diff --git a/agents/rewrite/agent.py b/agents/rewrite/agent.py
deleted file mode 100644
index e6c4aa75edc799d2acb30263c529355cae8c0787..0000000000000000000000000000000000000000
--- a/agents/rewrite/agent.py
+++ /dev/null
@@ -1,315 +0,0 @@
-from __future__ import annotations
-
-import json
-from typing import Any, Dict, Iterable, List, Sequence
-
-from ..shared.base import AgentContext, BaseAgent
-
-
-class RewriteAgent(BaseAgent):
- name = "rewrite-agent"
- paragraph_chunk_size = 20
- max_paragraph_chars = 1600
- max_table_chars = 1200
-
- async def run(self, context: AgentContext) -> Dict[str, Any]:
- mapping_result = context.payload.get("mapping_result") or {}
- mappings: List[Dict[str, Any]] = mapping_result.get("mappings", [])
- if not mappings:
- await self.emit_debug("Rewrite skipped: no mappings provided.")
- return {
- "replacements": [],
- "table_replacements": [],
- "change_log": [],
- "notes": "Rewrite skipped: no mappings provided.",
- }
-
- mapping_by_reference = _index_mappings(mappings)
- doc_paragraphs = _normalise_paragraphs(
- context.payload.get("document_paragraphs", []), self.max_paragraph_chars
- )
- doc_tables = _normalise_tables(
- context.payload.get("document_tables", []), self.max_table_chars
- )
-
- paragraphs_to_rewrite = [
- paragraph
- for paragraph in doc_paragraphs
- if any(ref in mapping_by_reference for ref in paragraph["references"])
- ]
- tables_to_rewrite = [
- table
- for table in doc_tables
- if any(ref in mapping_by_reference for ref in table["references"])
- ]
-
- if not paragraphs_to_rewrite and not tables_to_rewrite:
- await self.emit_debug("Rewrite skipped: no paragraphs or tables matched mapped references.")
- return {
- "replacements": [],
- "table_replacements": [],
- "change_log": [],
- "notes": "Rewrite skipped: no references found in document.",
- }
-
- aggregated_replacements: List[Dict[str, Any]] = []
- aggregated_table_replacements: List[Dict[str, Any]] = []
- change_log_entries: List[Dict[str, Any]] = []
- change_log_seen: set[tuple] = set()
- notes: List[str] = []
- target_voice = context.payload.get("target_voice", "Professional engineering tone")
- constraints = context.payload.get("constraints", [])
-
- pending_tables = {table["index"]: table for table in tables_to_rewrite}
-
- for chunk in _chunk_list(paragraphs_to_rewrite, self.paragraph_chunk_size):
- relevant_refs = sorted(
- {
- reference
- for paragraph in chunk
- for reference in paragraph["references"]
- if reference in mapping_by_reference
- }
- )
- if not relevant_refs:
- continue
-
- mapping_subset = _collect_mapping_subset(mapping_by_reference, relevant_refs)
- associated_tables = _collect_tables_for_refs(pending_tables, relevant_refs)
-
- payload = {
- "instructions": {
- "target_voice": target_voice,
- "constraints": constraints,
- "guidance": [
- "Preserve numbering, bullet markers, and formatting cues.",
- "Do not alter calculations, quantities, or engineering values.",
- "Only update normative references and surrounding wording necessary for clarity.",
- "Maintain section titles and headings.",
- ],
- },
- "paragraphs": chunk,
- "tables": associated_tables,
- "mappings": mapping_subset,
- }
-
- schema = _rewrite_schema()
- messages = [
- {
- "role": "system",
- "content": (
- "You are an engineering editor updating a report so its references align with the target standards. "
- "Return JSON matching the schema. Maintain original structure and numbering while replacing each reference with the mapped target references."
- ),
- },
- {
- "role": "user",
- "content": (
- f"Session: {context.session_id}\n"
- f"Payload: {json.dumps(payload, ensure_ascii=False)}"
- ),
- },
- ]
-
- try:
- result = await self.call_openai_json(
- model=self.settings.openai_model_rewrite,
- messages=messages,
- schema=schema,
- )
- aggregated_replacements.extend(result.get("replacements", []))
- aggregated_table_replacements.extend(result.get("table_replacements", []))
- for entry in result.get("change_log", []):
- key = (
- entry.get("reference"),
- entry.get("target_reference"),
- tuple(entry.get("affected_paragraphs", [])),
- )
- if key not in change_log_seen:
- change_log_entries.append(entry)
- change_log_seen.add(key)
- if result.get("notes"):
- notes.append(result["notes"])
- except Exception as exc: # noqa: BLE001
- await self.emit_debug(f"Rewrite chunk failed: {exc}")
- notes.append(f"Rewrite chunk failed: {exc}")
-
- if not aggregated_replacements and not aggregated_table_replacements:
- return {
- "replacements": [],
- "table_replacements": [],
- "change_log": change_log_entries,
- "notes": "Rewrite completed but no updates were suggested.",
- }
-
- return {
- "replacements": aggregated_replacements,
- "table_replacements": aggregated_table_replacements,
- "change_log": change_log_entries,
- "notes": " ".join(notes).strip(),
- }
-
-
-def _index_mappings(mappings: Sequence[Dict[str, Any]]) -> Dict[str, List[Dict[str, Any]]]:
- index: Dict[str, List[Dict[str, Any]]] = {}
- for mapping in mappings:
- ref = mapping.get("source_reference")
- if not isinstance(ref, str):
- continue
- index.setdefault(ref, []).append(mapping)
- return index
-
-
-def _collect_mapping_subset(
- mapping_by_reference: Dict[str, List[Dict[str, Any]]],
- references: Sequence[str],
-) -> List[Dict[str, Any]]:
- subset: List[Dict[str, Any]] = []
- for reference in references:
- subset.extend(mapping_by_reference.get(reference, []))
- return subset
-
-
-def _collect_tables_for_refs(
- pending_tables: Dict[int, Dict[str, Any]],
- references: Sequence[str],
-) -> List[Dict[str, Any]]:
- matched: List[Dict[str, Any]] = []
- for index in list(pending_tables.keys()):
- table = pending_tables[index]
- if any(ref in references for ref in table["references"]):
- matched.append(table)
- pending_tables.pop(index, None)
- return matched
-
-
-def _normalise_paragraphs(items: Sequence[Dict[str, Any]], max_chars: int) -> List[Dict[str, Any]]:
- paragraphs: List[Dict[str, Any]] = []
- for item in items:
- index = item.get("index")
- if index is None:
- continue
- text = str(item.get("text", ""))
- if len(text) > max_chars:
- text = text[:max_chars] + "...(trimmed)"
- paragraphs.append(
- {
- "index": index,
- "text": text,
- "style": item.get("style"),
- "heading_level": item.get("heading_level"),
- "references": item.get("references", []),
- }
- )
- return paragraphs
-
-
-def _normalise_tables(items: Sequence[Dict[str, Any]], max_chars: int) -> List[Dict[str, Any]]:
- tables: List[Dict[str, Any]] = []
- for item in items:
- index = item.get("index")
- if index is None:
- continue
- rows = []
- for row in item.get("rows", []):
- preview_row = []
- for cell in row:
- cell_text = str(cell)
- if len(cell_text) > max_chars:
- cell_text = cell_text[:max_chars] + "...(trimmed)"
- preview_row.append(cell_text)
- rows.append(preview_row)
- tables.append(
- {
- "index": index,
- "rows": rows,
- "references": item.get("references", []),
- }
- )
- return tables
-
-
-def _chunk_list(items: Sequence[Dict[str, Any]], size: int) -> Iterable[List[Dict[str, Any]]]:
- if size <= 0:
- size = len(items) or 1
- for idx in range(0, len(items), size):
- yield list(items[idx : idx + size])
-
-
-def _rewrite_schema() -> Dict[str, Any]:
- return {
- "name": "RewritePlanChunk",
- "schema": {
- "type": "object",
- "properties": {
- "replacements": {
- "type": "array",
- "items": {
- "type": "object",
- "properties": {
- "paragraph_index": {"type": "integer"},
- "original_text": {"type": "string"},
- "updated_text": {"type": "string"},
- "applied_mappings": {
- "type": "array",
- "items": {"type": "string"},
- "default": [],
- },
- "change_reason": {"type": "string"},
- },
- "required": ["paragraph_index", "updated_text"],
- "additionalProperties": False,
- },
- "default": [],
- },
- "table_replacements": {
- "type": "array",
- "items": {
- "type": "object",
- "properties": {
- "table_index": {"type": "integer"},
- "updated_rows": {
- "type": "array",
- "items": {
- "type": "array",
- "items": {"type": "string"},
- },
- "default": [],
- },
- "applied_mappings": {
- "type": "array",
- "items": {"type": "string"},
- "default": [],
- },
- "change_reason": {"type": "string"},
- },
- "required": ["table_index"],
- "additionalProperties": False,
- },
- "default": [],
- },
- "change_log": {
- "type": "array",
- "items": {
- "type": "object",
- "properties": {
- "reference": {"type": "string"},
- "target_reference": {"type": "string"},
- "affected_paragraphs": {
- "type": "array",
- "items": {"type": "integer"},
- "default": [],
- },
- "note": {"type": "string"},
- },
- "required": ["reference", "target_reference"],
- "additionalProperties": False,
- },
- "default": [],
- },
- "notes": {"type": "string"},
- },
- "required": ["replacements", "table_replacements", "change_log"],
- "additionalProperties": False,
- },
- }
diff --git a/agents/shared/base.py b/agents/shared/base.py
deleted file mode 100644
index 18599b1deb29438a0a226bba2ad4d3195b7bd7bc..0000000000000000000000000000000000000000
--- a/agents/shared/base.py
+++ /dev/null
@@ -1,49 +0,0 @@
-from __future__ import annotations
-
-from abc import ABC, abstractmethod
-from dataclasses import dataclass, field
-from typing import Any, Dict
-
-# We import config lazily to avoid circular imports during module initialisation.
-from typing import TYPE_CHECKING
-
-if TYPE_CHECKING:
- from server.app.core.config import Settings # pragma: no cover
-
-
-@dataclass
-class AgentContext:
- session_id: str
- payload: Dict[str, Any] = field(default_factory=dict)
-
-
-class BaseAgent(ABC):
- name: str
-
- @abstractmethod
- async def run(self, context: AgentContext) -> Dict[str, Any]:
- """Execute agent logic and return structured output."""
-
- async def emit_debug(self, message: str) -> None:
- # Placeholder until logging/event bus is wired in.
- print(f"[{self.name}] {message}")
-
- @property
- def settings(self):
- from server.app.core.config import get_settings # import here to avoid circular dependency
-
- return get_settings()
-
- async def call_openai_json(
- self,
- *,
- model: str,
- messages: list[Dict[str, Any]],
- schema: Dict[str, Any],
- ) -> Dict[str, Any]:
- from .client import create_json_response # import here to avoid circular dependency
-
- if not self.settings.openai_api_key:
- await self.emit_debug("OpenAI API key missing; returning empty response.")
- raise RuntimeError("OpenAI API key missing")
- return await create_json_response(model=model, messages=messages, schema=schema)
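
The base class above leaves `run` abstract and provides `emit_debug`, `settings`, and `call_openai_json`. A minimal sketch of a concrete agent against that contract (the class itself is hypothetical):

```python
# Hypothetical agent built on the deleted BaseAgent/AgentContext contract.
from typing import Any, Dict

from agents.shared.base import AgentContext, BaseAgent


class EchoAgent(BaseAgent):
    name = "echo-agent"

    async def run(self, context: AgentContext) -> Dict[str, Any]:
        await self.emit_debug(f"Echoing payload for session {context.session_id}")
        # Real agents would call self.call_openai_json(...) here; this
        # sketch just returns the payload unchanged.
        return {"echo": context.payload}
```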
diff --git a/agents/shared/client.py b/agents/shared/client.py
deleted file mode 100644
index 844e37d9c8b90bd0d887d9756f0bcb32d2247f21..0000000000000000000000000000000000000000
--- a/agents/shared/client.py
+++ /dev/null
@@ -1,97 +0,0 @@
-from __future__ import annotations
-
-import json
-import logging
-from functools import lru_cache
-from typing import Any, Iterable
-
-from openai import AsyncOpenAI, OpenAIError
-
-try:
- from server.app.core.config import get_settings
- from server.app.services.diagnostics_service import get_diagnostics_service
-except ModuleNotFoundError as exc: # pragma: no cover
- raise RuntimeError(
- "Failed to import server configuration. Ensure the project root is on PYTHONPATH."
- ) from exc
-
-logger = logging.getLogger(__name__)
-
-
-@lru_cache
-def get_openai_client() -> AsyncOpenAI:
- settings = get_settings()
- if not settings.openai_api_key:
- diagnostics = get_diagnostics_service()
- diagnostics.record_event(
- node_id="openai",
- event_type="openai.missing_key",
- message="OpenAI API key missing; requests will fail.",
- metadata={},
- )
- raise RuntimeError(
- "OpenAI API key is not configured. Set RIGHTCODES_OPENAI_API_KEY before invoking agents."
- )
- diagnostics = get_diagnostics_service()
- diagnostics.record_event(
- node_id="openai",
- event_type="openai.client_ready",
- message="OpenAI client initialised.",
- metadata={},
- )
- return AsyncOpenAI(api_key=settings.openai_api_key, base_url=settings.openai_api_base)
-
-
-async def create_json_response(
- *,
- model: str,
- messages: Iterable[dict[str, Any]],
- schema: dict[str, Any],
-) -> dict[str, Any]:
- """Invoke OpenAI with a JSON schema response format."""
- client = get_openai_client()
- diagnostics = get_diagnostics_service()
- diagnostics.record_event(
- node_id="openai",
- event_type="openai.request",
- message=f"Requesting model `{model}`",
- metadata={"model": model},
- )
- try:
- response = await client.chat.completions.create(
- model=model,
- messages=list(messages),
- response_format={"type": "json_schema", "json_schema": schema},
- )
- except OpenAIError as exc:
- logger.exception("OpenAI call failed: %s", exc)
- diagnostics.record_event(
- node_id="openai",
- event_type="openai.error",
- message="OpenAI request failed.",
- metadata={"model": model, "error": str(exc)},
- )
- raise
-
- try:
- choice = response.choices[0]
- content = choice.message.content if choice and choice.message else None
- if not content:
- raise RuntimeError("OpenAI response did not include message content.")
- payload = json.loads(content)
- diagnostics.record_event(
- node_id="openai",
- event_type="openai.response",
- message="Received OpenAI response.",
- metadata={"model": model},
- )
- return payload
- except (AttributeError, json.JSONDecodeError) as exc:
- logger.exception("Failed to decode OpenAI response: %s", exc)
- diagnostics.record_event(
- node_id="openai",
- event_type="openai.error",
- message="Failed to decode OpenAI response.",
- metadata={"model": model, "error": str(exc)},
- )
- raise RuntimeError("OpenAI response was not valid JSON.") from exc
diff --git a/agents/shared/embeddings.py b/agents/shared/embeddings.py
deleted file mode 100644
index 477896aaf1cc4a4c1282854d900c1f66d7adf894..0000000000000000000000000000000000000000
--- a/agents/shared/embeddings.py
+++ /dev/null
@@ -1,19 +0,0 @@
-from __future__ import annotations
-
-from typing import Iterable, List, Sequence
-
-
-
-async def embed_texts(texts: Iterable[str]) -> List[List[float]]:
- texts = [text if text else "" for text in texts]
- if not texts:
- return []
- from ..shared.client import get_openai_client
- client = get_openai_client()
- from server.app.core.config import get_settings
- settings = get_settings()
- response = await client.embeddings.create(
- model=settings.openai_model_embed,
- input=list(texts),
- )
- return [item.embedding for item in response.data]
diff --git a/agents/standards_mapping/__init__.py b/agents/standards_mapping/__init__.py
deleted file mode 100644
index dd6d92a3e4ec2ba4a30f39df0e58e9885e6ee216..0000000000000000000000000000000000000000
--- a/agents/standards_mapping/__init__.py
+++ /dev/null
@@ -1,3 +0,0 @@
-from .agent import StandardsMappingAgent
-
-__all__ = ["StandardsMappingAgent"]
diff --git a/agents/standards_mapping/agent.py b/agents/standards_mapping/agent.py
deleted file mode 100644
index fd9085f5b6eb8a8673d1674e40d6f20a0a5fbd77..0000000000000000000000000000000000000000
--- a/agents/standards_mapping/agent.py
+++ /dev/null
@@ -1,293 +0,0 @@
-from __future__ import annotations
-
-from dataclasses import asdict, is_dataclass
-import json
-from pathlib import Path
-from typing import Any, Dict, Iterable, List
-
-from ..shared.base import AgentContext, BaseAgent
-from ..shared.embeddings import embed_texts
-from common.embedding_store import EmbeddingStore, get_session_embedding_path
-
-
-class StandardsMappingAgent(BaseAgent):
- name = "standards-mapping-agent"
- reference_chunk_size = 20
- max_excerpt_chars = 800
-
- async def run(self, context: AgentContext) -> Dict[str, Any]:
- extraction_result = context.payload.get("extraction_result") or {}
- references: List[str] = extraction_result.get("references") or []
- sections = extraction_result.get("sections") or []
- tables = extraction_result.get("tables") or []
- standards_chunks = _normalise_items(context.payload.get("standards_chunks", []))
- target_metadata = context.payload.get("target_metadata", {})
- store = EmbeddingStore(get_session_embedding_path(context.session_id))
-
- if not references or not standards_chunks:
- await self.emit_debug("Insufficient data for standards mapping.")
- return {
- "mappings": [],
- "unmapped_references": references,
- "notes": "Mapping skipped due to missing references or standards content.",
- }
-
- schema = {
- "name": "StandardsMapping",
- "schema": {
- "type": "object",
- "properties": {
- "mappings": {
- "type": "array",
- "items": {
- "type": "object",
- "properties": {
- "source_reference": {"type": "string"},
- "source_context": {"type": "string"},
- "target_reference": {"type": "string"},
- "target_clause": {"type": "string"},
- "target_summary": {"type": "string"},
- "confidence": {"type": "number"},
- "rationale": {"type": "string"},
- },
- "required": [
- "source_reference",
- "target_reference",
- "confidence",
- ],
- "additionalProperties": False,
- },
- "default": [],
- },
- "unmapped_references": {
- "type": "array",
- "items": {"type": "string"},
- "default": [],
- },
- "notes": {"type": "string"},
- },
- "required": ["mappings", "unmapped_references"],
- "additionalProperties": False,
- },
- }
-
- standards_overview = _build_standards_overview(standards_chunks, self.max_excerpt_chars)
-
- model = self.settings.openai_model_mapping
- aggregated_mappings: List[Dict[str, Any]] = []
- aggregated_unmapped: set[str] = set()
- notes: List[str] = []
-
- for chunk in _chunk_list(references, self.reference_chunk_size):
- reference_context = _build_reference_context(
- chunk, sections, tables, self.max_excerpt_chars
- )
- retrieved_candidates = await _retrieve_candidates(
- chunk, reference_context, store, target_metadata, self.max_excerpt_chars
- )
- payload = {
- "references": chunk,
- "reference_context": reference_context,
- "retrieved_candidates": [
- {"reference": ref, "candidates": retrieved_candidates.get(ref, [])}
- for ref in chunk
- ],
- "standards_overview": standards_overview,
- "target_metadata": target_metadata,
- }
-
- messages = [
- {
- "role": "system",
- "content": (
- "You are an engineering standards migration specialist. "
- "Map each legacy reference to the best matching clause in the target standards. "
- "Use the provided context and standards overview to justify your mapping. "
- "Return JSON that conforms to the supplied schema."
- ),
- },
- {
- "role": "user",
- "content": (
- f"Session: {context.session_id}\n"
- f"Payload: {json.dumps(payload, ensure_ascii=False)}"
- ),
- },
- ]
-
- try:
- result = await self.call_openai_json(model=model, messages=messages, schema=schema)
- aggregated_mappings.extend(result.get("mappings", []))
- aggregated_unmapped.update(result.get("unmapped_references", []))
- if result.get("notes"):
- notes.append(result["notes"])
- except Exception as exc: # noqa: BLE001
- await self.emit_debug(f"Standards mapping chunk failed: {exc}")
- aggregated_unmapped.update(chunk)
- notes.append(f"Chunk failed: {exc}")
-
- await self.emit_debug("Standards mapping completed via OpenAI.")
- return {
- "mappings": aggregated_mappings,
- "unmapped_references": sorted(aggregated_unmapped),
- "notes": " ".join(notes).strip(),
- }
-
-
-def _normalise_items(items: List[Any]) -> List[Dict[str, Any]]:
- normalised: List[Dict[str, Any]] = []
- for item in items:
- if is_dataclass(item):
- normalised.append(asdict(item))
- elif isinstance(item, dict):
- normalised.append(item)
- else:
- normalised.append({"text": str(item)})
- return normalised
-
-
-def _chunk_list(items: List[str], size: int) -> Iterable[List[str]]:
- if size <= 0:
- size = len(items) or 1
- for idx in range(0, len(items), size):
- yield items[idx : idx + size]
-
-
-def _build_reference_context(
- references: List[str],
- sections: List[Dict[str, Any]],
- tables: List[Dict[str, Any]],
- max_chars: int,
-) -> List[Dict[str, Any]]:
- section_map: Dict[str, List[Dict[str, Any]]] = {}
- for section in sections:
- refs = section.get("references") or []
- for ref in refs:
- section_map.setdefault(ref, [])
- if len(section_map[ref]) < 3:
- text = section.get("text", "")
- if len(text) > max_chars:
- text = text[:max_chars] + "...(trimmed)"
- section_map[ref].append(
- {
- "paragraph_index": section.get("paragraph_index"),
- "text": text,
- }
- )
- table_map: Dict[str, List[Dict[str, Any]]] = {}
- for table in tables:
- refs = table.get("references") or []
- for ref in refs:
- table_map.setdefault(ref, [])
- if len(table_map[ref]) < 2:
- table_map[ref].append({"table_index": table.get("table_index"), "references": refs})
-
- context = []
- for ref in references:
- context.append(
- {
- "reference": ref,
- "paragraphs": section_map.get(ref, []),
- "tables": table_map.get(ref, []),
- }
- )
- return context
-
-
-def _build_standards_overview(
- standards_chunks: List[Dict[str, Any]],
- max_chars: int,
-) -> List[Dict[str, Any]]:
- grouped: Dict[str, Dict[str, Any]] = {}
- for chunk in standards_chunks:
- path = chunk.get("path", "unknown")
- heading = chunk.get("heading")
- clauses = chunk.get("clause_numbers") or []
- text = chunk.get("text", "")
- if len(text) > max_chars:
- text = text[:max_chars] + "...(trimmed)"
-
- group = grouped.setdefault(
- path,
- {
- "document": Path(path).name,
- "headings": [],
- "clauses": [],
- "snippets": [],
- },
- )
- if heading and heading not in group["headings"] and len(group["headings"]) < 120:
- group["headings"].append(heading)
- for clause in clauses:
- if clause not in group["clauses"] and len(group["clauses"]) < 120:
- group["clauses"].append(clause)
- if text and len(group["snippets"]) < 30:
- group["snippets"].append(text)
-
- overview: List[Dict[str, Any]] = []
- for data in grouped.values():
- overview.append(
- {
- "document": data["document"],
- "headings": data["headings"][:50],
- "clauses": data["clauses"][:50],
- "snippets": data["snippets"],
- }
- )
- return overview[:30]
-
-
-async def _retrieve_candidates(
- references: List[str],
- reference_context: List[Dict[str, Any]],
- store: EmbeddingStore,
- target_metadata: Dict[str, Any],
- max_chars: int,
-) -> Dict[str, List[Dict[str, Any]]]:
- if not references:
- return {}
- if store.is_empty:
- return {ref: [] for ref in references}
-
- context_lookup = {entry["reference"]: entry for entry in reference_context}
- embed_inputs = [
- _compose_reference_embedding_input(
- reference,
- context_lookup.get(reference, {}),
- target_metadata,
- max_chars,
- )
- for reference in references
- ]
- vectors = await embed_texts(embed_inputs)
- results: Dict[str, List[Dict[str, Any]]] = {}
- for reference, vector in zip(references, vectors):
- candidates = store.query(vector, top_k=8)
- results[reference] = candidates
- return results
-
-
-def _compose_reference_embedding_input(
- reference: str,
- context_entry: Dict[str, Any],
- target_metadata: Dict[str, Any],
- max_chars: int,
-) -> str:
- lines = [reference]
- target_standard = target_metadata.get("target_standard")
- if target_standard:
- lines.append(f"Target standard family: {target_standard}")
- paragraphs = context_entry.get("paragraphs") or []
- for paragraph in paragraphs[:2]:
- text = paragraph.get("text")
- if text:
- lines.append(text)
- tables = context_entry.get("tables") or []
- if tables:
- refs = tables[0].get("references") or []
- if refs:
- lines.append("Table references: " + ", ".join(refs))
- text = "\n".join(filter(None, lines))
- if len(text) > max_chars:
- text = text[:max_chars] + "...(trimmed)"
- return text
diff --git a/agents/validation/__init__.py b/agents/validation/__init__.py
deleted file mode 100644
index a4849b85267494d4c09d2f05d63d16c9ed23b644..0000000000000000000000000000000000000000
--- a/agents/validation/__init__.py
+++ /dev/null
@@ -1,3 +0,0 @@
-from .agent import ValidationAgent
-
-__all__ = ["ValidationAgent"]
diff --git a/agents/validation/agent.py b/agents/validation/agent.py
deleted file mode 100644
index 25f2898232cda7286e88a49ada7c969c575736cb..0000000000000000000000000000000000000000
--- a/agents/validation/agent.py
+++ /dev/null
@@ -1,96 +0,0 @@
-from __future__ import annotations
-
-import json
-from typing import Any, Dict, List
-
-from ..shared.base import AgentContext, BaseAgent
-
-
-class ValidationAgent(BaseAgent):
- name = "validation-agent"
-
- async def run(self, context: AgentContext) -> Dict[str, Any]:
- await self.emit_debug("Running compliance checks.")
-
- extraction = context.payload.get("extraction_result") or {}
- mapping = context.payload.get("mapping_result") or {}
- rewrite_plan = context.payload.get("rewrite_plan") or {}
-
- if not mapping or not rewrite_plan:
- return {
- "issues": [],
- "verdict": "pending",
- "notes": "Validation skipped because mapping or rewrite data was unavailable.",
- }
-
- schema = {
- "name": "ValidationReport",
- "schema": {
- "type": "object",
- "properties": {
- "verdict": {
- "type": "string",
- "enum": ["approved", "changes_requested", "pending"],
- },
- "issues": {
- "type": "array",
- "items": {
- "type": "object",
- "properties": {
- "description": {"type": "string"},
- "severity": {
- "type": "string",
- "enum": ["info", "low", "medium", "high"],
- },
- "related_reference": {"type": "string"},
- },
- "required": ["description", "severity"],
- "additionalProperties": False,
- },
- "default": [],
- },
- "notes": {"type": "string"},
- },
- "required": ["verdict", "issues"],
- "additionalProperties": False,
- },
- }
-
- mapping_snippet = json.dumps(mapping.get("mappings", [])[:20], ensure_ascii=False)
- rewrite_snippet = json.dumps(rewrite_plan.get("replacements", [])[:20], ensure_ascii=False)
- references = extraction.get("references", [])
-
- messages = [
- {
- "role": "system",
- "content": (
- "You are a senior structural engineer reviewing a standards migration. "
- "Evaluate whether the proposed replacements maintain compliance and highlight any risks."
- ),
- },
- {
- "role": "user",
- "content": (
- f"Session: {context.session_id}\n"
- f"Detected references: {references}\n"
- f"Mappings sample: {mapping_snippet}\n"
- f"Rewrite sample: {rewrite_snippet}"
- ),
- },
- ]
-
- model = self.settings.openai_model_mapping
-
- try:
- result = await self.call_openai_json(model=model, messages=messages, schema=schema)
- await self.emit_debug("Validation agent completed via OpenAI.")
- if not result.get("notes"):
- result["notes"] = "Review generated by automated validation agent."
- return result
- except Exception as exc: # noqa: BLE001
- await self.emit_debug(f"Validation agent failed: {exc}")
- return {
- "issues": [],
- "verdict": "pending",
- "notes": f"Validation failed: {exc}",
- }
diff --git a/common/README.md b/common/README.md
deleted file mode 100644
index 341121688861594c295816ec788c7a24e495d418..0000000000000000000000000000000000000000
--- a/common/README.md
+++ /dev/null
@@ -1,10 +0,0 @@
-# Common Domain Assets
-
-Cross-cutting models, schemas, and events shared between backend services, agents, and workers.
-
-## Structure
-
-- `models/` - Core domain entities and DTOs reused across services.
-- `utils/` - Reusable helper functions (text normalization, ID generation).
-- `schemas/` - JSON schema definitions for agent input/output contracts.
-- `events/` - Event payload definitions for pipeline instrumentation.
diff --git a/common/embedding_store.py b/common/embedding_store.py
deleted file mode 100644
index 53e9e02091288bf536638e9f1e492f4644108173..0000000000000000000000000000000000000000
--- a/common/embedding_store.py
+++ /dev/null
@@ -1,98 +0,0 @@
-from __future__ import annotations
-
-import json
-from dataclasses import dataclass
-from pathlib import Path
-from typing import Any, List, Sequence, Tuple
-
-import numpy as np
-
-
-def _resolve_embedding_root() -> Path:
- try:
- from server.app.core.config import get_settings # local import to avoid hard dependency at import time
-
- storage_dir = get_settings().storage_dir
- except Exception: # noqa: BLE001
- storage_dir = Path(__file__).resolve().parents[1] / "storage"
- return Path(storage_dir) / "embeddings"
-
-
-def get_session_embedding_path(session_id: str) -> Path:
- return _resolve_embedding_root() / f"{session_id}.json"
-
-
-@dataclass
-class EmbeddingRecord:
- vector: List[float]
- metadata: dict[str, Any]
-
-
-class EmbeddingStore:
- def __init__(self, path: Path) -> None:
- self.path = path
- self._records: list[EmbeddingRecord] = []
- self._matrix: np.ndarray | None = None
- self._load()
-
- def _load(self) -> None:
- if not self.path.exists():
- return
- with self.path.open("r", encoding="utf-8") as fh:
- data = json.load(fh)
- self._records = [EmbeddingRecord(**item) for item in data]
-
- def save(self) -> None:
- self.path.parent.mkdir(parents=True, exist_ok=True)
- with self.path.open("w", encoding="utf-8") as fh:
- json.dump(
- [{"vector": rec.vector, "metadata": rec.metadata} for rec in self._records],
- fh,
- ensure_ascii=False,
- indent=2,
- )
-
- def clear(self) -> None:
- self._records.clear()
- self._matrix = None
-
- def extend(self, vectors: Sequence[Sequence[float]], metadatas: Sequence[dict[str, Any]]) -> None:
- for vector, metadata in zip(vectors, metadatas, strict=True):
- self._records.append(EmbeddingRecord(list(vector), dict(metadata)))
- self._matrix = None
-
- @property
- def is_empty(self) -> bool:
- return not self._records
-
- def _ensure_matrix(self) -> None:
- if self._matrix is None and self._records:
- self._matrix = np.array([rec.vector for rec in self._records], dtype=np.float32)
-
- def query(self, vector: Sequence[float], top_k: int = 5) -> List[Tuple[dict[str, Any], float]]:
- if self.is_empty:
- return []
- self._ensure_matrix()
- assert self._matrix is not None
- matrix = self._matrix
- vec = np.array(vector, dtype=np.float32)
- vec_norm = np.linalg.norm(vec)
- if not np.isfinite(vec_norm) or vec_norm == 0:
- return []
- matrix_norms = np.linalg.norm(matrix, axis=1)
- scores = matrix @ vec / (matrix_norms * vec_norm + 1e-12)
- top_k = min(top_k, len(scores))
- indices = np.argsort(scores)[::-1][:top_k]
- results: List[Tuple[dict[str, Any], float]] = []
- for idx in indices:
- score = float(scores[idx])
- metadata = self._records[int(idx)].metadata.copy()
- metadata["score"] = score
- results.append((metadata, score))
- return results
-
- def query_many(self, vectors: Sequence[Sequence[float]], top_k: int = 5) -> List[List[dict[str, Any]]]:
- return [
- [meta for meta, _ in self.query(vector, top_k=top_k)]
- for vector in vectors
- ]
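
The deleted store ranks records by cosine similarity (`matrix @ vec` divided by the norms) and returns `(metadata, score)` pairs. A toy walkthrough under that API, with made-up vectors and a hypothetical path:

```python
# Toy usage of the deleted common/embedding_store.py API.
from pathlib import Path

from common.embedding_store import EmbeddingStore

store = EmbeddingStore(Path("/tmp/demo-embeddings.json"))  # hypothetical path
store.extend(
    vectors=[[1.0, 0.0], [0.0, 1.0]],
    metadatas=[{"clause": "A.1"}, {"clause": "B.2"}],
)

# query() scores every stored vector against the probe and returns the
# top_k (metadata, score) pairs, highest cosine similarity first.
for metadata, score in store.query([0.9, 0.1], top_k=2):
    print(metadata["clause"], round(score, 3))  # "A.1" ranks first
```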
diff --git a/docker/Dockerfile b/docker/Dockerfile
deleted file mode 100644
index 5bdfbdb03e1878a8b9ad6516a56c0e44bff8955f..0000000000000000000000000000000000000000
--- a/docker/Dockerfile
+++ /dev/null
@@ -1,20 +0,0 @@
-FROM node:20-bookworm
-
-RUN apt-get update \
- && apt-get install -y --no-install-recommends python3 python3-venv python3-pip \
- && rm -rf /var/lib/apt/lists/*
-
-WORKDIR /app
-
-COPY . /app
-
-RUN python3 -m venv /app/.venv \
- && /app/.venv/bin/pip install --upgrade pip \
- && /app/.venv/bin/pip install -r server/requirements.txt \
- && npm install --prefix frontend
-
-ENV PATH="/app/.venv/bin:${PATH}"
-
-EXPOSE 8000 5173 8765
-
-CMD ["python3", "start-rightcodes.py", "--host", "0.0.0.0", "--port", "8765", "--no-browser"]
diff --git a/docs/README.md b/docs/README.md
deleted file mode 100644
index 84d35cdbcf899cd112f781634c6bb05dcf729f8a..0000000000000000000000000000000000000000
--- a/docs/README.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Documentation Hub
-
-Authoritative documentation for architecture, pipelines, agents, and UI workflows.
-
-## Structure
-
-- `architecture/` - System diagrams, deployment topology, and context views.
-- `agents/` - Detailed specs of each agent, including prompt design and tool APIs.
-- `pipelines/` - Pypeflow diagrams, data contracts, and runbooks for each stage.
-- `standards/` - Reference material and taxonomy notes for supported codes and standards.
-- `ui/` - Wireframes, component inventories, and interaction design specs.
-- `specifications/` - Functional requirements, acceptance criteria, and use cases.
diff --git a/docs/agents/orchestrator.md b/docs/agents/orchestrator.md
deleted file mode 100644
index 90612eb6904252d4fc5cb3c1eacb0e8b35493796..0000000000000000000000000000000000000000
--- a/docs/agents/orchestrator.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Orchestrator Agent (Draft)
-
-- **Goal:** Coordinate pipeline stages, persist state transitions, and request human intervention when confidence drops below threshold.
-- **Inputs:** `document_manifest.json`, latest stage outputs, user session preferences.
-- **Outputs:** `progress_state.json`, downstream agent invocation plans, notifications/events.
-- **Tool Hooks:** Worker queue enqueuer, storage manifest writer, validation reporter.
-
-Action items:
-
-1. Define structured prompt schema and guardrails.
-2. Enumerate tool signatures for queueing, status updates, and failure escalation.
-3. Align logging with `common/events/` payload definitions.
diff --git a/docs/architecture/system-context.md b/docs/architecture/system-context.md
deleted file mode 100644
index 5409aa1f2bf8d8fb5b5045c879da0bf58809b238..0000000000000000000000000000000000000000
--- a/docs/architecture/system-context.md
+++ /dev/null
@@ -1,11 +0,0 @@
-# System Context (Draft)
-
-- **Primary Actors:** Engineering consultant (user), Orchestrator Agent, Validation Agent, Export Agent.
-- **External Systems:** OpenAI APIs, Object storage (S3-compatible), Auth provider (to be determined).
-- **Key Data Stores:** Session manifest store, document blob storage, telemetry pipeline.
-
-Pending tasks:
-
-1. Complete C4 level 1 context diagram.
-2. Document trust boundaries (uploaded documents vs generated artefacts).
-3. Define audit/logging requirements tied to standards compliance.
diff --git a/docs/pipelines/pypeflow-overview.md b/docs/pipelines/pypeflow-overview.md
deleted file mode 100644
index 703bb73027ac081539a8ced719502ca36ccc4b0f..0000000000000000000000000000000000000000
--- a/docs/pipelines/pypeflow-overview.md
+++ /dev/null
@@ -1,8 +0,0 @@
-# Pypeflow Overview
-
-> Placeholder: document the end-to-end job graph once the first pipeline prototype lands. Recommended sections:
->
-> 1. High-level mermaid diagram showing ingestion -> extraction -> mapping -> rewrite -> validation -> export.
-> 2. Stage-by-stage JSON artefact expectations (input/output schemas).
-> 3. Failure handling and retry strategy per stage.
-> 4. Hooks for agent overrides and manual approvals.
diff --git a/docs/ui/review-workflow.md b/docs/ui/review-workflow.md
deleted file mode 100644
index c7620c6b84027911080dfa98b58d5d79aa089c87..0000000000000000000000000000000000000000
--- a/docs/ui/review-workflow.md
+++ /dev/null
@@ -1,17 +0,0 @@
-# Review Workflow (Draft)
-
-Stages:
-
-1. Upload wizard captures Word report, one or more standards PDFs, and mapping intent.
-2. Diff workspace highlights replaced clauses with inline confidence tags.
-3. Validation dashboard lists outstanding checks, comments, and approval history.
-
-Open questions:
-
-- How should we present clause-level provenance (link back to PDF page)?
-- Do we surface agent rationales verbatim or summarised?
-- What accessibility requirements should inform colour coding and indicators?
-
-UI notes:
-
-- Show upload progress and pipeline activity log so users know when each processing stage completes.
diff --git a/frontend/README.md b/frontend/README.md
deleted file mode 100644
index caf1360ac065ecf20dd61c21e009cfe746ff173e..0000000000000000000000000000000000000000
--- a/frontend/README.md
+++ /dev/null
@@ -1,17 +0,0 @@
-# Frontend Module
-
-React/TypeScript single-page app responsible for user interaction and review workflows.
-
-## Structure
-
-- `public/` - Static assets and HTML shell.
-- `src/components/` - Shared UI components (uploaders, diff panels, status widgets).
-- `src/pages/` - Route-level containers for upload, review, and export views.
-- `src/hooks/` - Reusable logic for API access, session state, and polling.
-- `src/layouts/` - Shell layouts (wizard, review workspace).
-- `src/state/` - Store configuration (React Query, Zustand, or Redux).
-- `src/services/` - API clients, WebSocket connectors, and agent progress handlers.
-- `src/utils/` - Formatting helpers, doc diff utilities, and schema transformers.
-- `src/types/` - Shared TypeScript declarations.
-- `tests/` - Component and integration tests with fixtures and mocks.
-- `config/` - Build-time configuration (Vite/Webpack, env samples).
diff --git a/frontend/index.html b/frontend/index.html
index bffc5e451f87fa4a8cc5c538c7460f7ff06461e7..29d85040cb4f153aa1aeb5785eba66110589ff26 100644
--- a/frontend/index.html
+++ b/frontend/index.html
@@ -4,7 +4,7 @@
-    <title>RightCodes</title>
+    <title>Starter</title>
diff --git a/frontend/package-lock.json b/frontend/package-lock.json
deleted file mode 100644
index 44511977d735851dd90dd0f13ddd82198ae3dd10..0000000000000000000000000000000000000000
--- a/frontend/package-lock.json
+++ /dev/null
@@ -1,2030 +0,0 @@
-{
- "name": "rightcodes-frontend",
- "version": "0.1.0",
- "lockfileVersion": 3,
- "requires": true,
- "packages": {
- "": {
- "name": "rightcodes-frontend",
- "version": "0.1.0",
- "dependencies": {
- "@tanstack/react-query": "^5.51.11",
- "axios": "^1.7.7",
- "react": "^18.3.1",
- "react-dom": "^18.3.1",
- "react-router-dom": "^6.26.0"
- },
- "devDependencies": {
- "@types/react": "^18.3.3",
- "@types/react-dom": "^18.3.3",
- "@vitejs/plugin-react": "^4.3.1",
- "typescript": "^5.4.5",
- "vite": "^5.4.8"
- }
- },
- "node_modules/@babel/code-frame": {
- "version": "7.27.1",
- "resolved": "https://registry.npmjs.org/@babel/code-frame/-/code-frame-7.27.1.tgz",
- "integrity": "sha512-cjQ7ZlQ0Mv3b47hABuTevyTuYN4i+loJKGeV9flcCgIK37cCXRh+L1bd3iBHlynerhQ7BhCkn2BPbQUL+rGqFg==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/helper-validator-identifier": "^7.27.1",
- "js-tokens": "^4.0.0",
- "picocolors": "^1.1.1"
- },
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@babel/compat-data": {
- "version": "7.28.4",
- "resolved": "https://registry.npmjs.org/@babel/compat-data/-/compat-data-7.28.4.tgz",
- "integrity": "sha512-YsmSKC29MJwf0gF8Rjjrg5LQCmyh+j/nD8/eP7f+BeoQTKYqs9RoWbjGOdy0+1Ekr68RJZMUOPVQaQisnIo4Rw==",
- "dev": true,
- "license": "MIT",
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@babel/core": {
- "version": "7.28.4",
- "resolved": "https://registry.npmjs.org/@babel/core/-/core-7.28.4.tgz",
- "integrity": "sha512-2BCOP7TN8M+gVDj7/ht3hsaO/B/n5oDbiAyyvnRlNOs+u1o+JWNYTQrmpuNp1/Wq2gcFrI01JAW+paEKDMx/CA==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/code-frame": "^7.27.1",
- "@babel/generator": "^7.28.3",
- "@babel/helper-compilation-targets": "^7.27.2",
- "@babel/helper-module-transforms": "^7.28.3",
- "@babel/helpers": "^7.28.4",
- "@babel/parser": "^7.28.4",
- "@babel/template": "^7.27.2",
- "@babel/traverse": "^7.28.4",
- "@babel/types": "^7.28.4",
- "@jridgewell/remapping": "^2.3.5",
- "convert-source-map": "^2.0.0",
- "debug": "^4.1.0",
- "gensync": "^1.0.0-beta.2",
- "json5": "^2.2.3",
- "semver": "^6.3.1"
- },
- "engines": {
- "node": ">=6.9.0"
- },
- "funding": {
- "type": "opencollective",
- "url": "https://opencollective.com/babel"
- }
- },
- "node_modules/@babel/generator": {
- "version": "7.28.3",
- "resolved": "https://registry.npmjs.org/@babel/generator/-/generator-7.28.3.tgz",
- "integrity": "sha512-3lSpxGgvnmZznmBkCRnVREPUFJv2wrv9iAoFDvADJc0ypmdOxdUtcLeBgBJ6zE0PMeTKnxeQzyk0xTBq4Ep7zw==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/parser": "^7.28.3",
- "@babel/types": "^7.28.2",
- "@jridgewell/gen-mapping": "^0.3.12",
- "@jridgewell/trace-mapping": "^0.3.28",
- "jsesc": "^3.0.2"
- },
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@babel/helper-compilation-targets": {
- "version": "7.27.2",
- "resolved": "https://registry.npmjs.org/@babel/helper-compilation-targets/-/helper-compilation-targets-7.27.2.tgz",
- "integrity": "sha512-2+1thGUUWWjLTYTHZWK1n8Yga0ijBz1XAhUXcKy81rd5g6yh7hGqMp45v7cadSbEHc9G3OTv45SyneRN3ps4DQ==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/compat-data": "^7.27.2",
- "@babel/helper-validator-option": "^7.27.1",
- "browserslist": "^4.24.0",
- "lru-cache": "^5.1.1",
- "semver": "^6.3.1"
- },
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@babel/helper-globals": {
- "version": "7.28.0",
- "resolved": "https://registry.npmjs.org/@babel/helper-globals/-/helper-globals-7.28.0.tgz",
- "integrity": "sha512-+W6cISkXFa1jXsDEdYA8HeevQT/FULhxzR99pxphltZcVaugps53THCeiWA8SguxxpSp3gKPiuYfSWopkLQ4hw==",
- "dev": true,
- "license": "MIT",
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@babel/helper-module-imports": {
- "version": "7.27.1",
- "resolved": "https://registry.npmjs.org/@babel/helper-module-imports/-/helper-module-imports-7.27.1.tgz",
- "integrity": "sha512-0gSFWUPNXNopqtIPQvlD5WgXYI5GY2kP2cCvoT8kczjbfcfuIljTbcWrulD1CIPIX2gt1wghbDy08yE1p+/r3w==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/traverse": "^7.27.1",
- "@babel/types": "^7.27.1"
- },
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@babel/helper-module-transforms": {
- "version": "7.28.3",
- "resolved": "https://registry.npmjs.org/@babel/helper-module-transforms/-/helper-module-transforms-7.28.3.tgz",
- "integrity": "sha512-gytXUbs8k2sXS9PnQptz5o0QnpLL51SwASIORY6XaBKF88nsOT0Zw9szLqlSGQDP/4TljBAD5y98p2U1fqkdsw==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/helper-module-imports": "^7.27.1",
- "@babel/helper-validator-identifier": "^7.27.1",
- "@babel/traverse": "^7.28.3"
- },
- "engines": {
- "node": ">=6.9.0"
- },
- "peerDependencies": {
- "@babel/core": "^7.0.0"
- }
- },
- "node_modules/@babel/helper-plugin-utils": {
- "version": "7.27.1",
- "resolved": "https://registry.npmjs.org/@babel/helper-plugin-utils/-/helper-plugin-utils-7.27.1.tgz",
- "integrity": "sha512-1gn1Up5YXka3YYAHGKpbideQ5Yjf1tDa9qYcgysz+cNCXukyLl6DjPXhD3VRwSb8c0J9tA4b2+rHEZtc6R0tlw==",
- "dev": true,
- "license": "MIT",
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@babel/helper-string-parser": {
- "version": "7.27.1",
- "resolved": "https://registry.npmjs.org/@babel/helper-string-parser/-/helper-string-parser-7.27.1.tgz",
- "integrity": "sha512-qMlSxKbpRlAridDExk92nSobyDdpPijUq2DW6oDnUqd0iOGxmQjyqhMIihI9+zv4LPyZdRje2cavWPbCbWm3eA==",
- "dev": true,
- "license": "MIT",
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@babel/helper-validator-identifier": {
- "version": "7.27.1",
- "resolved": "https://registry.npmjs.org/@babel/helper-validator-identifier/-/helper-validator-identifier-7.27.1.tgz",
- "integrity": "sha512-D2hP9eA+Sqx1kBZgzxZh0y1trbuU+JoDkiEwqhQ36nodYqJwyEIhPSdMNd7lOm/4io72luTPWH20Yda0xOuUow==",
- "dev": true,
- "license": "MIT",
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@babel/helper-validator-option": {
- "version": "7.27.1",
- "resolved": "https://registry.npmjs.org/@babel/helper-validator-option/-/helper-validator-option-7.27.1.tgz",
- "integrity": "sha512-YvjJow9FxbhFFKDSuFnVCe2WxXk1zWc22fFePVNEaWJEu8IrZVlda6N0uHwzZrUM1il7NC9Mlp4MaJYbYd9JSg==",
- "dev": true,
- "license": "MIT",
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@babel/helpers": {
- "version": "7.28.4",
- "resolved": "https://registry.npmjs.org/@babel/helpers/-/helpers-7.28.4.tgz",
- "integrity": "sha512-HFN59MmQXGHVyYadKLVumYsA9dBFun/ldYxipEjzA4196jpLZd8UjEEBLkbEkvfYreDqJhZxYAWFPtrfhNpj4w==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/template": "^7.27.2",
- "@babel/types": "^7.28.4"
- },
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@babel/parser": {
- "version": "7.28.4",
- "resolved": "https://registry.npmjs.org/@babel/parser/-/parser-7.28.4.tgz",
- "integrity": "sha512-yZbBqeM6TkpP9du/I2pUZnJsRMGGvOuIrhjzC1AwHwW+6he4mni6Bp/m8ijn0iOuZuPI2BfkCoSRunpyjnrQKg==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/types": "^7.28.4"
- },
- "bin": {
- "parser": "bin/babel-parser.js"
- },
- "engines": {
- "node": ">=6.0.0"
- }
- },
- "node_modules/@babel/plugin-transform-react-jsx-self": {
- "version": "7.27.1",
- "resolved": "https://registry.npmjs.org/@babel/plugin-transform-react-jsx-self/-/plugin-transform-react-jsx-self-7.27.1.tgz",
- "integrity": "sha512-6UzkCs+ejGdZ5mFFC/OCUrv028ab2fp1znZmCZjAOBKiBK2jXD1O+BPSfX8X2qjJ75fZBMSnQn3Rq2mrBJK2mw==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/helper-plugin-utils": "^7.27.1"
- },
- "engines": {
- "node": ">=6.9.0"
- },
- "peerDependencies": {
- "@babel/core": "^7.0.0-0"
- }
- },
- "node_modules/@babel/plugin-transform-react-jsx-source": {
- "version": "7.27.1",
- "resolved": "https://registry.npmjs.org/@babel/plugin-transform-react-jsx-source/-/plugin-transform-react-jsx-source-7.27.1.tgz",
- "integrity": "sha512-zbwoTsBruTeKB9hSq73ha66iFeJHuaFkUbwvqElnygoNbj/jHRsSeokowZFN3CZ64IvEqcmmkVe89OPXc7ldAw==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/helper-plugin-utils": "^7.27.1"
- },
- "engines": {
- "node": ">=6.9.0"
- },
- "peerDependencies": {
- "@babel/core": "^7.0.0-0"
- }
- },
- "node_modules/@babel/template": {
- "version": "7.27.2",
- "resolved": "https://registry.npmjs.org/@babel/template/-/template-7.27.2.tgz",
- "integrity": "sha512-LPDZ85aEJyYSd18/DkjNh4/y1ntkE5KwUHWTiqgRxruuZL2F1yuHligVHLvcHY2vMHXttKFpJn6LwfI7cw7ODw==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/code-frame": "^7.27.1",
- "@babel/parser": "^7.27.2",
- "@babel/types": "^7.27.1"
- },
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@babel/traverse": {
- "version": "7.28.4",
- "resolved": "https://registry.npmjs.org/@babel/traverse/-/traverse-7.28.4.tgz",
- "integrity": "sha512-YEzuboP2qvQavAcjgQNVgsvHIDv6ZpwXvcvjmyySP2DIMuByS/6ioU5G9pYrWHM6T2YDfc7xga9iNzYOs12CFQ==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/code-frame": "^7.27.1",
- "@babel/generator": "^7.28.3",
- "@babel/helper-globals": "^7.28.0",
- "@babel/parser": "^7.28.4",
- "@babel/template": "^7.27.2",
- "@babel/types": "^7.28.4",
- "debug": "^4.3.1"
- },
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@babel/types": {
- "version": "7.28.4",
- "resolved": "https://registry.npmjs.org/@babel/types/-/types-7.28.4.tgz",
- "integrity": "sha512-bkFqkLhh3pMBUQQkpVgWDWq/lqzc2678eUyDlTBhRqhCHFguYYGM0Efga7tYk4TogG/3x0EEl66/OQ+WGbWB/Q==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/helper-string-parser": "^7.27.1",
- "@babel/helper-validator-identifier": "^7.27.1"
- },
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/@esbuild/aix-ppc64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/aix-ppc64/-/aix-ppc64-0.21.5.tgz",
- "integrity": "sha512-1SDgH6ZSPTlggy1yI6+Dbkiz8xzpHJEVAlF/AM1tHPLsf5STom9rwtjE4hKAF20FfXXNTFqEYXyJNWh1GiZedQ==",
- "cpu": [
- "ppc64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "aix"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/android-arm": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/android-arm/-/android-arm-0.21.5.tgz",
- "integrity": "sha512-vCPvzSjpPHEi1siZdlvAlsPxXl7WbOVUBBAowWug4rJHb68Ox8KualB+1ocNvT5fjv6wpkX6o/iEpbDrf68zcg==",
- "cpu": [
- "arm"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "android"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/android-arm64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/android-arm64/-/android-arm64-0.21.5.tgz",
- "integrity": "sha512-c0uX9VAUBQ7dTDCjq+wdyGLowMdtR/GoC2U5IYk/7D1H1JYC0qseD7+11iMP2mRLN9RcCMRcjC4YMclCzGwS/A==",
- "cpu": [
- "arm64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "android"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/android-x64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/android-x64/-/android-x64-0.21.5.tgz",
- "integrity": "sha512-D7aPRUUNHRBwHxzxRvp856rjUHRFW1SdQATKXH2hqA0kAZb1hKmi02OpYRacl0TxIGz/ZmXWlbZgjwWYaCakTA==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "android"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/darwin-arm64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/darwin-arm64/-/darwin-arm64-0.21.5.tgz",
- "integrity": "sha512-DwqXqZyuk5AiWWf3UfLiRDJ5EDd49zg6O9wclZ7kUMv2WRFr4HKjXp/5t8JZ11QbQfUS6/cRCKGwYhtNAY88kQ==",
- "cpu": [
- "arm64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "darwin"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/darwin-x64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/darwin-x64/-/darwin-x64-0.21.5.tgz",
- "integrity": "sha512-se/JjF8NlmKVG4kNIuyWMV/22ZaerB+qaSi5MdrXtd6R08kvs2qCN4C09miupktDitvh8jRFflwGFBQcxZRjbw==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "darwin"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/freebsd-arm64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/freebsd-arm64/-/freebsd-arm64-0.21.5.tgz",
- "integrity": "sha512-5JcRxxRDUJLX8JXp/wcBCy3pENnCgBR9bN6JsY4OmhfUtIHe3ZW0mawA7+RDAcMLrMIZaf03NlQiX9DGyB8h4g==",
- "cpu": [
- "arm64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "freebsd"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/freebsd-x64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/freebsd-x64/-/freebsd-x64-0.21.5.tgz",
- "integrity": "sha512-J95kNBj1zkbMXtHVH29bBriQygMXqoVQOQYA+ISs0/2l3T9/kj42ow2mpqerRBxDJnmkUDCaQT/dfNXWX/ZZCQ==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "freebsd"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/linux-arm": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/linux-arm/-/linux-arm-0.21.5.tgz",
- "integrity": "sha512-bPb5AHZtbeNGjCKVZ9UGqGwo8EUu4cLq68E95A53KlxAPRmUyYv2D6F0uUI65XisGOL1hBP5mTronbgo+0bFcA==",
- "cpu": [
- "arm"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/linux-arm64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/linux-arm64/-/linux-arm64-0.21.5.tgz",
- "integrity": "sha512-ibKvmyYzKsBeX8d8I7MH/TMfWDXBF3db4qM6sy+7re0YXya+K1cem3on9XgdT2EQGMu4hQyZhan7TeQ8XkGp4Q==",
- "cpu": [
- "arm64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/linux-ia32": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/linux-ia32/-/linux-ia32-0.21.5.tgz",
- "integrity": "sha512-YvjXDqLRqPDl2dvRODYmmhz4rPeVKYvppfGYKSNGdyZkA01046pLWyRKKI3ax8fbJoK5QbxblURkwK/MWY18Tg==",
- "cpu": [
- "ia32"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/linux-loong64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/linux-loong64/-/linux-loong64-0.21.5.tgz",
- "integrity": "sha512-uHf1BmMG8qEvzdrzAqg2SIG/02+4/DHB6a9Kbya0XDvwDEKCoC8ZRWI5JJvNdUjtciBGFQ5PuBlpEOXQj+JQSg==",
- "cpu": [
- "loong64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/linux-mips64el": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/linux-mips64el/-/linux-mips64el-0.21.5.tgz",
- "integrity": "sha512-IajOmO+KJK23bj52dFSNCMsz1QP1DqM6cwLUv3W1QwyxkyIWecfafnI555fvSGqEKwjMXVLokcV5ygHW5b3Jbg==",
- "cpu": [
- "mips64el"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/linux-ppc64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/linux-ppc64/-/linux-ppc64-0.21.5.tgz",
- "integrity": "sha512-1hHV/Z4OEfMwpLO8rp7CvlhBDnjsC3CttJXIhBi+5Aj5r+MBvy4egg7wCbe//hSsT+RvDAG7s81tAvpL2XAE4w==",
- "cpu": [
- "ppc64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/linux-riscv64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/linux-riscv64/-/linux-riscv64-0.21.5.tgz",
- "integrity": "sha512-2HdXDMd9GMgTGrPWnJzP2ALSokE/0O5HhTUvWIbD3YdjME8JwvSCnNGBnTThKGEB91OZhzrJ4qIIxk/SBmyDDA==",
- "cpu": [
- "riscv64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/linux-s390x": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/linux-s390x/-/linux-s390x-0.21.5.tgz",
- "integrity": "sha512-zus5sxzqBJD3eXxwvjN1yQkRepANgxE9lgOW2qLnmr8ikMTphkjgXu1HR01K4FJg8h1kEEDAqDcZQtbrRnB41A==",
- "cpu": [
- "s390x"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/linux-x64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/linux-x64/-/linux-x64-0.21.5.tgz",
- "integrity": "sha512-1rYdTpyv03iycF1+BhzrzQJCdOuAOtaqHTWJZCWvijKD2N5Xu0TtVC8/+1faWqcP9iBCWOmjmhoH94dH82BxPQ==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/netbsd-x64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/netbsd-x64/-/netbsd-x64-0.21.5.tgz",
- "integrity": "sha512-Woi2MXzXjMULccIwMnLciyZH4nCIMpWQAs049KEeMvOcNADVxo0UBIQPfSmxB3CWKedngg7sWZdLvLczpe0tLg==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "netbsd"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/openbsd-x64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/openbsd-x64/-/openbsd-x64-0.21.5.tgz",
- "integrity": "sha512-HLNNw99xsvx12lFBUwoT8EVCsSvRNDVxNpjZ7bPn947b8gJPzeHWyNVhFsaerc0n3TsbOINvRP2byTZ5LKezow==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "openbsd"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/sunos-x64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/sunos-x64/-/sunos-x64-0.21.5.tgz",
- "integrity": "sha512-6+gjmFpfy0BHU5Tpptkuh8+uw3mnrvgs+dSPQXQOv3ekbordwnzTVEb4qnIvQcYXq6gzkyTnoZ9dZG+D4garKg==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "sunos"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/win32-arm64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/win32-arm64/-/win32-arm64-0.21.5.tgz",
- "integrity": "sha512-Z0gOTd75VvXqyq7nsl93zwahcTROgqvuAcYDUr+vOv8uHhNSKROyU961kgtCD1e95IqPKSQKH7tBTslnS3tA8A==",
- "cpu": [
- "arm64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "win32"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/win32-ia32": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/win32-ia32/-/win32-ia32-0.21.5.tgz",
- "integrity": "sha512-SWXFF1CL2RVNMaVs+BBClwtfZSvDgtL//G/smwAc5oVK/UPu2Gu9tIaRgFmYFFKrmg3SyAjSrElf0TiJ1v8fYA==",
- "cpu": [
- "ia32"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "win32"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@esbuild/win32-x64": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/@esbuild/win32-x64/-/win32-x64-0.21.5.tgz",
- "integrity": "sha512-tQd/1efJuzPC6rCFwEvLtci/xNFcTZknmXs98FYDfGE4wP9ClFV98nyKrzJKVPMhdDnjzLhdUyMX4PsQAPjwIw==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "win32"
- ],
- "engines": {
- "node": ">=12"
- }
- },
- "node_modules/@jridgewell/gen-mapping": {
- "version": "0.3.13",
- "resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.13.tgz",
- "integrity": "sha512-2kkt/7niJ6MgEPxF0bYdQ6etZaA+fQvDcLKckhy1yIQOzaoKjBBjSj63/aLVjYE3qhRt5dvM+uUyfCg6UKCBbA==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@jridgewell/sourcemap-codec": "^1.5.0",
- "@jridgewell/trace-mapping": "^0.3.24"
- }
- },
- "node_modules/@jridgewell/remapping": {
- "version": "2.3.5",
- "resolved": "https://registry.npmjs.org/@jridgewell/remapping/-/remapping-2.3.5.tgz",
- "integrity": "sha512-LI9u/+laYG4Ds1TDKSJW2YPrIlcVYOwi2fUC6xB43lueCjgxV4lffOCZCtYFiH6TNOX+tQKXx97T4IKHbhyHEQ==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@jridgewell/gen-mapping": "^0.3.5",
- "@jridgewell/trace-mapping": "^0.3.24"
- }
- },
- "node_modules/@jridgewell/resolve-uri": {
- "version": "3.1.2",
- "resolved": "https://registry.npmjs.org/@jridgewell/resolve-uri/-/resolve-uri-3.1.2.tgz",
- "integrity": "sha512-bRISgCIjP20/tbWSPWMEi54QVPRZExkuD9lJL+UIxUKtwVJA8wW1Trb1jMs1RFXo1CBTNZ/5hpC9QvmKWdopKw==",
- "dev": true,
- "license": "MIT",
- "engines": {
- "node": ">=6.0.0"
- }
- },
- "node_modules/@jridgewell/sourcemap-codec": {
- "version": "1.5.5",
- "resolved": "https://registry.npmjs.org/@jridgewell/sourcemap-codec/-/sourcemap-codec-1.5.5.tgz",
- "integrity": "sha512-cYQ9310grqxueWbl+WuIUIaiUaDcj7WOq5fVhEljNVgRfOUhY9fy2zTvfoqWsnebh8Sl70VScFbICvJnLKB0Og==",
- "dev": true,
- "license": "MIT"
- },
- "node_modules/@jridgewell/trace-mapping": {
- "version": "0.3.31",
- "resolved": "https://registry.npmjs.org/@jridgewell/trace-mapping/-/trace-mapping-0.3.31.tgz",
- "integrity": "sha512-zzNR+SdQSDJzc8joaeP8QQoCQr8NuYx2dIIytl1QeBEZHJ9uW6hebsrYgbz8hJwUQao3TWCMtmfV8Nu1twOLAw==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@jridgewell/resolve-uri": "^3.1.0",
- "@jridgewell/sourcemap-codec": "^1.4.14"
- }
- },
- "node_modules/@remix-run/router": {
- "version": "1.23.0",
- "resolved": "https://registry.npmjs.org/@remix-run/router/-/router-1.23.0.tgz",
- "integrity": "sha512-O3rHJzAQKamUz1fvE0Qaw0xSFqsA/yafi2iqeE0pvdFtCO1viYx8QL6f3Ln/aCCTLxs68SLf0KPM9eSeM8yBnA==",
- "license": "MIT",
- "engines": {
- "node": ">=14.0.0"
- }
- },
- "node_modules/@rolldown/pluginutils": {
- "version": "1.0.0-beta.27",
- "resolved": "https://registry.npmjs.org/@rolldown/pluginutils/-/pluginutils-1.0.0-beta.27.tgz",
- "integrity": "sha512-+d0F4MKMCbeVUJwG96uQ4SgAznZNSq93I3V+9NHA4OpvqG8mRCpGdKmK8l/dl02h2CCDHwW2FqilnTyDcAnqjA==",
- "dev": true,
- "license": "MIT"
- },
- "node_modules/@rollup/rollup-android-arm-eabi": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm-eabi/-/rollup-android-arm-eabi-4.52.4.tgz",
- "integrity": "sha512-BTm2qKNnWIQ5auf4deoetINJm2JzvihvGb9R6K/ETwKLql/Bb3Eg2H1FBp1gUb4YGbydMA3jcmQTR73q7J+GAA==",
- "cpu": [
- "arm"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "android"
- ]
- },
- "node_modules/@rollup/rollup-android-arm64": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-android-arm64/-/rollup-android-arm64-4.52.4.tgz",
- "integrity": "sha512-P9LDQiC5vpgGFgz7GSM6dKPCiqR3XYN1WwJKA4/BUVDjHpYsf3iBEmVz62uyq20NGYbiGPR5cNHI7T1HqxNs2w==",
- "cpu": [
- "arm64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "android"
- ]
- },
- "node_modules/@rollup/rollup-darwin-arm64": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-darwin-arm64/-/rollup-darwin-arm64-4.52.4.tgz",
- "integrity": "sha512-QRWSW+bVccAvZF6cbNZBJwAehmvG9NwfWHwMy4GbWi/BQIA/laTIktebT2ipVjNncqE6GLPxOok5hsECgAxGZg==",
- "cpu": [
- "arm64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "darwin"
- ]
- },
- "node_modules/@rollup/rollup-darwin-x64": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-darwin-x64/-/rollup-darwin-x64-4.52.4.tgz",
- "integrity": "sha512-hZgP05pResAkRJxL1b+7yxCnXPGsXU0fG9Yfd6dUaoGk+FhdPKCJ5L1Sumyxn8kvw8Qi5PvQ8ulenUbRjzeCTw==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "darwin"
- ]
- },
- "node_modules/@rollup/rollup-freebsd-arm64": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-freebsd-arm64/-/rollup-freebsd-arm64-4.52.4.tgz",
- "integrity": "sha512-xmc30VshuBNUd58Xk4TKAEcRZHaXlV+tCxIXELiE9sQuK3kG8ZFgSPi57UBJt8/ogfhAF5Oz4ZSUBN77weM+mQ==",
- "cpu": [
- "arm64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "freebsd"
- ]
- },
- "node_modules/@rollup/rollup-freebsd-x64": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-freebsd-x64/-/rollup-freebsd-x64-4.52.4.tgz",
- "integrity": "sha512-WdSLpZFjOEqNZGmHflxyifolwAiZmDQzuOzIq9L27ButpCVpD7KzTRtEG1I0wMPFyiyUdOO+4t8GvrnBLQSwpw==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "freebsd"
- ]
- },
- "node_modules/@rollup/rollup-linux-arm-gnueabihf": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm-gnueabihf/-/rollup-linux-arm-gnueabihf-4.52.4.tgz",
- "integrity": "sha512-xRiOu9Of1FZ4SxVbB0iEDXc4ddIcjCv2aj03dmW8UrZIW7aIQ9jVJdLBIhxBI+MaTnGAKyvMwPwQnoOEvP7FgQ==",
- "cpu": [
- "arm"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ]
- },
- "node_modules/@rollup/rollup-linux-arm-musleabihf": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm-musleabihf/-/rollup-linux-arm-musleabihf-4.52.4.tgz",
- "integrity": "sha512-FbhM2p9TJAmEIEhIgzR4soUcsW49e9veAQCziwbR+XWB2zqJ12b4i/+hel9yLiD8pLncDH4fKIPIbt5238341Q==",
- "cpu": [
- "arm"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ]
- },
- "node_modules/@rollup/rollup-linux-arm64-gnu": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm64-gnu/-/rollup-linux-arm64-gnu-4.52.4.tgz",
- "integrity": "sha512-4n4gVwhPHR9q/g8lKCyz0yuaD0MvDf7dV4f9tHt0C73Mp8h38UCtSCSE6R9iBlTbXlmA8CjpsZoujhszefqueg==",
- "cpu": [
- "arm64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ]
- },
- "node_modules/@rollup/rollup-linux-arm64-musl": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-arm64-musl/-/rollup-linux-arm64-musl-4.52.4.tgz",
- "integrity": "sha512-u0n17nGA0nvi/11gcZKsjkLj1QIpAuPFQbR48Subo7SmZJnGxDpspyw2kbpuoQnyK+9pwf3pAoEXerJs/8Mi9g==",
- "cpu": [
- "arm64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ]
- },
- "node_modules/@rollup/rollup-linux-loong64-gnu": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-loong64-gnu/-/rollup-linux-loong64-gnu-4.52.4.tgz",
- "integrity": "sha512-0G2c2lpYtbTuXo8KEJkDkClE/+/2AFPdPAbmaHoE870foRFs4pBrDehilMcrSScrN/fB/1HTaWO4bqw+ewBzMQ==",
- "cpu": [
- "loong64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ]
- },
- "node_modules/@rollup/rollup-linux-ppc64-gnu": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-ppc64-gnu/-/rollup-linux-ppc64-gnu-4.52.4.tgz",
- "integrity": "sha512-teSACug1GyZHmPDv14VNbvZFX779UqWTsd7KtTM9JIZRDI5NUwYSIS30kzI8m06gOPB//jtpqlhmraQ68b5X2g==",
- "cpu": [
- "ppc64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ]
- },
- "node_modules/@rollup/rollup-linux-riscv64-gnu": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-riscv64-gnu/-/rollup-linux-riscv64-gnu-4.52.4.tgz",
- "integrity": "sha512-/MOEW3aHjjs1p4Pw1Xk4+3egRevx8Ji9N6HUIA1Ifh8Q+cg9dremvFCUbOX2Zebz80BwJIgCBUemjqhU5XI5Eg==",
- "cpu": [
- "riscv64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ]
- },
- "node_modules/@rollup/rollup-linux-riscv64-musl": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-riscv64-musl/-/rollup-linux-riscv64-musl-4.52.4.tgz",
- "integrity": "sha512-1HHmsRyh845QDpEWzOFtMCph5Ts+9+yllCrREuBR/vg2RogAQGGBRC8lDPrPOMnrdOJ+mt1WLMOC2Kao/UwcvA==",
- "cpu": [
- "riscv64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ]
- },
- "node_modules/@rollup/rollup-linux-s390x-gnu": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-s390x-gnu/-/rollup-linux-s390x-gnu-4.52.4.tgz",
- "integrity": "sha512-seoeZp4L/6D1MUyjWkOMRU6/iLmCU2EjbMTyAG4oIOs1/I82Y5lTeaxW0KBfkUdHAWN7j25bpkt0rjnOgAcQcA==",
- "cpu": [
- "s390x"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ]
- },
- "node_modules/@rollup/rollup-linux-x64-gnu": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-x64-gnu/-/rollup-linux-x64-gnu-4.52.4.tgz",
- "integrity": "sha512-Wi6AXf0k0L7E2gteNsNHUs7UMwCIhsCTs6+tqQ5GPwVRWMaflqGec4Sd8n6+FNFDw9vGcReqk2KzBDhCa1DLYg==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ]
- },
- "node_modules/@rollup/rollup-linux-x64-musl": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-linux-x64-musl/-/rollup-linux-x64-musl-4.52.4.tgz",
- "integrity": "sha512-dtBZYjDmCQ9hW+WgEkaffvRRCKm767wWhxsFW3Lw86VXz/uJRuD438/XvbZT//B96Vs8oTA8Q4A0AfHbrxP9zw==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "linux"
- ]
- },
- "node_modules/@rollup/rollup-openharmony-arm64": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-openharmony-arm64/-/rollup-openharmony-arm64-4.52.4.tgz",
- "integrity": "sha512-1ox+GqgRWqaB1RnyZXL8PD6E5f7YyRUJYnCqKpNzxzP0TkaUh112NDrR9Tt+C8rJ4x5G9Mk8PQR3o7Ku2RKqKA==",
- "cpu": [
- "arm64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "openharmony"
- ]
- },
- "node_modules/@rollup/rollup-win32-arm64-msvc": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-arm64-msvc/-/rollup-win32-arm64-msvc-4.52.4.tgz",
- "integrity": "sha512-8GKr640PdFNXwzIE0IrkMWUNUomILLkfeHjXBi/nUvFlpZP+FA8BKGKpacjW6OUUHaNI6sUURxR2U2g78FOHWQ==",
- "cpu": [
- "arm64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "win32"
- ]
- },
- "node_modules/@rollup/rollup-win32-ia32-msvc": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-ia32-msvc/-/rollup-win32-ia32-msvc-4.52.4.tgz",
- "integrity": "sha512-AIy/jdJ7WtJ/F6EcfOb2GjR9UweO0n43jNObQMb6oGxkYTfLcnN7vYYpG+CN3lLxrQkzWnMOoNSHTW54pgbVxw==",
- "cpu": [
- "ia32"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "win32"
- ]
- },
- "node_modules/@rollup/rollup-win32-x64-gnu": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-x64-gnu/-/rollup-win32-x64-gnu-4.52.4.tgz",
- "integrity": "sha512-UF9KfsH9yEam0UjTwAgdK0anlQ7c8/pWPU2yVjyWcF1I1thABt6WXE47cI71pGiZ8wGvxohBoLnxM04L/wj8mQ==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "win32"
- ]
- },
- "node_modules/@rollup/rollup-win32-x64-msvc": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/@rollup/rollup-win32-x64-msvc/-/rollup-win32-x64-msvc-4.52.4.tgz",
- "integrity": "sha512-bf9PtUa0u8IXDVxzRToFQKsNCRz9qLYfR/MpECxl4mRoWYjAeFjgxj1XdZr2M/GNVpT05p+LgQOHopYDlUu6/w==",
- "cpu": [
- "x64"
- ],
- "dev": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "win32"
- ]
- },
- "node_modules/@tanstack/query-core": {
- "version": "5.90.3",
- "resolved": "https://registry.npmjs.org/@tanstack/query-core/-/query-core-5.90.3.tgz",
- "integrity": "sha512-HtPOnCwmx4dd35PfXU8jjkhwYrsHfuqgC8RCJIwWglmhIUIlzPP0ZcEkDAc+UtAWCiLm7T8rxeEfHZlz3hYMCA==",
- "license": "MIT",
- "funding": {
- "type": "github",
- "url": "https://github.com/sponsors/tannerlinsley"
- }
- },
- "node_modules/@tanstack/react-query": {
- "version": "5.90.3",
- "resolved": "https://registry.npmjs.org/@tanstack/react-query/-/react-query-5.90.3.tgz",
- "integrity": "sha512-i/LRL6DtuhG6bjGzavIMIVuKKPWx2AnEBIsBfuMm3YoHne0a20nWmsatOCBcVSaT0/8/5YFjNkebHAPLVUSi0Q==",
- "license": "MIT",
- "dependencies": {
- "@tanstack/query-core": "5.90.3"
- },
- "funding": {
- "type": "github",
- "url": "https://github.com/sponsors/tannerlinsley"
- },
- "peerDependencies": {
- "react": "^18 || ^19"
- }
- },
- "node_modules/@types/babel__core": {
- "version": "7.20.5",
- "resolved": "https://registry.npmjs.org/@types/babel__core/-/babel__core-7.20.5.tgz",
- "integrity": "sha512-qoQprZvz5wQFJwMDqeseRXWv3rqMvhgpbXFfVyWhbx9X47POIA6i/+dXefEmZKoAgOaTdaIgNSMqMIU61yRyzA==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/parser": "^7.20.7",
- "@babel/types": "^7.20.7",
- "@types/babel__generator": "*",
- "@types/babel__template": "*",
- "@types/babel__traverse": "*"
- }
- },
- "node_modules/@types/babel__generator": {
- "version": "7.27.0",
- "resolved": "https://registry.npmjs.org/@types/babel__generator/-/babel__generator-7.27.0.tgz",
- "integrity": "sha512-ufFd2Xi92OAVPYsy+P4n7/U7e68fex0+Ee8gSG9KX7eo084CWiQ4sdxktvdl0bOPupXtVJPY19zk6EwWqUQ8lg==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/types": "^7.0.0"
- }
- },
- "node_modules/@types/babel__template": {
- "version": "7.4.4",
- "resolved": "https://registry.npmjs.org/@types/babel__template/-/babel__template-7.4.4.tgz",
- "integrity": "sha512-h/NUaSyG5EyxBIp8YRxo4RMe2/qQgvyowRwVMzhYhBCONbW8PUsg4lkFMrhgZhUe5z3L3MiLDuvyJ/CaPa2A8A==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/parser": "^7.1.0",
- "@babel/types": "^7.0.0"
- }
- },
- "node_modules/@types/babel__traverse": {
- "version": "7.28.0",
- "resolved": "https://registry.npmjs.org/@types/babel__traverse/-/babel__traverse-7.28.0.tgz",
- "integrity": "sha512-8PvcXf70gTDZBgt9ptxJ8elBeBjcLOAcOtoO/mPJjtji1+CdGbHgm77om1GrsPxsiE+uXIpNSK64UYaIwQXd4Q==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/types": "^7.28.2"
- }
- },
- "node_modules/@types/estree": {
- "version": "1.0.8",
- "resolved": "https://registry.npmjs.org/@types/estree/-/estree-1.0.8.tgz",
- "integrity": "sha512-dWHzHa2WqEXI/O1E9OjrocMTKJl2mSrEolh1Iomrv6U+JuNwaHXsXx9bLu5gG7BUWFIN0skIQJQ/L1rIex4X6w==",
- "dev": true,
- "license": "MIT"
- },
- "node_modules/@types/prop-types": {
- "version": "15.7.15",
- "resolved": "https://registry.npmjs.org/@types/prop-types/-/prop-types-15.7.15.tgz",
- "integrity": "sha512-F6bEyamV9jKGAFBEmlQnesRPGOQqS2+Uwi0Em15xenOxHaf2hv6L8YCVn3rPdPJOiJfPiCnLIRyvwVaqMY3MIw==",
- "dev": true,
- "license": "MIT"
- },
- "node_modules/@types/react": {
- "version": "18.3.26",
- "resolved": "https://registry.npmjs.org/@types/react/-/react-18.3.26.tgz",
- "integrity": "sha512-RFA/bURkcKzx/X9oumPG9Vp3D3JUgus/d0b67KB0t5S/raciymilkOa66olh78MUI92QLbEJevO7rvqU/kjwKA==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@types/prop-types": "*",
- "csstype": "^3.0.2"
- }
- },
- "node_modules/@types/react-dom": {
- "version": "18.3.7",
- "resolved": "https://registry.npmjs.org/@types/react-dom/-/react-dom-18.3.7.tgz",
- "integrity": "sha512-MEe3UeoENYVFXzoXEWsvcpg6ZvlrFNlOQ7EOsvhI3CfAXwzPfO8Qwuxd40nepsYKqyyVQnTdEfv68q91yLcKrQ==",
- "dev": true,
- "license": "MIT",
- "peerDependencies": {
- "@types/react": "^18.0.0"
- }
- },
- "node_modules/@vitejs/plugin-react": {
- "version": "4.7.0",
- "resolved": "https://registry.npmjs.org/@vitejs/plugin-react/-/plugin-react-4.7.0.tgz",
- "integrity": "sha512-gUu9hwfWvvEDBBmgtAowQCojwZmJ5mcLn3aufeCsitijs3+f2NsrPtlAWIR6OPiqljl96GVCUbLe0HyqIpVaoA==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@babel/core": "^7.28.0",
- "@babel/plugin-transform-react-jsx-self": "^7.27.1",
- "@babel/plugin-transform-react-jsx-source": "^7.27.1",
- "@rolldown/pluginutils": "1.0.0-beta.27",
- "@types/babel__core": "^7.20.5",
- "react-refresh": "^0.17.0"
- },
- "engines": {
- "node": "^14.18.0 || >=16.0.0"
- },
- "peerDependencies": {
- "vite": "^4.2.0 || ^5.0.0 || ^6.0.0 || ^7.0.0"
- }
- },
- "node_modules/asynckit": {
- "version": "0.4.0",
- "resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
- "integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==",
- "license": "MIT"
- },
- "node_modules/axios": {
- "version": "1.12.2",
- "resolved": "https://registry.npmjs.org/axios/-/axios-1.12.2.tgz",
- "integrity": "sha512-vMJzPewAlRyOgxV2dU0Cuz2O8zzzx9VYtbJOaBgXFeLc4IV/Eg50n4LowmehOOR61S8ZMpc2K5Sa7g6A4jfkUw==",
- "license": "MIT",
- "dependencies": {
- "follow-redirects": "^1.15.6",
- "form-data": "^4.0.4",
- "proxy-from-env": "^1.1.0"
- }
- },
- "node_modules/baseline-browser-mapping": {
- "version": "2.8.16",
- "resolved": "https://registry.npmjs.org/baseline-browser-mapping/-/baseline-browser-mapping-2.8.16.tgz",
- "integrity": "sha512-OMu3BGQ4E7P1ErFsIPpbJh0qvDudM/UuJeHgkAvfWe+0HFJCXh+t/l8L6fVLR55RI/UbKrVLnAXZSVwd9ysWYw==",
- "dev": true,
- "license": "Apache-2.0",
- "bin": {
- "baseline-browser-mapping": "dist/cli.js"
- }
- },
- "node_modules/browserslist": {
- "version": "4.26.3",
- "resolved": "https://registry.npmjs.org/browserslist/-/browserslist-4.26.3.tgz",
- "integrity": "sha512-lAUU+02RFBuCKQPj/P6NgjlbCnLBMp4UtgTx7vNHd3XSIJF87s9a5rA3aH2yw3GS9DqZAUbOtZdCCiZeVRqt0w==",
- "dev": true,
- "funding": [
- {
- "type": "opencollective",
- "url": "https://opencollective.com/browserslist"
- },
- {
- "type": "tidelift",
- "url": "https://tidelift.com/funding/github/npm/browserslist"
- },
- {
- "type": "github",
- "url": "https://github.com/sponsors/ai"
- }
- ],
- "license": "MIT",
- "dependencies": {
- "baseline-browser-mapping": "^2.8.9",
- "caniuse-lite": "^1.0.30001746",
- "electron-to-chromium": "^1.5.227",
- "node-releases": "^2.0.21",
- "update-browserslist-db": "^1.1.3"
- },
- "bin": {
- "browserslist": "cli.js"
- },
- "engines": {
- "node": "^6 || ^7 || ^8 || ^9 || ^10 || ^11 || ^12 || >=13.7"
- }
- },
- "node_modules/call-bind-apply-helpers": {
- "version": "1.0.2",
- "resolved": "https://registry.npmjs.org/call-bind-apply-helpers/-/call-bind-apply-helpers-1.0.2.tgz",
- "integrity": "sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==",
- "license": "MIT",
- "dependencies": {
- "es-errors": "^1.3.0",
- "function-bind": "^1.1.2"
- },
- "engines": {
- "node": ">= 0.4"
- }
- },
- "node_modules/caniuse-lite": {
- "version": "1.0.30001750",
- "resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001750.tgz",
- "integrity": "sha512-cuom0g5sdX6rw00qOoLNSFCJ9/mYIsuSOA+yzpDw8eopiFqcVwQvZHqov0vmEighRxX++cfC0Vg1G+1Iy/mSpQ==",
- "dev": true,
- "funding": [
- {
- "type": "opencollective",
- "url": "https://opencollective.com/browserslist"
- },
- {
- "type": "tidelift",
- "url": "https://tidelift.com/funding/github/npm/caniuse-lite"
- },
- {
- "type": "github",
- "url": "https://github.com/sponsors/ai"
- }
- ],
- "license": "CC-BY-4.0"
- },
- "node_modules/combined-stream": {
- "version": "1.0.8",
- "resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
- "integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
- "license": "MIT",
- "dependencies": {
- "delayed-stream": "~1.0.0"
- },
- "engines": {
- "node": ">= 0.8"
- }
- },
- "node_modules/convert-source-map": {
- "version": "2.0.0",
- "resolved": "https://registry.npmjs.org/convert-source-map/-/convert-source-map-2.0.0.tgz",
- "integrity": "sha512-Kvp459HrV2FEJ1CAsi1Ku+MY3kasH19TFykTz2xWmMeq6bk2NU3XXvfJ+Q61m0xktWwt+1HSYf3JZsTms3aRJg==",
- "dev": true,
- "license": "MIT"
- },
- "node_modules/csstype": {
- "version": "3.1.3",
- "resolved": "https://registry.npmjs.org/csstype/-/csstype-3.1.3.tgz",
- "integrity": "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw==",
- "dev": true,
- "license": "MIT"
- },
- "node_modules/debug": {
- "version": "4.4.3",
- "resolved": "https://registry.npmjs.org/debug/-/debug-4.4.3.tgz",
- "integrity": "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "ms": "^2.1.3"
- },
- "engines": {
- "node": ">=6.0"
- },
- "peerDependenciesMeta": {
- "supports-color": {
- "optional": true
- }
- }
- },
- "node_modules/delayed-stream": {
- "version": "1.0.0",
- "resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
- "integrity": "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==",
- "license": "MIT",
- "engines": {
- "node": ">=0.4.0"
- }
- },
- "node_modules/dunder-proto": {
- "version": "1.0.1",
- "resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
- "integrity": "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==",
- "license": "MIT",
- "dependencies": {
- "call-bind-apply-helpers": "^1.0.1",
- "es-errors": "^1.3.0",
- "gopd": "^1.2.0"
- },
- "engines": {
- "node": ">= 0.4"
- }
- },
- "node_modules/electron-to-chromium": {
- "version": "1.5.237",
- "resolved": "https://registry.npmjs.org/electron-to-chromium/-/electron-to-chromium-1.5.237.tgz",
- "integrity": "sha512-icUt1NvfhGLar5lSWH3tHNzablaA5js3HVHacQimfP8ViEBOQv+L7DKEuHdbTZ0SKCO1ogTJTIL1Gwk9S6Qvcg==",
- "dev": true,
- "license": "ISC"
- },
- "node_modules/es-define-property": {
- "version": "1.0.1",
- "resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.1.tgz",
- "integrity": "sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==",
- "license": "MIT",
- "engines": {
- "node": ">= 0.4"
- }
- },
- "node_modules/es-errors": {
- "version": "1.3.0",
- "resolved": "https://registry.npmjs.org/es-errors/-/es-errors-1.3.0.tgz",
- "integrity": "sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw==",
- "license": "MIT",
- "engines": {
- "node": ">= 0.4"
- }
- },
- "node_modules/es-object-atoms": {
- "version": "1.1.1",
- "resolved": "https://registry.npmjs.org/es-object-atoms/-/es-object-atoms-1.1.1.tgz",
- "integrity": "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==",
- "license": "MIT",
- "dependencies": {
- "es-errors": "^1.3.0"
- },
- "engines": {
- "node": ">= 0.4"
- }
- },
- "node_modules/es-set-tostringtag": {
- "version": "2.1.0",
- "resolved": "https://registry.npmjs.org/es-set-tostringtag/-/es-set-tostringtag-2.1.0.tgz",
- "integrity": "sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA==",
- "license": "MIT",
- "dependencies": {
- "es-errors": "^1.3.0",
- "get-intrinsic": "^1.2.6",
- "has-tostringtag": "^1.0.2",
- "hasown": "^2.0.2"
- },
- "engines": {
- "node": ">= 0.4"
- }
- },
- "node_modules/esbuild": {
- "version": "0.21.5",
- "resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.21.5.tgz",
- "integrity": "sha512-mg3OPMV4hXywwpoDxu3Qda5xCKQi+vCTZq8S9J/EpkhB2HzKXq4SNFZE3+NK93JYxc8VMSep+lOUSC/RVKaBqw==",
- "dev": true,
- "hasInstallScript": true,
- "license": "MIT",
- "bin": {
- "esbuild": "bin/esbuild"
- },
- "engines": {
- "node": ">=12"
- },
- "optionalDependencies": {
- "@esbuild/aix-ppc64": "0.21.5",
- "@esbuild/android-arm": "0.21.5",
- "@esbuild/android-arm64": "0.21.5",
- "@esbuild/android-x64": "0.21.5",
- "@esbuild/darwin-arm64": "0.21.5",
- "@esbuild/darwin-x64": "0.21.5",
- "@esbuild/freebsd-arm64": "0.21.5",
- "@esbuild/freebsd-x64": "0.21.5",
- "@esbuild/linux-arm": "0.21.5",
- "@esbuild/linux-arm64": "0.21.5",
- "@esbuild/linux-ia32": "0.21.5",
- "@esbuild/linux-loong64": "0.21.5",
- "@esbuild/linux-mips64el": "0.21.5",
- "@esbuild/linux-ppc64": "0.21.5",
- "@esbuild/linux-riscv64": "0.21.5",
- "@esbuild/linux-s390x": "0.21.5",
- "@esbuild/linux-x64": "0.21.5",
- "@esbuild/netbsd-x64": "0.21.5",
- "@esbuild/openbsd-x64": "0.21.5",
- "@esbuild/sunos-x64": "0.21.5",
- "@esbuild/win32-arm64": "0.21.5",
- "@esbuild/win32-ia32": "0.21.5",
- "@esbuild/win32-x64": "0.21.5"
- }
- },
- "node_modules/escalade": {
- "version": "3.2.0",
- "resolved": "https://registry.npmjs.org/escalade/-/escalade-3.2.0.tgz",
- "integrity": "sha512-WUj2qlxaQtO4g6Pq5c29GTcWGDyd8itL8zTlipgECz3JesAiiOKotd8JU6otB3PACgG6xkJUyVhboMS+bje/jA==",
- "dev": true,
- "license": "MIT",
- "engines": {
- "node": ">=6"
- }
- },
- "node_modules/follow-redirects": {
- "version": "1.15.11",
- "resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.15.11.tgz",
- "integrity": "sha512-deG2P0JfjrTxl50XGCDyfI97ZGVCxIpfKYmfyrQ54n5FO/0gfIES8C/Psl6kWVDolizcaaxZJnTS0QSMxvnsBQ==",
- "funding": [
- {
- "type": "individual",
- "url": "https://github.com/sponsors/RubenVerborgh"
- }
- ],
- "license": "MIT",
- "engines": {
- "node": ">=4.0"
- },
- "peerDependenciesMeta": {
- "debug": {
- "optional": true
- }
- }
- },
- "node_modules/form-data": {
- "version": "4.0.4",
- "resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.4.tgz",
- "integrity": "sha512-KrGhL9Q4zjj0kiUt5OO4Mr/A/jlI2jDYs5eHBpYHPcBEVSiipAvn2Ko2HnPe20rmcuuvMHNdZFp+4IlGTMF0Ow==",
- "license": "MIT",
- "dependencies": {
- "asynckit": "^0.4.0",
- "combined-stream": "^1.0.8",
- "es-set-tostringtag": "^2.1.0",
- "hasown": "^2.0.2",
- "mime-types": "^2.1.12"
- },
- "engines": {
- "node": ">= 6"
- }
- },
- "node_modules/fsevents": {
- "version": "2.3.3",
- "resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.3.tgz",
- "integrity": "sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==",
- "dev": true,
- "hasInstallScript": true,
- "license": "MIT",
- "optional": true,
- "os": [
- "darwin"
- ],
- "engines": {
- "node": "^8.16.0 || ^10.6.0 || >=11.0.0"
- }
- },
- "node_modules/function-bind": {
- "version": "1.1.2",
- "resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz",
- "integrity": "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==",
- "license": "MIT",
- "funding": {
- "url": "https://github.com/sponsors/ljharb"
- }
- },
- "node_modules/gensync": {
- "version": "1.0.0-beta.2",
- "resolved": "https://registry.npmjs.org/gensync/-/gensync-1.0.0-beta.2.tgz",
- "integrity": "sha512-3hN7NaskYvMDLQY55gnW3NQ+mesEAepTqlg+VEbj7zzqEMBVNhzcGYYeqFo/TlYz6eQiFcp1HcsCZO+nGgS8zg==",
- "dev": true,
- "license": "MIT",
- "engines": {
- "node": ">=6.9.0"
- }
- },
- "node_modules/get-intrinsic": {
- "version": "1.3.0",
- "resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz",
- "integrity": "sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ==",
- "license": "MIT",
- "dependencies": {
- "call-bind-apply-helpers": "^1.0.2",
- "es-define-property": "^1.0.1",
- "es-errors": "^1.3.0",
- "es-object-atoms": "^1.1.1",
- "function-bind": "^1.1.2",
- "get-proto": "^1.0.1",
- "gopd": "^1.2.0",
- "has-symbols": "^1.1.0",
- "hasown": "^2.0.2",
- "math-intrinsics": "^1.1.0"
- },
- "engines": {
- "node": ">= 0.4"
- },
- "funding": {
- "url": "https://github.com/sponsors/ljharb"
- }
- },
- "node_modules/get-proto": {
- "version": "1.0.1",
- "resolved": "https://registry.npmjs.org/get-proto/-/get-proto-1.0.1.tgz",
- "integrity": "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g==",
- "license": "MIT",
- "dependencies": {
- "dunder-proto": "^1.0.1",
- "es-object-atoms": "^1.0.0"
- },
- "engines": {
- "node": ">= 0.4"
- }
- },
- "node_modules/gopd": {
- "version": "1.2.0",
- "resolved": "https://registry.npmjs.org/gopd/-/gopd-1.2.0.tgz",
- "integrity": "sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==",
- "license": "MIT",
- "engines": {
- "node": ">= 0.4"
- },
- "funding": {
- "url": "https://github.com/sponsors/ljharb"
- }
- },
- "node_modules/has-symbols": {
- "version": "1.1.0",
- "resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.1.0.tgz",
- "integrity": "sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==",
- "license": "MIT",
- "engines": {
- "node": ">= 0.4"
- },
- "funding": {
- "url": "https://github.com/sponsors/ljharb"
- }
- },
- "node_modules/has-tostringtag": {
- "version": "1.0.2",
- "resolved": "https://registry.npmjs.org/has-tostringtag/-/has-tostringtag-1.0.2.tgz",
- "integrity": "sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw==",
- "license": "MIT",
- "dependencies": {
- "has-symbols": "^1.0.3"
- },
- "engines": {
- "node": ">= 0.4"
- },
- "funding": {
- "url": "https://github.com/sponsors/ljharb"
- }
- },
- "node_modules/hasown": {
- "version": "2.0.2",
- "resolved": "https://registry.npmjs.org/hasown/-/hasown-2.0.2.tgz",
- "integrity": "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ==",
- "license": "MIT",
- "dependencies": {
- "function-bind": "^1.1.2"
- },
- "engines": {
- "node": ">= 0.4"
- }
- },
- "node_modules/js-tokens": {
- "version": "4.0.0",
- "resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-4.0.0.tgz",
- "integrity": "sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ==",
- "license": "MIT"
- },
- "node_modules/jsesc": {
- "version": "3.1.0",
- "resolved": "https://registry.npmjs.org/jsesc/-/jsesc-3.1.0.tgz",
- "integrity": "sha512-/sM3dO2FOzXjKQhJuo0Q173wf2KOo8t4I8vHy6lF9poUp7bKT0/NHE8fPX23PwfhnykfqnC2xRxOnVw5XuGIaA==",
- "dev": true,
- "license": "MIT",
- "bin": {
- "jsesc": "bin/jsesc"
- },
- "engines": {
- "node": ">=6"
- }
- },
- "node_modules/json5": {
- "version": "2.2.3",
- "resolved": "https://registry.npmjs.org/json5/-/json5-2.2.3.tgz",
- "integrity": "sha512-XmOWe7eyHYH14cLdVPoyg+GOH3rYX++KpzrylJwSW98t3Nk+U8XOl8FWKOgwtzdb8lXGf6zYwDUzeHMWfxasyg==",
- "dev": true,
- "license": "MIT",
- "bin": {
- "json5": "lib/cli.js"
- },
- "engines": {
- "node": ">=6"
- }
- },
- "node_modules/loose-envify": {
- "version": "1.4.0",
- "resolved": "https://registry.npmjs.org/loose-envify/-/loose-envify-1.4.0.tgz",
- "integrity": "sha512-lyuxPGr/Wfhrlem2CL/UcnUc1zcqKAImBDzukY7Y5F/yQiNdko6+fRLevlw1HgMySw7f611UIY408EtxRSoK3Q==",
- "license": "MIT",
- "dependencies": {
- "js-tokens": "^3.0.0 || ^4.0.0"
- },
- "bin": {
- "loose-envify": "cli.js"
- }
- },
- "node_modules/lru-cache": {
- "version": "5.1.1",
- "resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-5.1.1.tgz",
- "integrity": "sha512-KpNARQA3Iwv+jTA0utUVVbrh+Jlrr1Fv0e56GGzAFOXN7dk/FviaDW8LHmK52DlcH4WP2n6gI8vN1aesBFgo9w==",
- "dev": true,
- "license": "ISC",
- "dependencies": {
- "yallist": "^3.0.2"
- }
- },
- "node_modules/math-intrinsics": {
- "version": "1.1.0",
- "resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",
- "integrity": "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==",
- "license": "MIT",
- "engines": {
- "node": ">= 0.4"
- }
- },
- "node_modules/mime-db": {
- "version": "1.52.0",
- "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
- "integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
- "license": "MIT",
- "engines": {
- "node": ">= 0.6"
- }
- },
- "node_modules/mime-types": {
- "version": "2.1.35",
- "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
- "integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
- "license": "MIT",
- "dependencies": {
- "mime-db": "1.52.0"
- },
- "engines": {
- "node": ">= 0.6"
- }
- },
- "node_modules/ms": {
- "version": "2.1.3",
- "resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
- "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==",
- "dev": true,
- "license": "MIT"
- },
- "node_modules/nanoid": {
- "version": "3.3.11",
- "resolved": "https://registry.npmjs.org/nanoid/-/nanoid-3.3.11.tgz",
- "integrity": "sha512-N8SpfPUnUp1bK+PMYW8qSWdl9U+wwNWI4QKxOYDy9JAro3WMX7p2OeVRF9v+347pnakNevPmiHhNmZ2HbFA76w==",
- "dev": true,
- "funding": [
- {
- "type": "github",
- "url": "https://github.com/sponsors/ai"
- }
- ],
- "license": "MIT",
- "bin": {
- "nanoid": "bin/nanoid.cjs"
- },
- "engines": {
- "node": "^10 || ^12 || ^13.7 || ^14 || >=15.0.1"
- }
- },
- "node_modules/node-releases": {
- "version": "2.0.23",
- "resolved": "https://registry.npmjs.org/node-releases/-/node-releases-2.0.23.tgz",
- "integrity": "sha512-cCmFDMSm26S6tQSDpBCg/NR8NENrVPhAJSf+XbxBG4rPFaaonlEoE9wHQmun+cls499TQGSb7ZyPBRlzgKfpeg==",
- "dev": true,
- "license": "MIT"
- },
- "node_modules/picocolors": {
- "version": "1.1.1",
- "resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.1.1.tgz",
- "integrity": "sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA==",
- "dev": true,
- "license": "ISC"
- },
- "node_modules/postcss": {
- "version": "8.5.6",
- "resolved": "https://registry.npmjs.org/postcss/-/postcss-8.5.6.tgz",
- "integrity": "sha512-3Ybi1tAuwAP9s0r1UQ2J4n5Y0G05bJkpUIO0/bI9MhwmD70S5aTWbXGBwxHrelT+XM1k6dM0pk+SwNkpTRN7Pg==",
- "dev": true,
- "funding": [
- {
- "type": "opencollective",
- "url": "https://opencollective.com/postcss/"
- },
- {
- "type": "tidelift",
- "url": "https://tidelift.com/funding/github/npm/postcss"
- },
- {
- "type": "github",
- "url": "https://github.com/sponsors/ai"
- }
- ],
- "license": "MIT",
- "dependencies": {
- "nanoid": "^3.3.11",
- "picocolors": "^1.1.1",
- "source-map-js": "^1.2.1"
- },
- "engines": {
- "node": "^10 || ^12 || >=14"
- }
- },
- "node_modules/proxy-from-env": {
- "version": "1.1.0",
- "resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz",
- "integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==",
- "license": "MIT"
- },
- "node_modules/react": {
- "version": "18.3.1",
- "resolved": "https://registry.npmjs.org/react/-/react-18.3.1.tgz",
- "integrity": "sha512-wS+hAgJShR0KhEvPJArfuPVN1+Hz1t0Y6n5jLrGQbkb4urgPE/0Rve+1kMB1v/oWgHgm4WIcV+i7F2pTVj+2iQ==",
- "license": "MIT",
- "dependencies": {
- "loose-envify": "^1.1.0"
- },
- "engines": {
- "node": ">=0.10.0"
- }
- },
- "node_modules/react-dom": {
- "version": "18.3.1",
- "resolved": "https://registry.npmjs.org/react-dom/-/react-dom-18.3.1.tgz",
- "integrity": "sha512-5m4nQKp+rZRb09LNH59GM4BxTh9251/ylbKIbpe7TpGxfJ+9kv6BLkLBXIjjspbgbnIBNqlI23tRnTWT0snUIw==",
- "license": "MIT",
- "dependencies": {
- "loose-envify": "^1.1.0",
- "scheduler": "^0.23.2"
- },
- "peerDependencies": {
- "react": "^18.3.1"
- }
- },
- "node_modules/react-refresh": {
- "version": "0.17.0",
- "resolved": "https://registry.npmjs.org/react-refresh/-/react-refresh-0.17.0.tgz",
- "integrity": "sha512-z6F7K9bV85EfseRCp2bzrpyQ0Gkw1uLoCel9XBVWPg/TjRj94SkJzUTGfOa4bs7iJvBWtQG0Wq7wnI0syw3EBQ==",
- "dev": true,
- "license": "MIT",
- "engines": {
- "node": ">=0.10.0"
- }
- },
- "node_modules/react-router": {
- "version": "6.30.1",
- "resolved": "https://registry.npmjs.org/react-router/-/react-router-6.30.1.tgz",
- "integrity": "sha512-X1m21aEmxGXqENEPG3T6u0Th7g0aS4ZmoNynhbs+Cn+q+QGTLt+d5IQ2bHAXKzKcxGJjxACpVbnYQSCRcfxHlQ==",
- "license": "MIT",
- "dependencies": {
- "@remix-run/router": "1.23.0"
- },
- "engines": {
- "node": ">=14.0.0"
- },
- "peerDependencies": {
- "react": ">=16.8"
- }
- },
- "node_modules/react-router-dom": {
- "version": "6.30.1",
- "resolved": "https://registry.npmjs.org/react-router-dom/-/react-router-dom-6.30.1.tgz",
- "integrity": "sha512-llKsgOkZdbPU1Eg3zK8lCn+sjD9wMRZZPuzmdWWX5SUs8OFkN5HnFVC0u5KMeMaC9aoancFI/KoLuKPqN+hxHw==",
- "license": "MIT",
- "dependencies": {
- "@remix-run/router": "1.23.0",
- "react-router": "6.30.1"
- },
- "engines": {
- "node": ">=14.0.0"
- },
- "peerDependencies": {
- "react": ">=16.8",
- "react-dom": ">=16.8"
- }
- },
- "node_modules/rollup": {
- "version": "4.52.4",
- "resolved": "https://registry.npmjs.org/rollup/-/rollup-4.52.4.tgz",
- "integrity": "sha512-CLEVl+MnPAiKh5pl4dEWSyMTpuflgNQiLGhMv8ezD5W/qP8AKvmYpCOKRRNOh7oRKnauBZ4SyeYkMS+1VSyKwQ==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "@types/estree": "1.0.8"
- },
- "bin": {
- "rollup": "dist/bin/rollup"
- },
- "engines": {
- "node": ">=18.0.0",
- "npm": ">=8.0.0"
- },
- "optionalDependencies": {
- "@rollup/rollup-android-arm-eabi": "4.52.4",
- "@rollup/rollup-android-arm64": "4.52.4",
- "@rollup/rollup-darwin-arm64": "4.52.4",
- "@rollup/rollup-darwin-x64": "4.52.4",
- "@rollup/rollup-freebsd-arm64": "4.52.4",
- "@rollup/rollup-freebsd-x64": "4.52.4",
- "@rollup/rollup-linux-arm-gnueabihf": "4.52.4",
- "@rollup/rollup-linux-arm-musleabihf": "4.52.4",
- "@rollup/rollup-linux-arm64-gnu": "4.52.4",
- "@rollup/rollup-linux-arm64-musl": "4.52.4",
- "@rollup/rollup-linux-loong64-gnu": "4.52.4",
- "@rollup/rollup-linux-ppc64-gnu": "4.52.4",
- "@rollup/rollup-linux-riscv64-gnu": "4.52.4",
- "@rollup/rollup-linux-riscv64-musl": "4.52.4",
- "@rollup/rollup-linux-s390x-gnu": "4.52.4",
- "@rollup/rollup-linux-x64-gnu": "4.52.4",
- "@rollup/rollup-linux-x64-musl": "4.52.4",
- "@rollup/rollup-openharmony-arm64": "4.52.4",
- "@rollup/rollup-win32-arm64-msvc": "4.52.4",
- "@rollup/rollup-win32-ia32-msvc": "4.52.4",
- "@rollup/rollup-win32-x64-gnu": "4.52.4",
- "@rollup/rollup-win32-x64-msvc": "4.52.4",
- "fsevents": "~2.3.2"
- }
- },
- "node_modules/scheduler": {
- "version": "0.23.2",
- "resolved": "https://registry.npmjs.org/scheduler/-/scheduler-0.23.2.tgz",
- "integrity": "sha512-UOShsPwz7NrMUqhR6t0hWjFduvOzbtv7toDH1/hIrfRNIDBnnBWd0CwJTGvTpngVlmwGCdP9/Zl/tVrDqcuYzQ==",
- "license": "MIT",
- "dependencies": {
- "loose-envify": "^1.1.0"
- }
- },
- "node_modules/semver": {
- "version": "6.3.1",
- "resolved": "https://registry.npmjs.org/semver/-/semver-6.3.1.tgz",
- "integrity": "sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==",
- "dev": true,
- "license": "ISC",
- "bin": {
- "semver": "bin/semver.js"
- }
- },
- "node_modules/source-map-js": {
- "version": "1.2.1",
- "resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz",
- "integrity": "sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA==",
- "dev": true,
- "license": "BSD-3-Clause",
- "engines": {
- "node": ">=0.10.0"
- }
- },
- "node_modules/typescript": {
- "version": "5.9.3",
- "resolved": "https://registry.npmjs.org/typescript/-/typescript-5.9.3.tgz",
- "integrity": "sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw==",
- "dev": true,
- "license": "Apache-2.0",
- "bin": {
- "tsc": "bin/tsc",
- "tsserver": "bin/tsserver"
- },
- "engines": {
- "node": ">=14.17"
- }
- },
- "node_modules/update-browserslist-db": {
- "version": "1.1.3",
- "resolved": "https://registry.npmjs.org/update-browserslist-db/-/update-browserslist-db-1.1.3.tgz",
- "integrity": "sha512-UxhIZQ+QInVdunkDAaiazvvT/+fXL5Osr0JZlJulepYu6Jd7qJtDZjlur0emRlT71EN3ScPoE7gvsuIKKNavKw==",
- "dev": true,
- "funding": [
- {
- "type": "opencollective",
- "url": "https://opencollective.com/browserslist"
- },
- {
- "type": "tidelift",
- "url": "https://tidelift.com/funding/github/npm/browserslist"
- },
- {
- "type": "github",
- "url": "https://github.com/sponsors/ai"
- }
- ],
- "license": "MIT",
- "dependencies": {
- "escalade": "^3.2.0",
- "picocolors": "^1.1.1"
- },
- "bin": {
- "update-browserslist-db": "cli.js"
- },
- "peerDependencies": {
- "browserslist": ">= 4.21.0"
- }
- },
- "node_modules/vite": {
- "version": "5.4.20",
- "resolved": "https://registry.npmjs.org/vite/-/vite-5.4.20.tgz",
- "integrity": "sha512-j3lYzGC3P+B5Yfy/pfKNgVEg4+UtcIJcVRt2cDjIOmhLourAqPqf8P7acgxeiSgUB7E3p2P8/3gNIgDLpwzs4g==",
- "dev": true,
- "license": "MIT",
- "dependencies": {
- "esbuild": "^0.21.3",
- "postcss": "^8.4.43",
- "rollup": "^4.20.0"
- },
- "bin": {
- "vite": "bin/vite.js"
- },
- "engines": {
- "node": "^18.0.0 || >=20.0.0"
- },
- "funding": {
- "url": "https://github.com/vitejs/vite?sponsor=1"
- },
- "optionalDependencies": {
- "fsevents": "~2.3.3"
- },
- "peerDependencies": {
- "@types/node": "^18.0.0 || >=20.0.0",
  <React.StrictMode>
-    <QueryClientProvider client={queryClient}>
-      <App />
-    </QueryClientProvider>
+    <App />
  </React.StrictMode>,
- "sugarss": "*",
- "terser": "^5.4.0"
- },
- "peerDependenciesMeta": {
- "@types/node": {
- "optional": true
- },
- "less": {
- "optional": true
- },
- "lightningcss": {
- "optional": true
- },
- "sass": {
- "optional": true
- },
- "sass-embedded": {
- "optional": true
- },
- "stylus": {
- "optional": true
- },
- "sugarss": {
- "optional": true
- },
- "terser": {
- "optional": true
- }
- }
- },
- "node_modules/yallist": {
- "version": "3.1.1",
- "resolved": "https://registry.npmjs.org/yallist/-/yallist-3.1.1.tgz",
- "integrity": "sha512-a4UGQaWPH59mOXUYnAG2ewncQS4i4F43Tv3JoAM+s2VDAmS9NsK8GpDMLrCHPksFT7h3K6TOoUNn2pb7RoXx4g==",
- "dev": true,
- "license": "ISC"
- }
- }
-}
diff --git a/frontend/package.json b/frontend/package.json
index 14308f2ed04188e463c4d3b148870c33e2de19de..594bae6137ac1bfcc081518c75072aba31131012 100644
--- a/frontend/package.json
+++ b/frontend/package.json
@@ -9,11 +9,8 @@
"preview": "vite preview"
},
"dependencies": {
- "@tanstack/react-query": "^5.51.11",
- "axios": "^1.7.7",
"react": "^18.3.1",
- "react-dom": "^18.3.1",
- "react-router-dom": "^6.26.0"
+ "react-dom": "^18.3.1"
},
"devDependencies": {
"@types/react": "^18.3.3",
diff --git a/frontend/public/index.html b/frontend/public/index.html
deleted file mode 100644
index bffc5e451f87fa4a8cc5c538c7460f7ff06461e7..0000000000000000000000000000000000000000
--- a/frontend/public/index.html
+++ /dev/null
@@ -1,13 +0,0 @@
-<!DOCTYPE html>
-<html lang="en">
-  <head>
-    <meta charset="UTF-8" />
-    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
-    <title>RightCodes</title>
-  </head>
-  <body>
-    <div id="root"></div>
-    <script type="module" src="/src/main.tsx"></script>
-  </body>
-</html>
diff --git a/frontend/src/App.tsx b/frontend/src/App.tsx
index 8ac1d9b10d760c9a037dee49d73ee050314669b8..aa711e4421051c5ef444e5efb343c4ebb61a9728 100644
--- a/frontend/src/App.tsx
+++ b/frontend/src/App.tsx
@@ -1,34 +1,3 @@
-import { BrowserRouter, NavLink, Route, Routes } from "react-router-dom";
-import SessionsPage from "./pages/SessionsPage";
-import PresetsPage from "./pages/PresetsPage";
-import PresetEditPage from "./pages/PresetEditPage";
-import ObservatoryPage from "./pages/ObservatoryPage";
-
export default function App() {
-  return (
-    <BrowserRouter>
-      <div className="app-shell">
-        <header>
-          <nav className="primary-nav">
-            <NavLink to="/">Sessions</NavLink>
-            <NavLink to="/presets">Presets</NavLink>
-            <NavLink to="/observatory">Observatory</NavLink>
-          </nav>
-        </header>
-        <main>
-          <Routes>
-            <Route path="/" element={<SessionsPage />} />
-            <Route path="/presets" element={<PresetsPage />} />
-            <Route path="/presets/:presetId" element={<PresetEditPage />} />
-            <Route path="/observatory" element={<ObservatoryPage />} />
-          </Routes>
-        </main>
-      </div>
-    </BrowserRouter>
-  );
+  return <div className="black-screen" />;
}
diff --git a/frontend/src/components/PresetManager.tsx b/frontend/src/components/PresetManager.tsx
deleted file mode 100644
index 0017f44ccbd6503c8fc0f3720e419a0bd56d8d50..0000000000000000000000000000000000000000
--- a/frontend/src/components/PresetManager.tsx
+++ /dev/null
@@ -1,193 +0,0 @@
-import { useState } from "react";
-import { Link } from "react-router-dom";
-import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
-import { createPreset, deletePreset, fetchPresets } from "../services/api";
-import type { StandardsPreset } from "../types/session";
-
-export default function PresetManager() {
- const queryClient = useQueryClient();
- const { data: presets = [], isLoading, isError, error } = useQuery({
- queryKey: ["presets"],
- queryFn: fetchPresets,
- refetchInterval: (currentPresets) =>
- Array.isArray(currentPresets) &&
- currentPresets.some((preset) => preset.status === "processing")
- ? 2000
- : false,
- });
-
- const [name, setName] = useState("");
- const [description, setDescription] = useState("");
-  const [files, setFiles] = useState<File[]>([]);
-
- const createPresetMutation = useMutation({
- mutationFn: createPreset,
- onSuccess: () => {
- queryClient.invalidateQueries({ queryKey: ["presets"] });
- resetForm();
- },
- onError: (err: unknown) => {
- alert(`Preset creation failed: ${(err as Error).message}`);
- },
- });
-
- const deletePresetMutation = useMutation({
- mutationFn: (id: string) => deletePreset(id),
- onSuccess: () => {
- queryClient.invalidateQueries({ queryKey: ["presets"] });
- },
- onError: (err: unknown) => {
- alert(`Preset deletion failed: ${(err as Error).message}`);
- },
- });
-
- const resetForm = () => {
- setName("");
- setDescription("");
- setFiles([]);
- const fileInput = document.getElementById("preset-standards-input") as HTMLInputElement | null;
- if (fileInput) {
- fileInput.value = "";
- }
- };
-
- const handleSubmit = (event: React.FormEvent) => {
- event.preventDefault();
- if (!files.length) {
- alert("Select at least one PDF to build a preset.");
- return;
- }
- createPresetMutation.mutate({
- name,
- description: description || undefined,
- files,
- });
- };
-
- const handleDelete = (preset: StandardsPreset) => {
- if (!window.confirm(`Delete preset "${preset.name}"? This cannot be undone.`)) {
- return;
- }
- deletePresetMutation.mutate(preset.id);
- };
-
- return (
-
- Manage Standards Presets
-
- Reuse commonly referenced standards without re-uploading them for every session.
-
-
-
-
-
-        <h3>Saved Presets</h3>
-        {isLoading && <p>Loading presets...</p>}
-        {isError && <p className="error">Failed to load presets: {(error as Error).message}</p>}
-        {!isLoading && !presets.length && <p className="muted">No presets created yet.</p>}
- {presets.length > 0 && (
-
- {presets.map((preset: StandardsPreset) => {
- const totalDocs = preset.total_count || preset.document_count || 0;
- const processed = Math.min(preset.processed_count || 0, totalDocs);
- const progressPercent = totalDocs
- ? Math.min(100, Math.round((processed / totalDocs) * 100))
- : preset.status === "ready"
- ? 100
- : 0;
- const nextDoc = Math.min(processed + 1, totalDocs);
- return (
- -
-
-
-
{preset.name}
-
- {preset.document_count} file{preset.document_count === 1 ? "" : "s"} · Updated{" "}
- {new Date(preset.updated_at).toLocaleString()}
-
- {preset.description &&
{preset.description}
}
-
-
- {preset.status === "ready" && Ready}
- {preset.status === "processing" && (
-
- Processing {processed}/{totalDocs}
-
- )}
- {preset.status === "failed" && (
-
- Failed{preset.last_error ? `: ${preset.last_error}` : ""}
-
- )}
-
-
- {preset.status === "processing" && (
-
-
-
- Currently processing document {nextDoc} of {totalDocs}.
-
-
- )}
-
- {preset.documents.map((doc) => doc.split(/[/\\]/).pop() ?? doc).join(", ")}
-
-
-
- Edit
-
-
-
-
- );
- })}
-
- )}
-
-
- );
-}
diff --git a/frontend/src/components/SessionDetails.tsx b/frontend/src/components/SessionDetails.tsx
deleted file mode 100644
index fee5fa463dab750debb130d4d4b18a8ffd0d3308..0000000000000000000000000000000000000000
--- a/frontend/src/components/SessionDetails.tsx
+++ /dev/null
@@ -1,578 +0,0 @@
-import { useEffect, useState } from "react";
-import { useQuery } from "@tanstack/react-query";
-import { fetchSessionById } from "../services/api";
-import type { Session, SessionSummary } from "../types/session";
-
-interface DocParse {
- path: string;
- paragraphs: Array<{ index: number; text: string; style?: string | null; heading_level?: number | null; references?: string[] }>;
- tables: Array<{ index: number; rows: string[][]; references?: string[] }>;
-  summary?: Record<string, unknown>;
-}
-
-interface StandardsParseEntry {
- path: string;
-  summary?: Record<string, unknown>;
- chunks?: Array<{
- page_number: number;
- chunk_index: number;
- text: string;
- heading?: string | null;
- clause_numbers?: string[];
- references?: string[];
- is_ocr?: boolean;
- }>;
-}
-
-interface ExtractionResult {
- document_summary?: string;
- sections?: Array<{ paragraph_index: number; text: string; references?: string[] }>;
- tables?: Array<{ table_index: number; summary: string; references?: string[] }>;
- references?: string[];
- notes?: string;
-}
-
-interface MappingResult {
- mappings?: Array<{
- source_reference: string;
- source_context?: string;
- target_reference: string;
- target_clause?: string;
- target_summary?: string;
- confidence?: number;
- rationale?: string;
- }>;
- unmapped_references?: string[];
- notes?: string;
-}
-
-interface RewritePlan {
- replacements?: Array<{
- paragraph_index: number;
- original_text: string;
- updated_text: string;
-    applied_mapping?: string;
-    applied_mappings?: string[];
-    change_reason?: string;
-  }>;
-  table_replacements?: Array<Record<string, unknown>>;
-  change_log?: Array<{
-    reference: string;
-    target_reference: string;
-    affected_paragraphs?: number[];
-    note?: string;
-  }>;
-  notes?: string;
-}
-
-interface ValidationReport {
- issues?: Array<{ description?: string; severity?: string }>;
- verdict?: string;
- notes?: string;
-}
-
-interface ExportManifest {
- export_path?: string | null;
- notes?: string;
- replacement_count?: number;
-}
-
-interface SessionDetailsProps {
- session: SessionSummary | null;
-}
-
-export default function SessionDetails({ session }: SessionDetailsProps) {
- if (!session) {
-    return <p className="muted">Select a session to inspect its progress.</p>;
- }
-
- const apiBaseUrl = import.meta.env.VITE_API_BASE_URL ?? "http://localhost:8000/api";
-  const partialSession = session as Partial<Session>;
- const summaryStandards = Array.isArray(partialSession.standards_docs)
- ? [...partialSession.standards_docs]
- : [];
- const summaryLogs = Array.isArray(partialSession.logs) ? [...partialSession.logs] : [];
- const summaryMetadata =
- partialSession.metadata && typeof partialSession.metadata === "object"
-      ? { ...(partialSession.metadata as Record<string, unknown>) }
- : {};
- const normalizedSummary: Session = {
- ...session,
- standards_docs: summaryStandards,
- logs: summaryLogs,
- metadata: summaryMetadata,
- };
- const sessionQuery = useQuery({
- queryKey: ["session", session.id],
- queryFn: () => fetchSessionById(session.id),
- enabled: Boolean(session?.id),
- refetchInterval: (data) =>
- data && ["review", "completed", "failed"].includes(data.status) ? false : 2000,
- placeholderData: () => normalizedSummary,
- });
-
- const latest = sessionQuery.data ?? normalizedSummary;
- const standardsDocs = Array.isArray(latest.standards_docs) ? latest.standards_docs : [];
- const standardNames = standardsDocs.map((path) => path.split(/[/\\]/).pop() ?? path);
- const activityLogs = Array.isArray(latest.logs) ? latest.logs : [];
- const inFlight = !["review", "completed", "failed"].includes(latest.status);
-
- useEffect(() => {
- if (sessionQuery.data) {
- console.log(`[Session ${sessionQuery.data.id}] status: ${sessionQuery.data.status}`);
- }
- }, [sessionQuery.data?.id, sessionQuery.data?.status]);
-
-  const metadata = (latest.metadata ?? {}) as Record<string, unknown>;
- const docParse = metadata.doc_parse as DocParse | undefined;
- const standardsParse = metadata.standards_parse as StandardsParseEntry[] | undefined;
- const extractionResult = metadata.extraction_result as ExtractionResult | undefined;
- const mappingResult = metadata.mapping_result as MappingResult | undefined;
- const rewritePlan = metadata.rewrite_plan as RewritePlan | undefined;
- const validationReport = metadata.validation_report as ValidationReport | undefined;
- const exportManifest = metadata.export_manifest as ExportManifest | undefined;
- const exportDownloadUrl =
- exportManifest?.export_path !== undefined
- ? `${apiBaseUrl}/sessions/${latest.id}/export`
- : null;
-
- const docSummary = docParse?.summary as
- | { paragraph_count?: number; table_count?: number; reference_count?: number }
- | undefined;
-
- const standardsProgress = metadata.standards_ingest_progress as
- | {
- total?: number;
- processed?: number;
- current_file?: string;
- cached_count?: number;
- parsed_count?: number;
- completed?: boolean;
- }
- | undefined;
- const standardsProgressFile =
- standardsProgress?.current_file?.split(/[/\\]/).pop() ?? undefined;
- const standardsProgressTotal =
- typeof standardsProgress?.total === "number" ? standardsProgress.total : undefined;
- const standardsProgressProcessed =
- standardsProgressTotal !== undefined
- ? Math.min(
- typeof standardsProgress?.processed === "number" ? standardsProgress.processed : 0,
- standardsProgressTotal
- )
- : undefined;
- const standardsCachedCount =
- typeof standardsProgress?.cached_count === "number" ? standardsProgress.cached_count : undefined;
- const standardsParsedCount =
- typeof standardsProgress?.parsed_count === "number" ? standardsProgress.parsed_count : undefined;
- const pipelineProgress = metadata.pipeline_progress as
- | {
- total?: number;
- current_index?: number;
- stage?: string | null;
- status?: string;
- }
- | undefined;
- const pipelineStageLabel = pipelineProgress?.stage
- ? pipelineProgress.stage
- .split("_")
- .map((part) => part.charAt(0).toUpperCase() + part.slice(1))
- .join(" ")
- : undefined;
- const pipelineStageStatus = pipelineProgress?.status
- ? pipelineProgress.status.charAt(0).toUpperCase() + pipelineProgress.status.slice(1)
- : undefined;
- const presetMetadata = Array.isArray(metadata.presets)
- ? (metadata.presets as Array<{ id?: string; name?: string | null; documents?: string[] }>)
- : [];
- const presetNames = presetMetadata
- .map((item) => item?.name || item?.id || "")
- .filter((value) => Boolean(value)) as string[];
- const presetDocTotal = presetMetadata.reduce(
- (acc, item) => acc + (Array.isArray(item?.documents) ? item.documents.length : 0),
- 0
- );
-
- const [showAllMappings, setShowAllMappings] = useState(false);
- const [showAllReplacements, setShowAllReplacements] = useState(false);
-
- const sections = extractionResult?.sections ?? [];
- const sectionsPreview = sections.slice(0, 6);
-
- const mappingLimit = 6;
- const mappings = mappingResult?.mappings ?? [];
- const mappingsToDisplay = showAllMappings ? mappings : mappings.slice(0, mappingLimit);
-
- const replacementLimit = 6;
- const replacements = rewritePlan?.replacements ?? [];
- const replacementsToDisplay = showAllReplacements ? replacements : replacements.slice(0, replacementLimit);
-
- const tableReplacements = rewritePlan?.table_replacements ?? [];
- const changeLog = rewritePlan?.change_log ?? [];
-
- return (
-
-
-
Session Overview
-
- {sessionQuery.isFetching && Updating...}
-
-
-
- {inFlight && (
-
-
-
Processing pipeline... refreshing every 2s.
- {pipelineProgress?.total ? (
-
- Pipeline stages:{" "}
- {Math.min(pipelineProgress.current_index ?? 0, pipelineProgress.total)} /{" "}
- {pipelineProgress.total}
- {pipelineStageLabel ? ` — ${pipelineStageLabel}` : ""}
- {pipelineStageStatus ? ` (${pipelineStageStatus})` : ""}
-
- ) : null}
- {standardsProgressTotal !== undefined ? (
-
- Standards PDFs: {standardsProgressProcessed ?? 0} / {standardsProgressTotal}
- {standardsProgressFile ? ` — ${standardsProgressFile}` : ""}
- {standardsCachedCount !== undefined || standardsParsedCount !== undefined ? (
- <>
- {" "}(cached {standardsCachedCount ?? 0}, parsed {standardsParsedCount ?? 0})
- >
- ) : null}
-
- ) : null}
-
- )}
-
-
-
- Status
- - {latest.status}
-
-
-
- Created
- - {new Date(latest.created_at).toLocaleString()}
-
-
-
- Updated
- - {new Date(latest.updated_at).toLocaleString()}
-
-
-
- Origin
- - {latest.target_standard}
-
-
-
- Destination
- - {latest.destination_standard}
-
- {standardNames.length > 0 && (
-
-
- Standards PDFs
- - {standardNames.join(", ")}
-
- )}
- {presetNames.length ? (
-
-
- Preset
- -
- {presetNames.join(", ")}
- {presetDocTotal ? ` (${presetDocTotal} file${presetDocTotal === 1 ? "" : "s"})` : ""}
-
-
- ) : null}
-
- {latest.status === "review" && (
-
- Pipeline completed. Review the AI change set and validation notes next.
-
- )}
- {latest.last_error && (
-
- Last error: {latest.last_error}
-
- )}
-
-
Activity Log
- {activityLogs.length ? (
-
- {activityLogs.map((entry, index) => (
- - {entry}
- ))}
-
- ) : (
-
No activity recorded yet.
- )}
-
-
- {docParse && (
-
-
Document Parsing
-
- Analysed {docSummary?.paragraph_count ?? docParse.paragraphs.length} paragraphs and{" "}
- {docSummary?.table_count ?? docParse.tables.length} tables.
-
-
- {docParse.paragraphs.slice(0, 5).map((paragraph) => (
- -
- Paragraph {paragraph.index}: {paragraph.text}
- {paragraph.references?.length ? (
- — refs: {paragraph.references.join(", ")}
- ) : null}
-
- ))}
-
-
- )}
-
- {standardsParse && standardsParse.length > 0 && (
-
-
Standards Corpus
-
- {standardsParse.slice(0, 4).map((entry) => {
- const name = entry.path.split(/[/\\]/).pop() ?? entry.path;
- const summary = entry.summary as
- | { chunk_count?: number; reference_count?: number; ocr_chunk_count?: number }
- | undefined;
- const chunkCount = summary?.chunk_count ?? entry.chunks?.length ?? 0;
- const ocrChunkCount =
- summary?.ocr_chunk_count ?? entry.chunks?.filter((chunk) => chunk.is_ocr)?.length ?? 0;
- return (
- -
- {name} - {chunkCount} chunks analysed
- {ocrChunkCount ? (
-
- {" "}
- ({ocrChunkCount} OCR supplement{ocrChunkCount === 1 ? "" : "s"})
-
- ) : null}
-
- );
- })}
-
-
- )}
-
- {extractionResult && (
-
-
Extraction Summary
- {extractionResult.document_summary && (
-
{extractionResult.document_summary}
- )}
- {extractionResult.references && extractionResult.references.length ? (
-
- References detected: {extractionResult.references.slice(0, 20).join(", ")}
- {extractionResult.references.length > 20 ? "..." : ""}
-
- ) : null}
- {sectionsPreview.length ? (
-
- {sectionsPreview.map((section) => (
- -
- Paragraph {section.paragraph_index}: {section.text}
-
- ))}
-
- ) : null}
- {extractionResult.notes &&
{extractionResult.notes}
}
-
- )}
-
- {mappingResult && (
-
-
Reference Mapping
- {mappings.length ? (
- <>
-
- Showing {mappingsToDisplay.length} of {mappings.length} mappings.
-
-
- {mappings.length > mappingLimit && (
-
- )}
- >
- ) : (
-
- {mappingResult.notes ?? "No mapping actions recorded."}
-
- )}
-
- )}
-
- {rewritePlan && (
-
-
Rewrite Plan
- {replacements.length ? (
- <>
-
- Showing {replacementsToDisplay.length} of {replacements.length} replacements.
-
-
- {replacementsToDisplay.map((replacement, idx) => {
- const appliedMappings =
- Array.isArray(replacement.applied_mappings) && replacement.applied_mappings.length
- ? replacement.applied_mappings
- : replacement.applied_mapping
- ? [replacement.applied_mapping]
- : [];
- return (
- -
- Paragraph {replacement.paragraph_index}
-
-
- Original: {replacement.original_text}
-
-
- Updated: {replacement.updated_text}
-
- {appliedMappings.length ? (
-
Mapping: {appliedMappings.join(", ")}
- ) : null}
- {replacement.change_reason && (
-
Reason: {replacement.change_reason}
- )}
-
-
- );
- })}
-
- {replacements.length > replacementLimit && (
-
- )}
- >
- ) : (
-
{rewritePlan.notes ?? "No rewrite actions required."}
- )}
- {tableReplacements.length ? (
-
- ) : null}
- {changeLog.length ? (
-
-
Change Log
-
- {changeLog.map((entry, idx) => (
- -
- {entry.reference}{" -> "}{entry.target_reference}
- {entry.affected_paragraphs && entry.affected_paragraphs.length ? (
- {" "}(paragraphs {entry.affected_paragraphs.join(", ")})
- ) : null}
- {entry.note &&
{entry.note}
}
-
- ))}
-
-
- ) : null}
- {rewritePlan.notes && replacements.length ? (
-
{rewritePlan.notes}
- ) : null}
-
- )}
-
- {validationReport && (
-
-
Validation
- {validationReport.verdict && (
-
- Verdict: {validationReport.verdict}
-
- )}
- {validationReport.issues?.length ? (
-
- {validationReport.issues.map((issue, idx) => (
- -
- {issue.description ?? "Issue reported"}
- {issue.severity && ({issue.severity})}
-
- ))}
-
- ) : (
-
No validation issues reported.
- )}
- {validationReport.notes &&
{validationReport.notes}
}
-
- )}
-
- {exportManifest && (
-
-
Export
- {exportDownloadUrl ? (
- <>
-
- {exportManifest.notes ?? "Converted document generated using rewrite plan."}
-
- {typeof exportManifest.replacement_count === "number" && (
-
- {exportManifest.replacement_count} paragraph replacements applied.
-
- )}
-
- Download DOCX
-
- >
- ) : (
-
{exportManifest.notes ?? "Export not yet generated."}
- )}
-
- )}
-
- );
-}
-
diff --git a/frontend/src/components/SessionList.tsx b/frontend/src/components/SessionList.tsx
deleted file mode 100644
index 8f76c939be4d4f1eb098a350e1c54da0c6c1c43c..0000000000000000000000000000000000000000
--- a/frontend/src/components/SessionList.tsx
+++ /dev/null
@@ -1,59 +0,0 @@
-import { useQuery } from "@tanstack/react-query";
-import { fetchSessions } from "../services/api";
-import type { SessionSummary } from "../types/session";
-
-interface SessionListProps {
- selectedId?: string | null;
- onSelect?: (session: SessionSummary) => void;
-}
-
-export default function SessionList({ selectedId, onSelect }: SessionListProps) {
- const { data, isLoading, isError, error } = useQuery({
- queryKey: ["sessions"],
- queryFn: fetchSessions,
- refetchInterval: 5000,
- });
- const dateFormatter = new Intl.DateTimeFormat(undefined, {
- dateStyle: "medium",
- timeStyle: "short",
- });
-
- if (isLoading) {
-    return <p>Loading sessions...</p>;
- }
- if (isError) {
-    return <p className="error">Failed to load sessions: {(error as Error).message}</p>;
- }
-
- if (!data?.length) {
-    return <p className="muted">No sessions yet. Upload a report to get started.</p>;
- }
-
- return (
-
-
Recent Sessions
-
Auto-refreshing every 5s.
-
- {data.map((session) => (
- - onSelect?.(session)}
- >
-
-
{session.name}
-
- {session.target_standard} {"->"} {session.destination_standard}
-
-
- {dateFormatter.format(new Date(session.created_at))} {" · "}
- {session.source_doc.split(/[/\\]/).pop() ?? session.source_doc}
-
-
- {session.status}
-
- ))}
-
-
- );
-}
diff --git a/frontend/src/components/UploadForm.tsx b/frontend/src/components/UploadForm.tsx
deleted file mode 100644
index 43f6bbde98db547f4db4127c29a5370197c4addd..0000000000000000000000000000000000000000
--- a/frontend/src/components/UploadForm.tsx
+++ /dev/null
@@ -1,189 +0,0 @@
-import { useState } from "react";
-import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
-import { fetchPresets, uploadSession } from "../services/api";
-import type { Session, StandardsPreset } from "../types/session";
-
-interface UploadFormProps {
- onSuccess?: (session: Session) => void;
-}
-
-export default function UploadForm({ onSuccess }: UploadFormProps) {
- const queryClient = useQueryClient();
- const [name, setName] = useState("");
- const [targetStandard, setTargetStandard] = useState("");
- const [destinationStandard, setDestinationStandard] = useState("");
-  const [file, setFile] = useState<File | null>(null);
-  const [standardsFiles, setStandardsFiles] = useState<File[]>([]);
- const [selectedPresetId, setSelectedPresetId] = useState("");
-
- const { data: presets = [], isLoading: presetsLoading, isError: presetsError } = useQuery({
- queryKey: ["presets"],
- queryFn: fetchPresets,
- refetchInterval: (currentPresets) =>
- Array.isArray(currentPresets) &&
- currentPresets.some((preset) => preset.status === "processing")
- ? 2000
- : false,
- });
- const selectedPreset = presets.find((preset) => preset.id === selectedPresetId);
-
- const mutation = useMutation({
- mutationFn: uploadSession,
- onSuccess: (session) => {
- console.log("Session created:", session);
- queryClient.invalidateQueries({ queryKey: ["sessions"] });
- onSuccess?.(session);
- resetForm();
- },
- onError: (error: unknown) => {
- console.error("Session creation failed:", error);
- },
- });
-
- function resetForm() {
- setName("");
- setTargetStandard("");
- setDestinationStandard("");
- setFile(null);
- setStandardsFiles([]);
- setSelectedPresetId("");
- const reportInput = document.getElementById("source-doc-input") as HTMLInputElement | null;
- if (reportInput) {
- reportInput.value = "";
- }
- const standardsInput = document.getElementById("standards-doc-input") as HTMLInputElement | null;
- if (standardsInput) {
- standardsInput.value = "";
- }
- }
-
- const handleSubmit = (event: React.FormEvent) => {
- event.preventDefault();
- if (!file) {
- alert("Please select a report to upload.");
- return;
- }
- if (!standardsFiles.length && !selectedPresetId) {
- alert("Upload at least one standards PDF or choose a preset bundle.");
- return;
- }
- if (selectedPresetId && selectedPreset && selectedPreset.status !== "ready") {
- alert("The selected preset is still processing. Please wait until it is ready.");
- return;
- }
- mutation.mutate({
- name,
- target_standard: targetStandard,
- destination_standard: destinationStandard,
- metadata: {},
- sourceFile: file,
- standardsFiles,
- standardsPresetId: selectedPresetId || undefined,
- });
- };
-
- return (
-
- );
-}
diff --git a/frontend/src/main.tsx b/frontend/src/main.tsx
index 4d49a2ccdf44fc7964a7af0aebd527692c258a4c..81da3460b0b1152941e22664426723efc0f935ad 100644
--- a/frontend/src/main.tsx
+++ b/frontend/src/main.tsx
@@ -1,15 +1,10 @@
import React from "react";
import ReactDOM from "react-dom/client";
-import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
import App from "./App";
import "./styles.css";
-const queryClient = new QueryClient();
-
ReactDOM.createRoot(document.getElementById("root") as HTMLElement).render(
-
-
-
+
,
);
diff --git a/frontend/src/pages/ObservatoryPage.tsx b/frontend/src/pages/ObservatoryPage.tsx
deleted file mode 100644
index b236224f459ea8e63ded4dc897d6f5f1c796116b..0000000000000000000000000000000000000000
--- a/frontend/src/pages/ObservatoryPage.tsx
+++ /dev/null
@@ -1,179 +0,0 @@
-import { useEffect, useMemo, useState } from "react";
-import { useQuery } from "@tanstack/react-query";
-import { fetchDiagnosticsEvents, fetchDiagnosticsTopology } from "../services/api";
-import type { DiagnosticsEvent, DiagnosticsNode, DiagnosticsTopology } from "../types/diagnostics";
-
-const EVENT_PULSE_WINDOW_MS = 5000;
-const EVENTS_LIMIT = 60;
-
-function buildNodeMap(topology?: DiagnosticsTopology): Map<string, DiagnosticsNode> {
-  if (!topology) {
-    return new Map();
-  }
-  return new Map<string, DiagnosticsNode>(topology.nodes.map((node) => [node.id, node]));
-}
-
-function useActiveNodes(events: DiagnosticsEvent[] | undefined, tick: number): Set<string> {
-  return useMemo(() => {
-    const active = new Set<string>();
- if (!events) {
- return active;
- }
- const now = Date.now();
- for (const event of events) {
- if (!event.node_id) {
- continue;
- }
- const timestamp = Date.parse(event.timestamp);
- if (Number.isNaN(timestamp)) {
- continue;
- }
- if (now - timestamp <= EVENT_PULSE_WINDOW_MS) {
- active.add(event.node_id);
- }
- }
- return active;
- }, [events, tick]);
-}
-
-const timeFormatter = new Intl.DateTimeFormat(undefined, {
- hour: "2-digit",
- minute: "2-digit",
- second: "2-digit",
-});
-
-export default function ObservatoryPage() {
- const [timeTick, setTimeTick] = useState(Date.now());
- useEffect(() => {
- const id = window.setInterval(() => setTimeTick(Date.now()), 1000);
- return () => window.clearInterval(id);
- }, []);
-
- const {
- data: topology,
- isLoading: topologyLoading,
- isError: topologyError,
- error: topologyErrorDetails,
- } = useQuery({
- queryKey: ["diagnostics", "topology"],
- queryFn: fetchDiagnosticsTopology,
- refetchInterval: 15000,
- });
-
- const {
- data: eventsData,
- isLoading: eventsLoading,
- isError: eventsError,
- error: eventsErrorDetails,
- } = useQuery({
- queryKey: ["diagnostics", "events"],
- queryFn: () => fetchDiagnosticsEvents(EVENTS_LIMIT),
- refetchInterval: 2000,
- });
-
- const nodeMap = buildNodeMap(topology);
- const activeNodes = useActiveNodes(eventsData, timeTick);
-
- const edges = useMemo(() => {
- if (!topology) {
- return [];
- }
- return topology.edges
- .map((edge) => {
- const source = nodeMap.get(edge.source);
- const target = nodeMap.get(edge.target);
- if (!source || !target) {
- return null;
- }
-        return (
-          <line
-            key={edge.id}
-            className="observatory-edge"
-            x1={source.position.x}
-            y1={source.position.y}
-            x2={target.position.x}
-            y2={target.position.y}
-          />
-        );
- })
- .filter(Boolean) as JSX.Element[];
- }, [nodeMap, topology]);
-
- const nodes = useMemo(() => {
- if (!topology) {
- return [];
- }
- return topology.nodes.map((node) => {
- const isActive = activeNodes.has(node.id);
- return (
-
-
-
- {node.label}
-
-
- );
- });
- }, [activeNodes, topology]);
-
- return (
-
-
-
System Observatory
-
- Visual map of key storage locations, services, and external dependencies. Nodes pulse when recent activity
- is detected.
-
-
-
-
- Topology
- {topologyLoading ? (
- Loading topology...
- ) : topologyError ? (
- Failed to load topology: {(topologyErrorDetails as Error).message}
- ) : (
-
- )}
-
-
-
-
- );
-}
diff --git a/frontend/src/pages/PresetEditPage.tsx b/frontend/src/pages/PresetEditPage.tsx
deleted file mode 100644
index a563c8fc3eb3e33a519daf566e996033db4cd721..0000000000000000000000000000000000000000
--- a/frontend/src/pages/PresetEditPage.tsx
+++ /dev/null
@@ -1,202 +0,0 @@
-import { useEffect, useState } from "react";
-import { Link, Navigate, useParams } from "react-router-dom";
-import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
-import {
- addPresetDocuments,
- fetchPresetById,
- removePresetDocument,
- updatePreset,
-} from "../services/api";
-import type { StandardsPreset } from "../types/session";
-
-export default function PresetEditPage() {
- const { presetId } = useParams<{ presetId: string }>();
- const queryClient = useQueryClient();
-
- const {
- data: preset,
- isLoading,
- isError,
- error,
- } = useQuery({
- queryKey: ["preset", presetId],
- queryFn: () => fetchPresetById(presetId ?? ""),
- enabled: Boolean(presetId),
- refetchInterval: (currentPreset) =>
- currentPreset?.status === "processing" ? 2000 : false,
- });
-
- const [name, setName] = useState("");
- const [description, setDescription] = useState("");
-
- useEffect(() => {
- if (preset) {
- setName(preset.name);
- setDescription(preset.description ?? "");
- }
- }, [preset?.id, preset?.name, preset?.description]);
-
- const updateMutation = useMutation({
- mutationFn: (payload: { name?: string; description?: string | null }) =>
- updatePreset(presetId ?? "", payload),
- onSuccess: (updated) => {
- queryClient.setQueryData(["preset", presetId], updated);
- queryClient.invalidateQueries({ queryKey: ["presets"] });
- },
- onError: (err: unknown) => {
- alert(`Failed to update preset: ${(err as Error).message}`);
- },
- });
-
- const addDocumentsMutation = useMutation({
- mutationFn: (files: File[]) => addPresetDocuments(presetId ?? "", files),
- onSuccess: (updated) => {
- queryClient.setQueryData(["preset", presetId], updated);
- queryClient.invalidateQueries({ queryKey: ["presets"] });
- },
- onError: (err: unknown) => {
- alert(`Failed to add documents: ${(err as Error).message}`);
- },
- });
-
- const removeDocumentMutation = useMutation({
- mutationFn: (documentPath: string) => removePresetDocument(presetId ?? "", documentPath),
- onSuccess: (updated) => {
- queryClient.setQueryData(["preset", presetId], updated);
- queryClient.invalidateQueries({ queryKey: ["presets"] });
- },
- onError: (err: unknown) => {
- alert(`Failed to remove document: ${(err as Error).message}`);
- },
- });
-
- if (!presetId) {
-    return <Navigate to="/presets" replace />;
- }
-
- const totalDocs = preset?.total_count ?? preset?.document_count ?? 0;
- const processedDocs = Math.min(preset?.processed_count ?? 0, totalDocs);
- const progressPercent = totalDocs
- ? Math.min(100, Math.round((processedDocs / totalDocs) * 100))
- : preset?.status === "ready"
- ? 100
- : 0;
- const nextDoc = Math.min(processedDocs + 1, totalDocs);
-
- const handleSave = (event: React.FormEvent) => {
- event.preventDefault();
- const trimmedName = name.trim();
- if (!trimmedName) {
- alert("Preset name cannot be empty.");
- return;
- }
- updateMutation.mutate({
- name: trimmedName,
- description: description.trim() || "",
- });
- };
-
-  const handleAddFiles = (event: React.ChangeEvent<HTMLInputElement>) => {
- const files = Array.from(event.target.files ?? []);
- if (!files.length) {
- return;
- }
- addDocumentsMutation.mutate(files);
- event.target.value = "";
- };
-
- const handleRemoveDocument = (documentPath: string) => {
- if (!window.confirm("Remove this document from the preset?")) {
- return;
- }
- removeDocumentMutation.mutate(documentPath);
- };
-
- return (
-
-
-
-
-
Edit Preset
-
Update preset metadata and manage the associated PDF files.
-
-
- ← Back to Presets
-
-
-
-      {isLoading && <p>Loading preset details...</p>}
-      {isError && <p className="error">Failed to load preset: {(error as Error).message}</p>}
- {preset && (
- <>
-
-
-
-
- Status: {preset.status}
- {preset.status === "processing" && (
- <>
- {" "}
- · {processedDocs}/{totalDocs} processed
- >
- )}
-
- {preset.last_error &&
Last error: {preset.last_error}
}
-
- {preset.status === "processing" && totalDocs > 0 && (
-
-
-
- Currently parsing document {nextDoc} of {totalDocs}.
-
-
- )}
-
-
-
Documents
- {preset.documents.length ? (
-
- {preset.documents.map((doc) => (
- -
- {doc.split(/[/\\]/).pop() ?? doc}
-
-
- ))}
-
- ) : (
-
No documents attached yet.
- )}
-
-
- >
- )}
-
-
- );
-}
-
diff --git a/frontend/src/pages/PresetsPage.tsx b/frontend/src/pages/PresetsPage.tsx
deleted file mode 100644
index 7eca0fa117fadb90597d18e97084fcbc3194c34e..0000000000000000000000000000000000000000
--- a/frontend/src/pages/PresetsPage.tsx
+++ /dev/null
@@ -1,9 +0,0 @@
-import PresetManager from "../components/PresetManager";
-
-export default function PresetsPage() {
-  return (
-    <section className="page-stack">
-      <PresetManager />
-    </section>
-  );
-}
diff --git a/frontend/src/pages/SessionsPage.tsx b/frontend/src/pages/SessionsPage.tsx
deleted file mode 100644
index acd4ba385681dcfe17d7971a69420449e4571c60..0000000000000000000000000000000000000000
--- a/frontend/src/pages/SessionsPage.tsx
+++ /dev/null
@@ -1,19 +0,0 @@
-import { useState } from "react";
-import UploadForm from "../components/UploadForm";
-import SessionDetails from "../components/SessionDetails";
-import SessionList from "../components/SessionList";
-import type { Session, SessionSummary } from "../types/session";
-
-export default function SessionsPage() {
-  const [selectedSession, setSelectedSession] = useState<SessionSummary | null>(null);
-
-  return (
-    <div className="grid">
-      <UploadForm onSuccess={setSelectedSession} />
-      <SessionList selectedId={selectedSession?.id} onSelect={setSelectedSession} />
-      <SessionDetails session={selectedSession} />
-    </div>
-  );
-}
diff --git a/frontend/src/services/api.ts b/frontend/src/services/api.ts
deleted file mode 100644
index 8c0073842326426c2a09dec7e4b175889e0d3b97..0000000000000000000000000000000000000000
--- a/frontend/src/services/api.ts
+++ /dev/null
@@ -1,113 +0,0 @@
-import axios from "axios";
-import type {
- CreatePresetPayload,
- CreateSessionPayload,
- Session,
- SessionStatusResponse,
- SessionSummary,
- StandardsPreset,
- UpdatePresetPayload,
-} from "../types/session";
-import type { DiagnosticsEvent, DiagnosticsTopology } from "../types/diagnostics";
-
-const API_BASE_URL = import.meta.env.VITE_API_BASE_URL ?? "http://localhost:8000/api";
-
-export async function uploadSession(payload: CreateSessionPayload): Promise<Session> {
- const formData = new FormData();
- formData.append("name", payload.name);
- formData.append("target_standard", payload.target_standard);
- formData.append("destination_standard", payload.destination_standard);
- formData.append("source_doc", payload.sourceFile);
- (payload.standardsFiles ?? []).forEach((file) => {
- formData.append("standards_pdfs", file);
- });
- if (payload.standardsPresetId) {
- formData.append("standards_preset_id", payload.standardsPresetId);
- }
- if (payload.metadata) {
- formData.append("metadata", JSON.stringify(payload.metadata));
- }
-
- const response = await axios.post(`${API_BASE_URL}/sessions`, formData, {
- headers: { "Content-Type": "multipart/form-data" },
- });
- return response.data;
-}
-
-export async function fetchSessions(): Promise<SessionSummary[]> {
- const response = await axios.get(`${API_BASE_URL}/sessions`);
- return response.data;
-}
-
-export async function fetchSessionById(id: string): Promise<Session> {
- const response = await axios.get(`${API_BASE_URL}/sessions/${id}`);
- return response.data;
-}
-
-export async function fetchSessionStatus(id: string): Promise<SessionStatusResponse> {
- const response = await axios.get(`${API_BASE_URL}/sessions/${id}/status`);
- return response.data;
-}
-
-export async function fetchPresets(): Promise<StandardsPreset[]> {
- const response = await axios.get(`${API_BASE_URL}/presets`);
- return response.data;
-}
-
-export async function fetchPresetById(id: string): Promise<StandardsPreset> {
- const response = await axios.get(`${API_BASE_URL}/presets/${id}`);
- return response.data;
-}
-
-export async function createPreset(payload: CreatePresetPayload): Promise<StandardsPreset> {
- const formData = new FormData();
- formData.append("name", payload.name);
- if (payload.description) {
- formData.append("description", payload.description);
- }
- payload.files.forEach((file) => {
- formData.append("standards_pdfs", file);
- });
-
- const response = await axios.post(`${API_BASE_URL}/presets`, formData, {
- headers: { "Content-Type": "multipart/form-data" },
- });
- return response.data;
-}
-
-export async function addPresetDocuments(id: string, files: File[]): Promise<StandardsPreset> {
- const formData = new FormData();
- files.forEach((file) => formData.append("standards_pdfs", file));
- const response = await axios.post(`${API_BASE_URL}/presets/${id}/documents`, formData, {
- headers: { "Content-Type": "multipart/form-data" },
- });
- return response.data;
-}
-
-export async function updatePreset(id: string, payload: UpdatePresetPayload): Promise<StandardsPreset> {
- const response = await axios.patch(`${API_BASE_URL}/presets/${id}`, payload);
- return response.data;
-}
-
-export async function deletePreset(id: string): Promise<void> {
- await axios.delete(`${API_BASE_URL}/presets/${id}`);
-}
-
-export async function removePresetDocument(id: string, documentPath: string): Promise<StandardsPreset> {
- const response = await axios.delete(`${API_BASE_URL}/presets/${id}/documents`, {
- params: { document: documentPath },
- });
- return response.data;
-}
-
-export async function fetchDiagnosticsTopology(): Promise<DiagnosticsTopology> {
- const response = await axios.get(`${API_BASE_URL}/diagnostics/topology`);
- return response.data;
-}
-
-export async function fetchDiagnosticsEvents(limit = 50): Promise<DiagnosticsEvent[]> {
- const response = await axios.get<{ events: DiagnosticsEvent[] }>(`${API_BASE_URL}/diagnostics/events`, {
- params: { limit },
- });
- return response.data.events;
-}
diff --git a/frontend/src/styles.css b/frontend/src/styles.css
index 90292e28de299e5c9bb606824eae7bc92161afab..e90c6a579486e5a577cca8077963f674011844fe 100644
--- a/frontend/src/styles.css
+++ b/frontend/src/styles.css
@@ -1,643 +1,15 @@
-:root {
- color-scheme: light;
- font-family: "Inter", system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", sans-serif;
- line-height: 1.5;
- font-weight: 400;
- color: #0f172a;
- background-color: #f8fafc;
-}
-
* {
box-sizing: border-box;
}
body {
margin: 0;
+ background: #000000;
}
-a {
- color: inherit;
-}
-
-button {
- font: inherit;
-}
-
-.app-shell {
- min-height: 100vh;
- display: flex;
- flex-direction: column;
- padding: 1.5rem;
- gap: 1.5rem;
-}
-
-header {
- display: flex;
- flex-direction: column;
- gap: 0.5rem;
-}
-
-.primary-nav {
- display: flex;
- gap: 0.75rem;
- margin-top: 0.5rem;
-}
-
-.primary-nav a {
- color: #1d4ed8;
- padding: 0.4rem 0.75rem;
- border-radius: 999px;
- border: 1px solid transparent;
- text-decoration: none;
- transition: background 0.2s, color 0.2s, border-color 0.2s;
- font-weight: 600;
- font-size: 0.9rem;
-}
-
-.primary-nav a:hover {
- border-color: #bfdbfe;
- background: #e0f2fe;
-}
-
-.primary-nav a.active {
- background: #1d4ed8;
- color: #ffffff;
-}
-
-main {
- flex: 1;
-}
-
-.grid {
- display: grid;
- grid-template-columns: minmax(0, 1.2fr) minmax(0, 1fr);
- gap: 1.5rem;
-}
-
-.card {
- background: #ffffff;
- padding: 1.25rem;
- border-radius: 12px;
- box-shadow: 0 2px 8px rgba(15, 23, 42, 0.08);
- display: flex;
- flex-direction: column;
- gap: 0.75rem;
-}
-
-.field {
- display: flex;
- flex-direction: column;
- gap: 0.25rem;
- font-weight: 600;
- color: #0f172a;
-}
-
-.field input {
- padding: 0.5rem 0.75rem;
- border-radius: 8px;
- border: 1px solid #cbd5f5;
- font: inherit;
-}
-
-.field select {
- padding: 0.5rem 0.75rem;
- border-radius: 8px;
- border: 1px solid #cbd5f5;
- font: inherit;
- background: #ffffff;
-}
-
-button[type="submit"] {
- background: #1d4ed8;
- color: #ffffff;
- border: none;
- padding: 0.6rem 1rem;
- border-radius: 8px;
- cursor: pointer;
- transition: background 0.2s;
-}
-
-button[type="submit"]:disabled {
- background: #93c5fd;
- cursor: not-allowed;
-}
-
-.error {
- color: #b91c1c;
-}
-
-.success {
- color: #047857;
-}
-
-.muted {
- color: #64748b;
- font-size: 0.875rem;
-}
-
-.progress-block {
- display: flex;
- flex-direction: column;
- gap: 0.5rem;
-}
-
-.progress-bar {
- position: relative;
- width: 100%;
- height: 6px;
- background: #dbeafe;
- border-radius: 999px;
- overflow: hidden;
-}
-
-.progress-bar-fill {
- position: absolute;
- inset: 0;
- background: linear-gradient(90deg, #1d4ed8, #38bdf8);
- animation: progress-pulse 1.2s linear infinite;
-}
-
-@keyframes progress-pulse {
- 0% {
- transform: translateX(-50%);
- }
- 50% {
- transform: translateX(0%);
- }
- 100% {
- transform: translateX(100%);
- }
-}
-
-.notice {
- padding: 0.75rem;
- border-radius: 8px;
- border: 1px solid transparent;
-}
-
-.notice.success {
- background: #ecfdf5;
- border-color: #bbf7d0;
- color: #036949;
-}
-
-.log-panel {
- display: flex;
- flex-direction: column;
- gap: 0.5rem;
+#root,
+.black-screen {
+ width: 100vw;
+ height: 100vh;
+ background: #000000;
}
-
-.log-panel ul {
- margin: 0;
- padding-left: 1rem;
- display: flex;
- flex-direction: column;
- gap: 0.35rem;
-}
-
-.log-panel li {
- font-size: 0.9rem;
- color: #0f172a;
-}
-
-.section-card {
- margin-top: 1.25rem;
- padding-top: 1rem;
- border-top: 1px solid #e2e8f0;
- display: flex;
- flex-direction: column;
- gap: 0.6rem;
-}
-
-.section-card h3 {
- margin: 0;
- font-size: 1rem;
- color: #1e293b;
-}
-
-.section-list {
- list-style: disc;
- margin: 0;
- padding-left: 1.25rem;
- display: flex;
- flex-direction: column;
- gap: 0.5rem;
-}
-
-.section-list li {
- color: #0f172a;
-}
-
-.diff-block {
- display: flex;
- flex-direction: column;
- gap: 0.35rem;
- margin-top: 0.35rem;
- padding-left: 0.5rem;
- border-left: 2px solid #cbd5f5;
-}
-
-.field small {
- font-weight: 400;
- color: #64748b;
-}
-
-.session-list {
- list-style: none;
- margin: 0;
- padding: 0;
- display: flex;
- flex-direction: column;
- gap: 0.75rem;
-}
-
-.session-list li {
- display: flex;
- justify-content: space-between;
- align-items: center;
- padding: 0.75rem;
- border-radius: 10px;
- border: 1px solid transparent;
- cursor: pointer;
- transition: border-color 0.2s, background 0.2s;
-}
-
-.session-list li:hover {
- border-color: #bfdbfe;
-}
-
-.session-list li.selected {
- border-color: #1d4ed8;
- background: #eff6ff;
-}
-
-.status {
- text-transform: uppercase;
- font-weight: 600;
- font-size: 0.75rem;
-}
-
-.status-created,
-.status-processing {
- color: #1d4ed8;
-}
-
-.status-review {
- color: #0f766e;
-}
-
-.status-completed {
- color: #16a34a;
-}
-
-.status-failed {
- color: #b91c1c;
-}
-
-.details-grid {
- display: grid;
- grid-template-columns: repeat(auto-fit, minmax(160px, 1fr));
- gap: 0.75rem;
-}
-
-.details-grid dt {
- font-size: 0.75rem;
- text-transform: uppercase;
- color: #475569;
-}
-
-.details-grid dd {
- margin: 0;
- font-weight: 600;
-}
-
-footer {
- text-align: center;
- color: #64748b;
- font-size: 0.875rem;
-}
-
-@media (max-width: 960px) {
- .grid {
- grid-template-columns: 1fr;
- }
-}
-.actions-row {
- display: flex;
- justify-content: flex-end;
-}
-
-.card-header {
- display: flex;
- align-items: center;
- justify-content: space-between;
- gap: 1rem;
- margin-bottom: 1rem;
-}
-
-.card-header h2 {
- margin: 0;
-}
-
-.header-actions {
- display: flex;
- align-items: center;
- gap: 0.75rem;
-}
-
-.text-small {
- font-size: 0.875rem;
-}
-
-.ghost-button {
- background: transparent;
- border: 1px solid #cbd5f5;
- color: #1d4ed8;
- padding: 0.4rem 0.8rem;
- border-radius: 8px;
- cursor: pointer;
- transition: background 0.2s, color 0.2s;
-}
-
-.ghost-button:hover:not(:disabled) {
- background: #e0f2fe;
-}
-
-.ghost-button:disabled {
- color: #94a3b8;
- cursor: not-allowed;
-}
-
-.subsection {
- margin-top: 0.75rem;
- display: flex;
- flex-direction: column;
- gap: 0.5rem;
-}
-
-.stacked {
- display: flex;
- flex-direction: column;
- gap: 0.75rem;
-}
-
-.preset-list ul {
- list-style: none;
- padding: 0;
- margin: 0;
- display: flex;
- flex-direction: column;
- gap: 0.75rem;
-}
-
-.preset-list li {
- border: 1px solid #e2e8f0;
- border-radius: 10px;
- padding: 0.75rem;
- display: flex;
- flex-direction: column;
- gap: 0.35rem;
-}
-
-.preset-docs {
- font-size: 0.8rem;
-}
-
-.preset-actions {
- display: flex;
- gap: 0.5rem;
- margin-top: 0.5rem;
-}
-
-.ghost-button.danger {
- border-color: #fca5a5;
- color: #b91c1c;
-}
-
-.ghost-button.danger:hover:not(:disabled) {
- background: #fee2e2;
-}
-
-.ghost-button.danger:disabled {
- color: #fca5a5;
- border-color: #fca5a5;
-}
-
-.edit-header {
- display: flex;
- align-items: center;
- justify-content: space-between;
- gap: 1rem;
- flex-wrap: wrap;
-}
-
-.document-list {
- list-style: none;
- padding: 0;
- margin: 0;
- display: flex;
- flex-direction: column;
- gap: 0.5rem;
-}
-
-.document-item {
- display: flex;
- align-items: center;
- justify-content: space-between;
- gap: 0.75rem;
- padding: 0.6rem 0.75rem;
- border: 1px solid #e2e8f0;
- border-radius: 8px;
-}
-
-.preset-status-card {
- margin-top: 1rem;
- padding: 0.75rem;
- border: 1px solid #e2e8f0;
- border-radius: 10px;
- background: #f8fafc;
-}
-
-
-.page-stack {
- display: flex;
- flex-direction: column;
- gap: 1.5rem;
-}
-
-.preset-header {
- display: flex;
- justify-content: space-between;
- gap: 1rem;
-}
-
-.preset-status {
- display: flex;
- align-items: center;
- gap: 0.5rem;
- font-size: 0.85rem;
-}
-
-.status-ready {
- color: #16a34a;
- font-weight: 600;
-}
-
-.status-processing {
- color: #1d4ed8;
- font-weight: 600;
-}
-
-.status-failed {
- color: #b91c1c;
- font-weight: 600;
-}
-
-.linear-progress {
- width: 100%;
- height: 6px;
- border-radius: 999px;
- background: #dbeafe;
- overflow: hidden;
-}
-
-.linear-progress-fill {
- height: 100%;
- background: linear-gradient(90deg, #1d4ed8, #38bdf8);
- transition: width 0.3s ease;
-}
-
-
-/* Observatory */
-.observatory-page {
- display: flex;
- flex-direction: column;
- gap: 1.5rem;
-}
-
-.observatory-header h2 {
- margin: 0;
-}
-
-.observatory-header p {
- margin: 0.25rem 0 0;
- color: #475569;
-}
-
-.observatory-content {
- display: flex;
- flex-wrap: wrap;
- gap: 1.5rem;
-}
-
-.observatory-graph-card,
-.observatory-feed-card {
- background: #ffffff;
- border-radius: 12px;
- padding: 1.25rem;
- box-shadow: 0 10px 25px rgba(15, 23, 42, 0.08);
- flex: 1 1 320px;
- min-height: 320px;
-}
-
-.observatory-graph-card {
- flex: 2 1 520px;
- display: flex;
- flex-direction: column;
-}
-
-.observatory-canvas {
- margin-top: 1rem;
- width: 100%;
- height: 100%;
- min-height: 320px;
- border-radius: 12px;
- background: radial-gradient(circle at center, #dbeafe, #e2e8f0 70%);
- padding: 1rem;
-}
-
-.observatory-edge {
- stroke: rgba(30, 64, 175, 0.35);
- stroke-width: 0.8;
- stroke-linecap: round;
-}
-
-.observatory-node circle {
- transform-origin: center;
- transform-box: fill-box;
- fill: #1d4ed8;
- opacity: 0.7;
- transition: r 0.2s ease, opacity 0.2s ease;
-}
-
-.observatory-node text {
- font-size: 3px;
- fill: #0f172a;
- pointer-events: none;
-}
-
-.observatory-node.group-storage circle {
- fill: #0ea5e9;
-}
-
-.observatory-node.group-service circle {
- fill: #16a34a;
-}
-
-.observatory-node.group-external circle {
- fill: #f97316;
-}
-
-.observatory-node circle.pulse {
- animation: observatory-pulse 1.2s ease-in-out infinite;
- opacity: 0.95;
-}
-
-@keyframes observatory-pulse {
- 0%, 100% {
- transform: scale(1);
- opacity: 0.95;
- }
- 50% {
- transform: scale(1.3);
- opacity: 0.6;
- }
-}
-
-.observatory-feed-card {
- display: flex;
- flex-direction: column;
-}
-
-.observatory-feed {
- list-style: none;
- padding: 0;
- margin: 1rem 0 0;
- display: flex;
- flex-direction: column;
- gap: 0.75rem;
- max-height: 360px;
- overflow-y: auto;
-}
-
-.observatory-feed li {
- padding: 0.75rem;
- border-radius: 10px;
- background: #f8fafc;
- border: 1px solid #e2e8f0;
-}
-
-.observatory-feed .feed-row {
- display: flex;
- align-items: center;
- justify-content: space-between;
- margin-bottom: 0.35rem;
-}
-
-.observatory-feed .feed-row strong {
- color: #0f172a;
-}
-
-.observatory-feed .feed-row .muted {
- font-size: 0.85rem;
-}
\ No newline at end of file
diff --git a/frontend/src/types/diagnostics.ts b/frontend/src/types/diagnostics.ts
deleted file mode 100644
index f22b7cfe798d46ec903bf6e1cb418a4029d7fefe..0000000000000000000000000000000000000000
--- a/frontend/src/types/diagnostics.ts
+++ /dev/null
@@ -1,27 +0,0 @@
-export interface DiagnosticsNode {
- id: string;
- label: string;
- group: "storage" | "service" | "external" | string;
- position: { x: number; y: number };
- last_event_at?: string | null;
-}
-
-export interface DiagnosticsEdge {
- id: string;
- source: string;
- target: string;
-}
-
-export interface DiagnosticsTopology {
- nodes: DiagnosticsNode[];
- edges: DiagnosticsEdge[];
-}
-
-export interface DiagnosticsEvent {
- id: string;
- timestamp: string;
- event_type: string;
- message: string;
- node_id?: string | null;
-  metadata?: Record<string, unknown>;
-}
diff --git a/frontend/src/types/session.ts b/frontend/src/types/session.ts
deleted file mode 100644
index f228aea4b2cea134ea59d9ad1987f55664677cbb..0000000000000000000000000000000000000000
--- a/frontend/src/types/session.ts
+++ /dev/null
@@ -1,59 +0,0 @@
-export interface SessionSummary {
- id: string;
- name: string;
- status: string;
- created_at: string;
- updated_at: string;
- source_doc: string;
- target_standard: string;
- destination_standard: string;
- standards_count: number;
- last_error?: string | null;
-}
-
-export interface Session extends SessionSummary {
- standards_docs: string[];
- logs: string[];
-  metadata: Record<string, unknown>;
-}
-
-export interface CreateSessionPayload {
- name: string;
- target_standard: string;
- destination_standard: string;
-  metadata?: Record<string, unknown>;
- sourceFile: File;
- standardsFiles?: File[];
- standardsPresetId?: string | null;
-}
-
-export interface SessionStatusResponse {
- id: string;
- status: string;
- updated_at: string;
-}
-
-export interface StandardsPreset {
- id: string;
- name: string;
- description?: string | null;
- documents: string[];
- document_count: number;
- status: string;
- processed_count: number;
- total_count: number;
- last_error?: string | null;
- created_at: string;
- updated_at: string;
-}
-
-export interface CreatePresetPayload {
- name: string;
- description?: string;
- files: File[];
-}
-
-export interface UpdatePresetPayload {
- name?: string;
- description?: string | null;
-}
diff --git a/infra/README.md b/infra/README.md
deleted file mode 100644
index 5ab4715f2496cf08f1d17e7fab0cd154f4c37d27..0000000000000000000000000000000000000000
--- a/infra/README.md
+++ /dev/null
@@ -1,10 +0,0 @@
-# Infrastructure Assets
-
-Deployment and operational tooling for local, staging, and production environments.
-
-## Structure
-
-- `docker/` - Container definitions, compose files, and local runtime configs.
-- `configs/` - Shared configuration overlays (env templates, secrets examples).
-- `observability/` - Logging, metrics, and tracing configuration.
-- `ci/` - Continuous integration workflows and automation manifests.
diff --git a/notes/comments/Initial Start Up.docx b/notes/comments/Initial Start Up.docx
deleted file mode 100644
index 8afa16adb239c62509a7a02a26c2e9d532f3eb55..0000000000000000000000000000000000000000
Binary files a/notes/comments/Initial Start Up.docx and /dev/null differ
diff --git a/notes/comments/~$itial Start Up.docx b/notes/comments/~$itial Start Up.docx
deleted file mode 100644
index cbdcb002a610ca9a222c7472afca89ce6d961ec2..0000000000000000000000000000000000000000
Binary files a/notes/comments/~$itial Start Up.docx and /dev/null differ
diff --git a/scripts/README.md b/scripts/README.md
deleted file mode 100644
index cb4f3b9d6aac5c49244edf2b2f933c7bc4f8f15f..0000000000000000000000000000000000000000
--- a/scripts/README.md
+++ /dev/null
@@ -1,9 +0,0 @@
-# Scripts
-
-Automation helpers for development, QA, and operations.
-
-## Structure
-
-- `dev/` - Local environment bootstrap, data loaders, and convenience tasks.
-- `tools/` - Linting, formatting, and code generation utilities.
-- `ops/` - Deployment, monitoring, and maintenance scripts.
diff --git a/server/.env.example b/server/.env.example
deleted file mode 100644
index 1dbfbbbab4b860fe3c9ca2fc4fdaa8c38c7abffc..0000000000000000000000000000000000000000
--- a/server/.env.example
+++ /dev/null
@@ -1,12 +0,0 @@
-RIGHTCODES_APP_NAME=RightCodes Server
-RIGHTCODES_DEBUG=true
-RIGHTCODES_API_PREFIX=/api
-RIGHTCODES_CORS_ORIGINS=["http://localhost:5173"]
-RIGHTCODES_STORAGE_DIR=./storage
-RIGHTCODES_OPENAI_API_KEY=sk-your-openai-key
-RIGHTCODES_OPENAI_API_BASE=https://api.openai.com/v1
-RIGHTCODES_OPENAI_MODEL_EXTRACT=gpt-4.1-mini
-RIGHTCODES_OPENAI_MODEL_MAPPING=gpt-4.1-mini
-RIGHTCODES_OPENAI_MODEL_REWRITE=gpt-4o-mini
-RIGHTCODES_OPENAI_MODEL_VALIDATE=gpt-4.1-mini
-RIGHTCODES_OPENAI_MODEL_EMBED=text-embedding-3-large
diff --git a/server/README.md b/server/README.md
deleted file mode 100644
index 9b7e3b82987b6503b8800596d5f5cba039692c8e..0000000000000000000000000000000000000000
--- a/server/README.md
+++ /dev/null
@@ -1,18 +0,0 @@
-# Server Module
-
-FastAPI service layer coordinating file ingestion, session lifecycle, and agent pipelines.
-
-## Structure
-
-- `app/api/routes/` - REST and WebSocket endpoints for uploads, progress, and exports.
-- `app/api/schemas/` - Pydantic request and response models.
-- `app/api/dependencies/` - Shared dependency injection utilities.
-- `app/core/` - Settings, logging, security, and feature flags.
-- `app/models/` - ORM entities and persistence abstractions.
-- `app/services/` - Business logic for session orchestration and diff management.
- `app/workflows/` - Pipeline job definitions and coordination logic.
-- `app/events/` - Event emitters and subscribers for audit trail and notifications.
-- `app/utils/` - Shared helpers (file adapters, serialization, doc merge).
-- `scripts/` - Operational scripts (db migrations, data seeding).
-- `tests/` - Unit and integration tests (API, workflows, persistence).
-- `config/` - Configuration templates (env files, local settings).
diff --git a/server/app/__init__.py b/server/app/__init__.py
index 89f7867bdafdde25dec9e2b5c73037154ff56826..813f14cbd7fc8e08e0e2e9924ce486f641ce54f4 100644
--- a/server/app/__init__.py
+++ b/server/app/__init__.py
@@ -1,11 +1,3 @@
-from pathlib import Path
-import sys
-
-# Ensure project root is on sys.path so shared packages (e.g., `agents`, `workers`) resolve.
-PROJECT_ROOT = Path(__file__).resolve().parents[2]
-if str(PROJECT_ROOT) not in sys.path:
- sys.path.append(str(PROJECT_ROOT))
-
from .main import create_app
__all__ = ["create_app"]
diff --git a/server/app/api/dependencies/__init__.py b/server/app/api/dependencies/__init__.py
deleted file mode 100644
index ac2e8b6221228322b3f294fdfdff700ea83b307c..0000000000000000000000000000000000000000
--- a/server/app/api/dependencies/__init__.py
+++ /dev/null
@@ -1,13 +0,0 @@
-from .services import (
- get_diagnostics_service,
- get_pipeline_orchestrator,
- get_preset_service,
- get_session_service,
-)
-
-__all__ = [
- "get_session_service",
- "get_pipeline_orchestrator",
- "get_preset_service",
- "get_diagnostics_service",
-]
diff --git a/server/app/api/dependencies/services.py b/server/app/api/dependencies/services.py
deleted file mode 100644
index a05ecfb6877f6a7bbef4248a058c2155af6013f7..0000000000000000000000000000000000000000
--- a/server/app/api/dependencies/services.py
+++ /dev/null
@@ -1,35 +0,0 @@
-from functools import lru_cache
-
-from ...services import (
- FileCache,
- PresetService,
- SessionService,
- get_diagnostics_service as _get_diagnostics_service,
-)
-from ...workflows import PipelineOrchestrator, register_default_stages
-
-
-@lru_cache
-def get_session_service() -> SessionService:
- return SessionService()
-
-
-@lru_cache
-def get_file_cache() -> FileCache:
- return FileCache()
-
-
-@lru_cache
-def get_preset_service() -> PresetService:
- return PresetService()
-
-
-@lru_cache
-def get_pipeline_orchestrator() -> PipelineOrchestrator:
- orchestrator = PipelineOrchestrator(get_session_service())
- return register_default_stages(orchestrator, file_cache=get_file_cache())
-
-
-@lru_cache
-def get_diagnostics_service():
- return _get_diagnostics_service()
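These zero-argument providers lean on `functools.lru_cache` to behave as process-wide singletons under `Depends`, while `app.dependency_overrides` still lets tests swap them out. A self-contained sketch with a stand-in service (the real classes live in the modules below; names here are illustrative):

```python
from functools import lru_cache

from fastapi import Depends, FastAPI
from fastapi.testclient import TestClient


class SessionService:
    """Stand-in for the real service defined elsewhere in this diff."""

    def list_sessions(self) -> list[str]:
        return []


@lru_cache
def get_session_service() -> SessionService:
    # Zero-argument provider: lru_cache makes this a process-wide singleton.
    return SessionService()


app = FastAPI()


@app.get("/sessions")
def list_sessions(service: SessionService = Depends(get_session_service)) -> list[str]:
    return service.list_sessions()


class FakeSessionService(SessionService):
    def list_sessions(self) -> list[str]:
        return ["fixture-session"]


# Tests swap the provider without touching the lru_cache at all.
app.dependency_overrides[get_session_service] = FakeSessionService
print(TestClient(app).get("/sessions").json())  # ["fixture-session"]
```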
diff --git a/server/app/api/routes/__init__.py b/server/app/api/routes/__init__.py
deleted file mode 100644
index 26284491b7fe8adffd5f50ef117c7c9ca224a0af..0000000000000000000000000000000000000000
--- a/server/app/api/routes/__init__.py
+++ /dev/null
@@ -1,14 +0,0 @@
-from fastapi import APIRouter
-
-from .diagnostics import router as diagnostics_router
-from .health import router as health_router
-from .presets import router as presets_router
-from .sessions import router as sessions_router
-
-api_router = APIRouter()
-api_router.include_router(health_router, tags=["system"])
-api_router.include_router(sessions_router, prefix="/sessions", tags=["sessions"])
-api_router.include_router(presets_router, prefix="/presets", tags=["presets"])
-api_router.include_router(diagnostics_router, prefix="/diagnostics", tags=["diagnostics"])
-
-__all__ = ["api_router"]
diff --git a/server/app/api/routes/diagnostics.py b/server/app/api/routes/diagnostics.py
deleted file mode 100644
index fdd3559954ed5873a8d2004b938c935630921e09..0000000000000000000000000000000000000000
--- a/server/app/api/routes/diagnostics.py
+++ /dev/null
@@ -1,40 +0,0 @@
-from __future__ import annotations
-
-from datetime import datetime
-from typing import Optional
-
-from fastapi import APIRouter, Depends, HTTPException, Query
-
-from ...services.diagnostics_service import DiagnosticsService
-from ..dependencies import get_diagnostics_service
-
-router = APIRouter()
-
-
-@router.get("/topology")
-def get_topology(
- diagnostics: DiagnosticsService = Depends(get_diagnostics_service),
-) -> dict[str, object]:
- """Return the static system topology with latest node activity."""
- return diagnostics.get_topology()
-
-
-@router.get("/events")
-def get_events(
- diagnostics: DiagnosticsService = Depends(get_diagnostics_service),
- limit: int = Query(50, ge=1, le=200),
- since: Optional[str] = Query(None, description="ISO timestamp to filter events"),
-) -> dict[str, object]:
- """Return recent diagnostic events."""
- # Validate since value early to provide clearer error messages.
- since_value: Optional[str] = None
- if since:
- try:
- datetime.fromisoformat(since)
- except ValueError as exc: # pragma: no cover - defensive
- raise HTTPException(
- status_code=400, detail="Parameter 'since' must be a valid ISO timestamp."
- ) from exc
- since_value = since
- events = diagnostics.get_events(limit=limit, since=since_value)
- return {"events": events}
diff --git a/server/app/api/routes/health.py b/server/app/api/routes/health.py
deleted file mode 100644
index 4e18d0ccfb524ff01747dc94fa8b369914c8cea2..0000000000000000000000000000000000000000
--- a/server/app/api/routes/health.py
+++ /dev/null
@@ -1,8 +0,0 @@
-from fastapi import APIRouter
-
-router = APIRouter()
-
-
-@router.get("/health", summary="Health check")
-async def healthcheck() -> dict[str, str]:
- return {"status": "ok"}
diff --git a/server/app/api/routes/presets.py b/server/app/api/routes/presets.py
deleted file mode 100644
index 5d9c76461496cb3808a6ddae07f24eae38aaa2f1..0000000000000000000000000000000000000000
--- a/server/app/api/routes/presets.py
+++ /dev/null
@@ -1,138 +0,0 @@
-from __future__ import annotations
-
-from typing import List, Optional
-
-from fastapi import (
- APIRouter,
- BackgroundTasks,
- Depends,
- File,
- Form,
- HTTPException,
- Query,
- Response,
- UploadFile,
- status,
-)
-
-from ...api.schemas import PresetResponse, PresetUpdateRequest, build_preset_response
-from ...services import FileCache, PresetService
-from ...utils import save_upload
-from ..dependencies.services import get_file_cache, get_preset_service
-
-router = APIRouter()
-
-
-@router.get("", response_model=list[PresetResponse])
-async def list_presets(preset_service: PresetService = Depends(get_preset_service)) -> list[PresetResponse]:
- return [build_preset_response(preset) for preset in preset_service.list_presets()]
-
-
-@router.get("/{preset_id}", response_model=PresetResponse)
-async def get_preset(
- preset_id: str,
- preset_service: PresetService = Depends(get_preset_service),
-) -> PresetResponse:
- preset = preset_service.get_preset(preset_id)
- if not preset:
- raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Preset not found")
- return build_preset_response(preset)
-
-
-@router.post("", response_model=PresetResponse, status_code=status.HTTP_201_CREATED)
-async def create_preset(
- background_tasks: BackgroundTasks,
- name: str = Form(...),
- description: Optional[str] = Form(None),
- standards_pdfs: List[UploadFile] = File(...),
- preset_service: PresetService = Depends(get_preset_service),
- file_cache: FileCache = Depends(get_file_cache),
-) -> PresetResponse:
- if not standards_pdfs:
- raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="At least one PDF is required.")
-
- stored_paths = []
- for upload in standards_pdfs:
- stored_paths.append(save_upload(upload.filename, upload.file, subdir="presets/originals"))
-
- try:
- preset = preset_service.start_preset(
- name=name,
- description=description,
- documents=stored_paths,
- )
- except ValueError as exc:
- raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(exc)) from exc
-
- background_tasks.add_task(preset_service.process_preset, str(preset.id), file_cache)
-
- return build_preset_response(preset)
-
-
-@router.post("/{preset_id}/documents", response_model=PresetResponse)
-async def add_preset_documents(
- preset_id: str,
- background_tasks: BackgroundTasks,
- standards_pdfs: List[UploadFile] = File(...),
- preset_service: PresetService = Depends(get_preset_service),
- file_cache: FileCache = Depends(get_file_cache),
-) -> PresetResponse:
- if not standards_pdfs:
- raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="At least one PDF is required.")
-
- stored_paths = [
- save_upload(upload.filename, upload.file, subdir="presets/originals")
- for upload in standards_pdfs
- ]
-
- preset = preset_service.add_documents(preset_id, stored_paths)
- if not preset:
- raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Preset not found")
-
- if stored_paths:
- background_tasks.add_task(preset_service.process_preset, str(preset.id), file_cache)
-
- return build_preset_response(preset)
-
-
-@router.patch("/{preset_id}", response_model=PresetResponse)
-async def update_preset(
- preset_id: str,
- payload: PresetUpdateRequest,
- preset_service: PresetService = Depends(get_preset_service),
-) -> PresetResponse:
- preset = preset_service.update_preset(
- preset_id,
- name=payload.name,
- description=payload.description,
- )
- if not preset:
- raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Preset not found")
- return build_preset_response(preset)
-
-
-@router.delete("/{preset_id}/documents", response_model=PresetResponse)
-async def remove_preset_document(
- preset_id: str,
- background_tasks: BackgroundTasks,
- document: str = Query(..., description="Path of the document to remove"),
- preset_service: PresetService = Depends(get_preset_service),
- file_cache: FileCache = Depends(get_file_cache),
-) -> PresetResponse:
- preset = preset_service.remove_document(preset_id, document)
- if not preset:
- raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Preset not found")
- if preset.status == "processing" and preset.total_count > 0:
- background_tasks.add_task(preset_service.process_preset, str(preset.id), file_cache)
- return build_preset_response(preset)
-
-
-@router.delete("/{preset_id}", status_code=status.HTTP_204_NO_CONTENT)
-async def delete_preset(
- preset_id: str,
- preset_service: PresetService = Depends(get_preset_service),
-) -> Response:
- deleted = preset_service.delete_preset(preset_id)
- if not deleted:
- raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Preset not found")
- return Response(status_code=status.HTTP_204_NO_CONTENT)
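Presets are created from multipart form data and parsed in a `BackgroundTasks` job, so a client typically uploads and then polls. A rough `httpx` sketch (base URL, filenames, and preset name are placeholders; `/api` assumes the default `api_prefix`):

```python
import time

import httpx

BASE = "http://localhost:8000/api"  # assumes the default /api prefix

files = [
    ("standards_pdfs", ("asme-b31-3.pdf", open("asme-b31-3.pdf", "rb"), "application/pdf")),
    ("standards_pdfs", ("api-570.pdf", open("api-570.pdf", "rb"), "application/pdf")),
]
resp = httpx.post(f"{BASE}/presets", data={"name": "Piping pack"}, files=files)
resp.raise_for_status()
preset = resp.json()

# Parsing runs in the background, so poll until the preset leaves "processing".
while preset["status"] == "processing":
    time.sleep(1)
    preset = httpx.get(f"{BASE}/presets/{preset['id']}").json()

print(f"{preset['processed_count']}/{preset['total_count']} documents parsed")
```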
diff --git a/server/app/api/routes/sessions.py b/server/app/api/routes/sessions.py
deleted file mode 100644
index ad65bd67847b6d32b7c9ba281c1cee5bd45c6ce4..0000000000000000000000000000000000000000
--- a/server/app/api/routes/sessions.py
+++ /dev/null
@@ -1,178 +0,0 @@
-from __future__ import annotations
-
-import json
-from pathlib import Path
-from typing import List, Optional
-
-from fastapi import (
- APIRouter,
- BackgroundTasks,
- Depends,
- File,
- Form,
- HTTPException,
- UploadFile,
- status,
-)
-from fastapi.responses import FileResponse
-
-from ...api.schemas import (
- SessionResponse,
- SessionStatusResponse,
- SessionSummaryResponse,
- build_session_response,
- build_session_summary,
-)
-from ...services import PresetService, SessionService
-from ...utils import save_upload
-from ...utils.paths import (
- canonical_storage_path,
- resolve_storage_path,
- to_storage_relative,
-)
-from ...workflows import PipelineOrchestrator
-from ..dependencies import get_pipeline_orchestrator, get_preset_service, get_session_service
-
-router = APIRouter()
-
-
-@router.post("", response_model=SessionResponse, status_code=status.HTTP_201_CREATED)
-async def create_session(
- background_tasks: BackgroundTasks,
- name: str = Form(...),
- target_standard: str = Form(...),
- destination_standard: str = Form(...),
- metadata: Optional[str] = Form(None),
- source_doc: UploadFile = File(...),
- standards_pdfs: Optional[List[UploadFile]] = File(None),
- standards_preset_id: Optional[str] = Form(None),
- session_service: SessionService = Depends(get_session_service),
- preset_service: PresetService = Depends(get_preset_service),
- orchestrator: PipelineOrchestrator = Depends(get_pipeline_orchestrator),
-) -> SessionResponse:
- try:
- extra_metadata = json.loads(metadata) if metadata else {}
- except json.JSONDecodeError as exc:
- raise HTTPException(
- status_code=status.HTTP_400_BAD_REQUEST,
- detail="Field 'metadata' must be valid JSON.",
- ) from exc
- report_path = resolve_storage_path(save_upload(source_doc.filename, source_doc.file))
- standards_paths: list[Path] = []
- if standards_pdfs:
- for upload in standards_pdfs:
- uploaded_path = save_upload(upload.filename, upload.file)
- standards_paths.append(resolve_storage_path(uploaded_path))
-
- preset_payloads: list[dict[str, object]] = []
- preset_doc_paths: list[str] = []
- selected_preset = None
- if standards_preset_id:
- preset = preset_service.get_preset(standards_preset_id)
- if not preset:
- raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Preset not found")
- selected_preset = preset
- preset_payloads = [dict(payload) for payload in preset.parsed_payloads]
- normalised_payloads: list[dict[str, object]] = []
- for payload in preset_payloads:
- normalised = dict(payload)
- path_value = normalised.get("path")
- if path_value:
- normalised["path"] = to_storage_relative(path_value)
- normalised_payloads.append(normalised)
- preset_payloads = normalised_payloads
- preset_doc_paths = [to_storage_relative(path) for path in preset.standards_docs]
- existing_paths = {canonical_storage_path(path) for path in standards_paths}
- for doc_path in preset.standards_docs:
- resolved_doc = resolve_storage_path(doc_path)
- doc_key = canonical_storage_path(resolved_doc)
- if doc_key not in existing_paths:
- standards_paths.append(resolved_doc)
- existing_paths.add(doc_key)
-
- if not standards_paths:
- raise HTTPException(
- status_code=status.HTTP_400_BAD_REQUEST,
- detail="Provide at least one standards PDF or choose a preset.",
- )
-
- session_metadata = dict(extra_metadata)
- if preset_payloads:
- session_metadata["preset_standards_payload"] = preset_payloads
- session_metadata["preset_standards_doc_paths"] = preset_doc_paths
- session_metadata.setdefault("presets", []).append(
- {
- "id": str(selected_preset.id) if selected_preset else standards_preset_id,
- "name": selected_preset.name if selected_preset else None,
- "documents": preset_doc_paths,
- }
- )
-
- session = session_service.create_session(
- name=name,
- source_doc=report_path,
- target_standard=target_standard,
- destination_standard=destination_standard,
- standards_docs=standards_paths,
- metadata=session_metadata,
- )
- if selected_preset:
- session.logs.append(
- f"Preset `{selected_preset.name}` applied with {len(preset_doc_paths)} document(s)."
- )
- session_service.save_session(session)
- background_tasks.add_task(orchestrator.run, session)
- return build_session_response(session)
-
-
-@router.get("", response_model=list[SessionSummaryResponse])
-async def list_sessions(
- session_service: SessionService = Depends(get_session_service),
-) -> list[SessionSummaryResponse]:
- sessions = [build_session_summary(session) for session in session_service.list_sessions()]
- return sorted(sessions, key=lambda session: session.created_at, reverse=True)
-
-
-@router.get("/{session_id}", response_model=SessionResponse)
-async def get_session(
- session_id: str,
- session_service: SessionService = Depends(get_session_service),
-) -> SessionResponse:
- session = session_service.get_session(session_id)
- if not session:
- raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Session not found")
- return build_session_response(session)
-
-
-@router.get("/{session_id}/status", response_model=SessionStatusResponse)
-async def get_session_status(
- session_id: str,
- session_service: SessionService = Depends(get_session_service),
-) -> SessionStatusResponse:
- session = session_service.get_session(session_id)
- if not session:
- raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Session not found")
- return SessionStatusResponse(
- id=session.id,
- status=session.status.value,
- updated_at=session.updated_at,
- )
-
-
-@router.get("/{session_id}/export", response_class=FileResponse)
-async def download_export(
- session_id: str,
- session_service: SessionService = Depends(get_session_service),
-) -> FileResponse:
- session = session_service.get_session(session_id)
- if not session:
- raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Session not found")
-
- manifest = session.metadata.get("export_manifest") if session.metadata else None
- if not manifest:
- raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Export not available")
-
- export_path = manifest.get("export_path")
- if not export_path or not Path(export_path).exists():
- raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Export file missing")
-
- filename = Path(export_path).name
- return FileResponse(
- export_path,
- media_type="application/vnd.openxmlformats-officedocument.wordprocessingml.document",
- filename=filename,
- )
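`create_session` mixes plain form fields with file parts, and a stored preset can stand in for (or top up) freshly uploaded standards PDFs. A hedged sketch of the request shape (URL, field values, and the preset id are placeholders):

```python
import json

import httpx

BASE = "http://localhost:8000/api"

data = {
    "name": "Q3 inspection report",
    "target_standard": "ASME B31.3",
    "destination_standard": "API 570",
    "metadata": json.dumps({"requested_by": "inspector"}),
    # Optional: reuse a stored preset instead of (or on top of) fresh PDFs.
    "standards_preset_id": "00000000-0000-0000-0000-000000000000",
}
files = {"source_doc": ("report.docx", open("report.docx", "rb"))}

resp = httpx.post(f"{BASE}/sessions", data=data, files=files)
resp.raise_for_status()
session_id = resp.json()["id"]

# The pipeline runs in the background; /status is cheap enough to poll.
status = httpx.get(f"{BASE}/sessions/{session_id}/status").json()
print(status["status"])  # created -> processing -> review/completed/failed
```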
diff --git a/server/app/api/schemas/__init__.py b/server/app/api/schemas/__init__.py
deleted file mode 100644
index f61782e5430aa09d3a3198c140d6e39152ee0c5a..0000000000000000000000000000000000000000
--- a/server/app/api/schemas/__init__.py
+++ /dev/null
@@ -1,23 +0,0 @@
-from .presets import PresetResponse, PresetUpdateRequest, build_preset_response
-from .sessions import (
- SessionCreateRequest,
- SessionResponse,
- SessionStatusResponse,
- SessionSummaryResponse,
- build_session_response,
- build_session_summary,
- session_to_response,
-)
-
-__all__ = [
- "PresetResponse",
- "PresetUpdateRequest",
- "SessionCreateRequest",
- "SessionResponse",
- "SessionSummaryResponse",
- "SessionStatusResponse",
- "build_session_response",
- "build_session_summary",
- "build_preset_response",
- "session_to_response",
-]
diff --git a/server/app/api/schemas/presets.py b/server/app/api/schemas/presets.py
deleted file mode 100644
index 57bab877fae008185b3bd95c22c0abb8679aeb5b..0000000000000000000000000000000000000000
--- a/server/app/api/schemas/presets.py
+++ /dev/null
@@ -1,46 +0,0 @@
-from __future__ import annotations
-
-from datetime import datetime
-from typing import List, Optional
-from uuid import UUID
-
-from pydantic import BaseModel, Field
-
-from ...models import StandardsPreset
-from ...utils.paths import to_storage_relative
-
-
-class PresetResponse(BaseModel):
- id: UUID
- name: str
- description: Optional[str] = None
- documents: List[str] = Field(default_factory=list)
- document_count: int
- status: str
- processed_count: int
- total_count: int
- last_error: Optional[str] = None
- created_at: datetime
- updated_at: datetime
-
-
-class PresetUpdateRequest(BaseModel):
- name: Optional[str] = None
- description: Optional[str] = None
-
-
-def build_preset_response(preset: StandardsPreset) -> PresetResponse:
- documents = [to_storage_relative(path) for path in preset.standards_docs]
- return PresetResponse(
- id=preset.id,
- name=preset.name,
- description=preset.description,
- documents=documents,
- document_count=len(documents),
- status=preset.status,
- processed_count=preset.processed_count,
- total_count=preset.total_count or len(documents),
- last_error=preset.last_error,
- created_at=preset.created_at,
- updated_at=preset.updated_at,
- )
diff --git a/server/app/api/schemas/sessions.py b/server/app/api/schemas/sessions.py
deleted file mode 100644
index b61a17540a3081600d4e5e49023bf2d9bce188ff..0000000000000000000000000000000000000000
--- a/server/app/api/schemas/sessions.py
+++ /dev/null
@@ -1,102 +0,0 @@
-from __future__ import annotations
-
-from datetime import datetime
-from typing import Any, List, Optional
-from uuid import UUID
-
-from pydantic import BaseModel, Field
-
-from ...utils.paths import to_storage_relative
-
-
-class SessionCreateRequest(BaseModel):
- name: str = Field(..., description="Human readable session name")
- target_standard: str = Field(..., description="Original standard identifier")
- destination_standard: str = Field(..., description="Target standard identifier")
- metadata: dict[str, Any] = Field(default_factory=dict)
-
-
-class SessionSummaryResponse(BaseModel):
- id: UUID
- name: str
- status: str
- created_at: datetime
- updated_at: datetime
- source_doc: str
- target_standard: str
- destination_standard: str
- standards_count: int = Field(default=0, description="Number of standards PDFs attached")
- last_error: Optional[str] = None
-
-
-class SessionResponse(BaseModel):
- id: UUID
- name: str
- status: str
- created_at: datetime
- updated_at: datetime
- source_doc: str
- target_standard: str
- destination_standard: str
- standards_count: int = Field(default=0, description="Number of standards PDFs attached")
- standards_docs: List[str] = Field(default_factory=list)
- logs: List[str] = Field(default_factory=list)
- metadata: dict[str, Any]
- last_error: Optional[str] = None
-
-
-class SessionStatusResponse(BaseModel):
- id: UUID
- status: str
- updated_at: datetime
-
-
-def session_to_response(manifest: dict[str, Any]) -> SessionResponse:
- return SessionResponse(
- id=manifest["id"],
- name=manifest["name"],
- status=manifest["status"],
- created_at=manifest["created_at"],
- updated_at=manifest["updated_at"],
- source_doc=to_storage_relative(manifest["source_doc"]),
- target_standard=manifest["target_standard"],
- destination_standard=manifest["destination_standard"],
- standards_count=len(manifest.get("standards_docs", [])),
- standards_docs=[to_storage_relative(path) for path in manifest.get("standards_docs", [])],
- logs=manifest.get("logs", []),
- metadata=manifest["metadata"],
- last_error=manifest.get("last_error"),
- )
-
-
-def build_session_response(session) -> SessionResponse:
- return SessionResponse(
- id=session.id,
- name=session.name,
- status=session.status.value,
- created_at=session.created_at,
- updated_at=session.updated_at,
- source_doc=to_storage_relative(session.source_doc),
- target_standard=session.target_standard,
- destination_standard=session.destination_standard,
- standards_count=len(session.standards_docs),
- standards_docs=[to_storage_relative(path) for path in session.standards_docs],
- logs=session.logs,
- metadata=session.metadata,
- last_error=session.last_error,
- )
-
-
-def build_session_summary(session) -> SessionSummaryResponse:
- return SessionSummaryResponse(
- id=session.id,
- name=session.name,
- status=session.status.value,
- created_at=session.created_at,
- updated_at=session.updated_at,
- source_doc=to_storage_relative(session.source_doc),
- target_standard=session.target_standard,
- destination_standard=session.destination_standard,
- standards_count=len(session.standards_docs),
- last_error=session.last_error,
- )
diff --git a/server/app/core/config.py b/server/app/core/config.py
deleted file mode 100644
index 3a56081362f0ecb2a75e735b3f343f1f05d75d22..0000000000000000000000000000000000000000
--- a/server/app/core/config.py
+++ /dev/null
@@ -1,66 +0,0 @@
-import logging
-from functools import lru_cache
-from pathlib import Path
-from typing import List, Optional
-
-from pydantic import Field, model_validator
-from pydantic_settings import BaseSettings, SettingsConfigDict
-
-logger = logging.getLogger(__name__)
-
-
-class Settings(BaseSettings):
- """Application-wide configuration."""
-
- model_config = SettingsConfigDict(env_file=(".env", ".env.local"), env_prefix="RIGHTCODES_")
-
- app_name: str = "RightCodes Server"
- api_prefix: str = "/api"
- cors_origins: List[str] = ["http://localhost:5173"]
- storage_dir: Path = Field(
- default_factory=lambda: (Path(__file__).resolve().parents[3] / "storage")
- )
- debug: bool = False
- openai_api_key: Optional[str] = None
- openai_api_base: str = "https://api.openai.com/v1"
- openai_model_extract: str = "gpt-4.1-mini"
- openai_model_mapping: str = "gpt-4.1-mini"
- openai_model_rewrite: str = "gpt-4o-mini"
- openai_model_validate: str = "gpt-4.1-mini"
- openai_model_embed: str = "text-embedding-3-large"
-
- @model_validator(mode="after")
- def _normalise_paths(self) -> "Settings":
- base_dir = Path(__file__).resolve().parents[3]
- storage_dir = self.storage_dir.expanduser()
- if not storage_dir.is_absolute():
- storage_dir = (base_dir / storage_dir).resolve()
- else:
- storage_dir = storage_dir.resolve()
- self.storage_dir = storage_dir
- self._warn_on_duplicate_storage_root(base_dir)
- return self
-
- def _warn_on_duplicate_storage_root(self, base_dir: Path) -> None:
- """Emit a warning if another storage directory is detected inside server/."""
- server_storage = base_dir / "server" / "storage"
- if server_storage.resolve() == self.storage_dir:
- return
- if server_storage.exists():
- try:
- if any(server_storage.iterdir()):
- logger.warning(
- "Detected stale server/storage directory at %s. "
- "All runtime artefacts now live under %s. "
- "Consider migrating or removing the duplicate to avoid confusion.",
- server_storage,
- self.storage_dir,
- )
- except OSError as exc: # noqa: BLE001
- logger.debug("Unable to inspect legacy storage directory %s: %s", server_storage, exc)
-
-
-@lru_cache
-def get_settings() -> Settings:
- """Return cached settings instance."""
- return Settings()
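The `env_prefix` is what ties the deleted `.env.example` above to these fields: pydantic-settings matches `RIGHTCODES_<FIELD>` case-insensitively and parses complex types such as `cors_origins` from JSON strings. A minimal sketch of that mapping:

```python
# Sketch: how pydantic-settings maps RIGHTCODES_* variables onto Settings fields.
import os

from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_prefix="RIGHTCODES_")

    app_name: str = "RightCodes Server"
    debug: bool = False
    cors_origins: list[str] = ["http://localhost:5173"]


os.environ["RIGHTCODES_DEBUG"] = "true"
os.environ["RIGHTCODES_CORS_ORIGINS"] = '["http://localhost:5173", "https://example.test"]'

settings = Settings()
assert settings.debug is True
assert settings.cors_origins[1] == "https://example.test"
```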
diff --git a/server/app/core/logging.py b/server/app/core/logging.py
deleted file mode 100644
index b00024e513ffe071b57b27a2253a7cba671fdd62..0000000000000000000000000000000000000000
--- a/server/app/core/logging.py
+++ /dev/null
@@ -1,27 +0,0 @@
-import logging
-from logging.config import dictConfig
-
-
-def configure_logging(debug: bool = False) -> None:
- """Configure application logging."""
- level = "DEBUG" if debug else "INFO"
- dictConfig(
- {
- "version": 1,
- "disable_existing_loggers": False,
- "formatters": {
- "default": {
- "format": "%(asctime)s %(levelname)s [%(name)s] %(message)s",
- }
- },
- "handlers": {
- "console": {
- "class": "logging.StreamHandler",
- "formatter": "default",
- "level": level,
- }
- },
- "root": {"handlers": ["console"], "level": level},
- }
- )
- logging.getLogger(__name__).debug("Logging configured (level=%s)", level)
diff --git a/server/app/events/__init__.py b/server/app/events/__init__.py
deleted file mode 100644
index 74e7e73caf18035b45e13f241a918cf5f13b4a7e..0000000000000000000000000000000000000000
--- a/server/app/events/__init__.py
+++ /dev/null
@@ -1,3 +0,0 @@
-from .notifier import EventNotifier
-
-__all__ = ["EventNotifier"]
diff --git a/server/app/events/notifier.py b/server/app/events/notifier.py
deleted file mode 100644
index 4ed30075755caa015d6618c2a1ea5009b892f24c..0000000000000000000000000000000000000000
--- a/server/app/events/notifier.py
+++ /dev/null
@@ -1,32 +0,0 @@
-from __future__ import annotations
-
-import logging
-from dataclasses import dataclass
-from typing import Callable, Optional
-
-logger = logging.getLogger(__name__)
-
-
-@dataclass
-class Event:
- name: str
- payload: dict
-
-
-EventListener = Callable[[Event], None]
-
-
-class EventNotifier:
- """Simple pub/sub placeholder until real event bus is wired."""
-
- def __init__(self) -> None:
- self._listeners: list[EventListener] = []
-
- def connect(self, listener: EventListener) -> None:
- self._listeners.append(listener)
-
- def emit(self, name: str, payload: Optional[dict] = None) -> None:
- event = Event(name=name, payload=payload or {})
- logger.debug("Emitting event %s", event.name)
- for listener in self._listeners:
- listener(event)
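Usage is deliberately plain, as in this sketch (assuming the `EventNotifier` above is importable):

```python
notifier = EventNotifier()
audit_log: list[str] = []

notifier.connect(lambda event: audit_log.append(f"{event.name}: {event.payload}"))
notifier.emit("session.created", {"session_id": "abc123"})

print(audit_log)  # ["session.created: {'session_id': 'abc123'}"]
```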
diff --git a/server/app/main.py b/server/app/main.py
index b4230d12b571e63fb39cab2bc616a978f4059ec2..ee2c13e38bf49cb315f01a1d5ff39990b5687533 100644
--- a/server/app/main.py
+++ b/server/app/main.py
@@ -1,32 +1,22 @@
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
-from .api.routes import api_router
-from .core.config import get_settings
-from .core.logging import configure_logging
-
def create_app() -> FastAPI:
- settings = get_settings()
- configure_logging(settings.debug)
-
- app = FastAPI(
- title=settings.app_name,
- version="0.1.0",
- docs_url=f"{settings.api_prefix}/docs",
- redoc_url=f"{settings.api_prefix}/redoc",
- debug=settings.debug,
- )
+ app = FastAPI(title="Starter API", version="0.1.0")
app.add_middleware(
CORSMiddleware,
- allow_origins=settings.cors_origins,
+ allow_origins=["http://localhost:5173"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
- app.include_router(api_router, prefix=settings.api_prefix)
+ @app.get("/health", summary="Health check")
+ async def healthcheck() -> dict[str, str]:
+ return {"status": "ok"}
+
return app
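The surviving app is small enough to smoke-test in-process. A sketch with FastAPI's `TestClient`, assuming the repository root is on `sys.path` so `server.app` resolves (the factory is re-exported by `server/app/__init__.py` above):

```python
from fastapi.testclient import TestClient

from server.app import create_app

client = TestClient(create_app())
resp = client.get("/health")

assert resp.status_code == 200
assert resp.json() == {"status": "ok"}
```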
diff --git a/server/app/models/__init__.py b/server/app/models/__init__.py
deleted file mode 100644
index 5455f8f644623f47eef2ad96af8442fee037864b..0000000000000000000000000000000000000000
--- a/server/app/models/__init__.py
+++ /dev/null
@@ -1,4 +0,0 @@
-from .preset import StandardsPreset
-from .session import Session, SessionStatus
-
-__all__ = ["Session", "SessionStatus", "StandardsPreset"]
diff --git a/server/app/models/preset.py b/server/app/models/preset.py
deleted file mode 100644
index 5f1071142e78be2950684b5a3b5d0c765acfff91..0000000000000000000000000000000000000000
--- a/server/app/models/preset.py
+++ /dev/null
@@ -1,27 +0,0 @@
-from __future__ import annotations
-
-from dataclasses import dataclass, field
-from datetime import datetime
-from pathlib import Path
-from typing import Any, List, Optional
-from uuid import UUID, uuid4
-
-
-@dataclass()
-class StandardsPreset:
- """Represents a reusable collection of parsed standards documents."""
-
- name: str
- standards_docs: list[Path]
- parsed_payloads: list[dict[str, Any]]
- description: Optional[str] = None
- id: UUID = field(default_factory=uuid4)
- created_at: datetime = field(default_factory=datetime.utcnow)
- updated_at: datetime = field(default_factory=datetime.utcnow)
- status: str = "ready"
- processed_count: int = 0
- total_count: int = 0
- last_error: Optional[str] = None
-
- def touch(self) -> None:
- self.updated_at = datetime.utcnow()
diff --git a/server/app/models/session.py b/server/app/models/session.py
deleted file mode 100644
index 17fa22569c058d1b113c446b3e2ef0acdbd9878b..0000000000000000000000000000000000000000
--- a/server/app/models/session.py
+++ /dev/null
@@ -1,39 +0,0 @@
-from __future__ import annotations
-
-from dataclasses import dataclass, field
-from datetime import datetime
-from enum import Enum
-from pathlib import Path
-from typing import Any, Optional
-from uuid import UUID, uuid4
-
-
-class SessionStatus(str, Enum):
- CREATED = "created"
- PROCESSING = "processing"
- REVIEW = "review"
- COMPLETED = "completed"
- FAILED = "failed"
-
-
-@dataclass()
-class Session:
- """Represents a document conversion session."""
-
- name: str
- source_doc: Path
- target_standard: str
- destination_standard: str
- standards_docs: list[Path] = field(default_factory=list)
- logs: list[str] = field(default_factory=list)
- id: UUID = field(default_factory=uuid4)
- created_at: datetime = field(default_factory=datetime.utcnow)
- updated_at: datetime = field(default_factory=datetime.utcnow)
- status: SessionStatus = SessionStatus.CREATED
- metadata: dict[str, Any] = field(default_factory=dict)
- last_error: Optional[str] = None
-
- def update_status(self, status: SessionStatus, *, error: Optional[str] = None) -> None:
- self.status = status
- self.updated_at = datetime.utcnow()
- self.last_error = error
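A quick lifecycle sketch, assuming the `Session` and `SessionStatus` above: `update_status` stamps `updated_at` and resets `last_error` unless an error is passed.

```python
from pathlib import Path

session = Session(
    name="Demo conversion",
    source_doc=Path("storage/originals/report.docx"),  # placeholder path
    target_standard="ASME B31.3",
    destination_standard="API 570",
)
print(session.status)  # SessionStatus.CREATED

session.update_status(SessionStatus.PROCESSING)
session.update_status(SessionStatus.FAILED, error="parser crashed")
print(session.last_error)  # "parser crashed"
```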
diff --git a/server/app/services/__init__.py b/server/app/services/__init__.py
deleted file mode 100644
index d263399a83a5c03929b1d39f5755397305d3a1dd..0000000000000000000000000000000000000000
--- a/server/app/services/__init__.py
+++ /dev/null
@@ -1,12 +0,0 @@
-from .diagnostics_service import DiagnosticsService, get_diagnostics_service
-from .file_cache import FileCache
-from .preset_service import PresetService
-from .session_service import SessionService
-
-__all__ = [
- "SessionService",
- "FileCache",
- "PresetService",
- "DiagnosticsService",
- "get_diagnostics_service",
-]
diff --git a/server/app/services/diagnostics_service.py b/server/app/services/diagnostics_service.py
deleted file mode 100644
index e4db711ad546a031e03de9af19e2fc7c1c214e06..0000000000000000000000000000000000000000
--- a/server/app/services/diagnostics_service.py
+++ /dev/null
@@ -1,121 +0,0 @@
-from __future__ import annotations
-
-from collections import deque
-from dataclasses import dataclass
-from datetime import datetime, timezone
-from functools import lru_cache
-from threading import Lock
-from typing import Any, Deque, Dict, List, Optional
-from uuid import uuid4
-
-
-@dataclass
-class DiagnosticsEvent:
- id: str
- timestamp: str
- event_type: str
- message: str
- node_id: Optional[str]
- metadata: Dict[str, Any]
-
-
-class DiagnosticsService:
- """Lightweight in-memory tracker for system topology and recent activity."""
-
- _MAX_EVENTS = 256
-
- def __init__(self) -> None:
- self._lock = Lock()
- self._events: Deque[DiagnosticsEvent] = deque(maxlen=self._MAX_EVENTS)
- self._node_activity: Dict[str, str] = {}
- self._topology = self._build_default_topology()
-
- def _build_default_topology(self) -> Dict[str, List[Dict[str, Any]]]:
- nodes = [
- {"id": "uploads", "label": "Uploads", "group": "storage", "position": {"x": 8, "y": 45}},
- {"id": "sessions", "label": "Sessions", "group": "service", "position": {"x": 25, "y": 25}},
- {"id": "pipeline", "label": "Pipeline", "group": "service", "position": {"x": 45, "y": 25}},
- {"id": "manifests", "label": "Manifests", "group": "storage", "position": {"x": 65, "y": 15}},
- {"id": "embeddings", "label": "Embeddings", "group": "storage", "position": {"x": 65, "y": 35}},
- {"id": "exports", "label": "Exports", "group": "storage", "position": {"x": 85, "y": 45}},
- {"id": "presets", "label": "Presets", "group": "storage", "position": {"x": 25, "y": 45}},
- {"id": "openai", "label": "OpenAI API", "group": "external", "position": {"x": 45, "y": 5}},
- ]
- edges = [
- {"id": "uploads->sessions", "source": "uploads", "target": "sessions"},
- {"id": "sessions->pipeline", "source": "sessions", "target": "pipeline"},
- {"id": "presets->pipeline", "source": "presets", "target": "pipeline"},
- {"id": "pipeline->manifests", "source": "pipeline", "target": "manifests"},
- {"id": "pipeline->embeddings", "source": "pipeline", "target": "embeddings"},
- {"id": "pipeline->exports", "source": "pipeline", "target": "exports"},
- {"id": "pipeline->openai", "source": "pipeline", "target": "openai"},
- ]
- return {"nodes": nodes, "edges": edges}
-
- def record_event(
- self,
- *,
- event_type: str,
- message: str,
- node_id: Optional[str] = None,
- metadata: Optional[Dict[str, Any]] = None,
- ) -> DiagnosticsEvent:
- timestamp = datetime.now(timezone.utc).isoformat()
- event = DiagnosticsEvent(
- id=str(uuid4()),
- timestamp=timestamp,
- event_type=event_type,
- message=message,
- node_id=node_id,
- metadata=dict(metadata or {}),
- )
- with self._lock:
- self._events.appendleft(event)
- if node_id:
- self._node_activity[node_id] = timestamp
- return event
-
- def get_topology(self) -> Dict[str, Any]:
- with self._lock:
- nodes = [
- {
- **node,
- "last_event_at": self._node_activity.get(node["id"]),
- }
- for node in self._topology["nodes"]
- ]
- return {"nodes": nodes, "edges": list(self._topology["edges"])}
-
- def get_events(self, *, limit: int = 50, since: Optional[str] = None) -> List[Dict[str, Any]]:
- with self._lock:
- events = list(self._events)
- if since:
- try:
- since_dt = datetime.fromisoformat(since)
- events = [
- event for event in events if datetime.fromisoformat(event.timestamp) >= since_dt
- ]
- except (TypeError, ValueError):
- # Ignore malformed or offset-naive 'since' values; return the full list instead.
- pass
- return [
- {
- "id": event.id,
- "timestamp": event.timestamp,
- "event_type": event.event_type,
- "message": event.message,
- "node_id": event.node_id,
- "metadata": event.metadata,
- }
- for event in events[:limit]
- ]
-
- def clear(self) -> None:
- with self._lock:
- self._events.clear()
- self._node_activity.clear()
-
-
-@lru_cache
-def get_diagnostics_service() -> DiagnosticsService:
- return DiagnosticsService()
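To see the tracker end to end, a short sketch (assuming the `get_diagnostics_service` above): recording an event also stamps the node's `last_event_at` in the topology.

```python
service = get_diagnostics_service()

service.record_event(
    node_id="pipeline",
    event_type="stage.completed",
    message="Extraction finished",
    metadata={"session_id": "abc123"},
)

events = service.get_events(limit=10)
print(events[0]["event_type"])  # "stage.completed"

topology = service.get_topology()
pipeline = next(node for node in topology["nodes"] if node["id"] == "pipeline")
print(pipeline["last_event_at"])  # ISO timestamp of the event above
```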
diff --git a/server/app/services/file_cache.py b/server/app/services/file_cache.py
deleted file mode 100644
index 7eea5dfc3df1107534c02377377a263d858c51a4..0000000000000000000000000000000000000000
--- a/server/app/services/file_cache.py
+++ /dev/null
@@ -1,53 +0,0 @@
-from __future__ import annotations
-
-import hashlib
-import json
-from pathlib import Path
-from typing import Any, Optional
-
-from ..core.config import get_settings
-
-
-class FileCache:
- """Content-addressed cache for expensive file processing results."""
-
- def __init__(self, base_dir: Optional[Path] = None) -> None:
- settings = get_settings()
- cache_root = base_dir or Path(settings.storage_dir) / "cache" / "files"
- cache_root.mkdir(parents=True, exist_ok=True)
- self._cache_root = cache_root
-
- def compute_key(self, file_path: Path, namespace: str, *, extra: Optional[str] = None) -> str:
- """Return a deterministic key derived from file contents and optional metadata."""
- digest = _hash_file(file_path)
- components = [namespace, digest]
- if extra:
- components.append(extra)
- return hashlib.sha256("::".join(components).encode("utf-8")).hexdigest()
-
- def load(self, namespace: str, key: str) -> Optional[dict[str, Any]]:
- """Return cached payload for the given namespace/key pair, if it exists."""
- cache_path = self._cache_path(namespace, key)
- if not cache_path.exists():
- return None
- with cache_path.open("r", encoding="utf-8") as handle:
- return json.load(handle)
-
- def store(self, namespace: str, key: str, payload: dict[str, Any]) -> None:
- """Persist payload to the cache."""
- cache_path = self._cache_path(namespace, key)
- cache_path.parent.mkdir(parents=True, exist_ok=True)
- temp_path = cache_path.with_suffix(".json.tmp")
- temp_path.write_text(json.dumps(payload, indent=2), encoding="utf-8")
- temp_path.replace(cache_path)
-
- def _cache_path(self, namespace: str, key: str) -> Path:
- return self._cache_root / namespace / f"{key}.json"
-
-
-def _hash_file(path: Path) -> str:
- hasher = hashlib.sha256()
- with path.open("rb") as handle:
- for chunk in iter(lambda: handle.read(1024 * 1024), b""):
- hasher.update(chunk)
- return hasher.hexdigest()
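The cache is content-addressed: a key mixes a namespace, the file's SHA-256 digest, and optional extra metadata, so a parse is reused only while the bytes are unchanged. A sketch with placeholder paths (assuming the `FileCache` above):

```python
from pathlib import Path

cache = FileCache(base_dir=Path("/tmp/rightcodes-cache"))
pdf = Path("standards.pdf")  # placeholder; must exist so it can be hashed

key = cache.compute_key(pdf, "standards-parse", extra="max_chunk_chars=1200")
payload = cache.load("standards-parse", key)
if payload is None:
    payload = {"chunks": ["..."]}  # stand-in for an expensive parse result
    cache.store("standards-parse", key, payload)

# Editing standards.pdf changes its SHA-256 digest, so the next compute_key
# yields a different key and the stale entry is simply never hit again.
```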
diff --git a/server/app/services/preset_service.py b/server/app/services/preset_service.py
deleted file mode 100644
index ed4dd5aabf4421894a64aa055e3be7e9fe0d784a..0000000000000000000000000000000000000000
--- a/server/app/services/preset_service.py
+++ /dev/null
@@ -1,408 +0,0 @@
-from __future__ import annotations
-
-import json
-from datetime import datetime
-from pathlib import Path
-from typing import Any, Dict, Iterable, List, Optional
-from uuid import UUID
-
-from workers.pdf_processing import parse_pdf
-
-from ..core.config import get_settings
-from ..models import StandardsPreset
-from ..services.diagnostics_service import get_diagnostics_service
-from ..utils.paths import canonical_storage_path, resolve_storage_path, to_storage_relative
-from ..workflows.serializers import serialize_pdf_result
-from .file_cache import FileCache
-
-
-class PresetService:
- """File-backed store for standards presets."""
-
- def __init__(self) -> None:
- settings = get_settings()
- storage_root = Path(settings.storage_dir) / "presets"
- storage_root.mkdir(parents=True, exist_ok=True)
- self._storage_root = storage_root
- self._presets: dict[str, StandardsPreset] = {}
- self._progress: Dict[str, Dict[str, Any]] = {}
- self._diagnostics = get_diagnostics_service()
- self._load_existing_presets()
-
- def list_presets(self) -> Iterable[StandardsPreset]:
- return self._presets.values()
-
- def get_preset(self, preset_id: str) -> Optional[StandardsPreset]:
- preset = self._presets.get(str(preset_id))
- if not preset:
- return None
- return self._ensure_payloads_loaded(preset)
-
- def start_preset(
- self,
- *,
- name: str,
- documents: List[Path],
- description: Optional[str] = None,
- ) -> StandardsPreset:
- normalized_docs: list[Path] = []
- seen_paths = set()
- for doc in documents:
- doc_path = resolve_storage_path(doc)
- doc_key = self._path_key(doc_path)
- if doc_key in seen_paths:
- continue
- seen_paths.add(doc_key)
- normalized_docs.append(doc_path)
-
- if not normalized_docs:
- raise ValueError("Preset requires at least one standards document.")
-
- preset = StandardsPreset(
- name=name,
- description=description,
- standards_docs=normalized_docs,
- parsed_payloads=[],
- status="processing",
- total_count=len(normalized_docs),
- processed_count=0,
- last_error=None,
- )
- self._save_preset(preset)
- self._progress[str(preset.id)] = {
- "total": preset.total_count,
- "processed": 0,
- "status": preset.status,
- }
- self._diagnostics.record_event(
- node_id="presets",
- event_type="preset.created",
- message=f"Preset `{preset.name}` created",
- metadata={"preset_id": str(preset.id), "documents": preset.total_count},
- )
- return preset
-
- def process_preset(self, preset_id: str, file_cache: Optional[FileCache] = None) -> None:
- preset = self.get_preset(preset_id)
- if not preset:
- return
- payloads: list[dict[str, Any]] = []
- try:
- if not preset.standards_docs:
- preset.status = "ready"
- preset.total_count = 0
- preset.processed_count = 0
- preset.parsed_payloads = []
- preset.last_error = None
- self._save_preset(preset)
- self._progress[str(preset.id)] = {
- "total": 0,
- "processed": 0,
- "status": preset.status,
- }
- return
-
- for index, path in enumerate(preset.standards_docs, start=1):
- resolved_path = resolve_storage_path(path)
- preset.standards_docs[index - 1] = resolved_path
- if not resolved_path.exists():
- raise FileNotFoundError(resolved_path)
- payload = self._load_or_parse(resolved_path, file_cache)
- payloads.append(payload)
- preset.parsed_payloads = payloads.copy()
- preset.processed_count = index
- preset.status = "processing"
- preset.last_error = None
- self._save_preset(preset)
- self._progress[str(preset.id)] = {
- "total": preset.total_count,
- "processed": preset.processed_count,
- "status": preset.status,
- }
- self._diagnostics.record_event(
- node_id="presets",
- event_type="preset.process",
- message=f"Parsed preset document `{resolved_path.name}`",
- metadata={"preset_id": str(preset.id), "path": to_storage_relative(resolved_path)},
- )
- preset.status = "ready"
- preset.processed_count = preset.total_count
- preset.parsed_payloads = payloads
- self._save_preset(preset)
- self._progress[str(preset.id)] = {
- "total": preset.total_count,
- "processed": preset.processed_count,
- "status": preset.status,
- }
- preset.parsed_payloads = []
- self._diagnostics.record_event(
- node_id="presets",
- event_type="preset.ready",
- message=f"Preset `{preset.name}` ready",
- metadata={"preset_id": str(preset.id)},
- )
- except Exception as exc: # noqa: BLE001
- preset.status = "failed"
- preset.last_error = str(exc)
- self._save_preset(preset)
- self._progress[str(preset.id)] = {
- "total": preset.total_count,
- "processed": preset.processed_count,
- "status": preset.status,
- "error": str(exc),
- }
- self._diagnostics.record_event(
- node_id="presets",
- event_type="preset.failed",
- message=f"Preset `{preset.name}` failed: {exc}",
- metadata={"preset_id": str(preset.id), "error": str(exc)},
- )
-
- def _load_or_parse(self, path: Path, file_cache: Optional[FileCache]) -> dict[str, Any]:
- path = resolve_storage_path(path)
- payload: Optional[dict[str, Any]] = None
- cache_key = None
- if file_cache is not None:
- cache_key = file_cache.compute_key(path, "standards-parse", extra="max_chunk_chars=1200")
- cached = file_cache.load("standards-parse", cache_key)
- if cached:
- payload = dict(cached)
- payload["path"] = to_storage_relative(path)
- if payload is None:
- result = parse_pdf(path)
- payload = serialize_pdf_result(result)
- payload["path"] = to_storage_relative(path)
- if file_cache is not None and cache_key:
- file_cache.store("standards-parse", cache_key, payload)
- return payload
-
- def add_documents(self, preset_id: str, documents: List[Path]) -> Optional[StandardsPreset]:
- preset = self._presets.get(str(preset_id))
- if not preset:
- return None
- existing = {self._path_key(path) for path in preset.standards_docs}
- new_paths: list[Path] = []
- for doc in documents:
- doc_path = resolve_storage_path(doc)
- key = self._path_key(doc_path)
- if key in existing:
- continue
- new_paths.append(doc_path)
- existing.add(key)
- if not new_paths:
- return preset
- preset.standards_docs.extend(new_paths)
- preset.total_count = len(preset.standards_docs)
- preset.processed_count = 0
- preset.status = "processing"
- preset.last_error = None
- preset.parsed_payloads = []
- self._save_preset(preset)
- self._progress[str(preset.id)] = {
- "total": preset.total_count,
- "processed": 0,
- "status": preset.status,
- }
- self._diagnostics.record_event(
- node_id="presets",
- event_type="preset.updated",
- message=f"Added {len(new_paths)} document(s) to preset `{preset.name}`",
- metadata={"preset_id": str(preset.id)},
- )
- return preset
-
- def remove_document(self, preset_id: str, document_path: str) -> Optional[StandardsPreset]:
- preset = self._presets.get(str(preset_id))
- if not preset:
- return None
- document_key = self._path_key(resolve_storage_path(document_path))
- updated_docs = [path for path in preset.standards_docs if self._path_key(path) != document_key]
- if len(updated_docs) == len(preset.standards_docs):
- return preset
- preset.standards_docs = updated_docs
- preset.total_count = len(updated_docs)
- preset.processed_count = 0
- preset.status = "processing" if updated_docs else "ready"
- preset.last_error = None
- preset.parsed_payloads = []
- self._save_preset(preset)
- self._progress[str(preset.id)] = {
- "total": preset.total_count,
- "processed": 0,
- "status": preset.status,
- }
- self._diagnostics.record_event(
- node_id="presets",
- event_type="preset.updated",
- message=f"Removed document from preset `{preset.name}`",
- metadata={"preset_id": str(preset.id), "path": document_path},
- )
- return preset
-
- def get_progress(self, preset_id: str) -> Optional[Dict[str, Any]]:
- progress = self._progress.get(str(preset_id))
- if progress:
- return progress
- preset = self.get_preset(preset_id)
- if not preset:
- return None
- return {
- "total": preset.total_count,
- "processed": preset.processed_count,
- "status": preset.status,
- "error": preset.last_error,
- }
-
- def update_preset(
- self,
- preset_id: str,
- *,
- name: Optional[str] = None,
- description: Optional[str] = None,
- ) -> Optional[StandardsPreset]:
- preset = self._presets.get(str(preset_id))
- if not preset:
- return None
- if name is not None:
- preset.name = name
- if description is not None:
- preset.description = description
- self._save_preset(preset)
- return preset
-
- def delete_preset(self, preset_id: str) -> bool:
- preset = self._presets.pop(str(preset_id), None)
- if not preset:
- return False
- manifest_path = self._storage_root / f"{preset.id}.json"
- data_path = self._data_path(preset.id)
- if manifest_path.exists():
- manifest_path.unlink()
- if data_path.exists():
- data_path.unlink()
- self._progress.pop(str(preset.id), None)
- return True
-
- def _save_preset(self, preset: StandardsPreset) -> None:
- preset.touch()
- self._presets[str(preset.id)] = preset
- self._write_manifest(preset)
- self._write_payloads(preset)
-
- def _write_manifest(self, preset: StandardsPreset) -> None:
- manifest_path = self._storage_root / f"{preset.id}.json"
- payload = {
- "id": str(preset.id),
- "name": preset.name,
- "description": preset.description,
- "standards_docs": [to_storage_relative(path) for path in preset.standards_docs],
- "created_at": preset.created_at.isoformat(),
- "updated_at": preset.updated_at.isoformat(),
- "status": preset.status,
- "processed_count": preset.processed_count,
- "total_count": preset.total_count,
- "last_error": preset.last_error,
- }
- manifest_path.write_text(json.dumps(payload, indent=2), encoding="utf-8")
-
- def _load_existing_presets(self) -> None:
- for manifest_path in sorted(self._storage_root.glob("*.json")):
- if manifest_path.name.endswith(".data.json"):
- continue # payload files are loaded lazily via _ensure_payloads_loaded
- try:
- raw = manifest_path.read_text(encoding="utf-8")
- data = json.loads(raw)
- payloads = data.get("parsed_payloads")
- preset = StandardsPreset(
- id=UUID(data["id"]),
- name=data["name"],
- description=data.get("description"),
- standards_docs=[
- resolve_storage_path(path) for path in data.get("standards_docs", [])
- ],
- parsed_payloads=list(payloads) if isinstance(payloads, list) else [],
- created_at=_parse_datetime(data.get("created_at")),
- updated_at=_parse_datetime(data.get("updated_at")),
- status=data.get("status", "ready"),
- processed_count=int(
- data.get(
- "processed_count",
- len(payloads) if isinstance(payloads, list) else 0,
- )
- ),
- total_count=int(data.get("total_count", len(data.get("standards_docs", [])))),
- last_error=data.get("last_error"),
- )
- except Exception: # noqa: BLE001
- continue
- self._presets[str(preset.id)] = preset
- if preset.parsed_payloads:
- self._write_payloads(preset)
- preset.parsed_payloads = []
- self._write_manifest(preset)
- if preset.status != "ready":
- self._progress[str(preset.id)] = {
- "total": preset.total_count or len(preset.standards_docs),
- "processed": preset.processed_count,
- "status": preset.status,
- "error": preset.last_error,
- }
-
- def _ensure_payloads_loaded(self, preset: StandardsPreset) -> StandardsPreset:
- if preset.parsed_payloads:
- return preset
- data_path = self._data_path(preset.id)
- if data_path.exists():
- try:
- raw = data_path.read_text(encoding="utf-8")
- data = json.loads(raw)
- parsed = data.get("parsed_payloads", [])
- if isinstance(parsed, list):
- normalised: list[dict[str, Any]] = []
- for entry in parsed:
- if not isinstance(entry, dict):
- continue
- normalised_entry = dict(entry)
- path_value = normalised_entry.get("path")
- if path_value:
- normalised_entry["path"] = to_storage_relative(path_value)
- normalised.append(normalised_entry)
- preset.parsed_payloads = normalised
- except Exception: # noqa: BLE001
- preset.parsed_payloads = []
- return preset
-
- def _write_payloads(self, preset: StandardsPreset) -> None:
- # The early return keeps existing .data.json files intact when the
- # in-memory payloads have been cleared (see _load_existing_presets).
- if not preset.parsed_payloads:
- return
- data_path = self._data_path(preset.id)
- serialisable: list[dict[str, Any]] = []
- for entry in preset.parsed_payloads:
- if not isinstance(entry, dict):
- continue
- normalised_entry = dict(entry)
- path_value = normalised_entry.get("path")
- if path_value:
- normalised_entry["path"] = to_storage_relative(path_value)
- serialisable.append(normalised_entry)
- if not serialisable:
- return
- data = {"parsed_payloads": serialisable}
- data_path.write_text(json.dumps(data, indent=2), encoding="utf-8")
-
- def _data_path(self, preset_id: UUID | str) -> Path:
- return self._storage_root / f"{preset_id}.data.json"
-
- @staticmethod
- def _path_key(path: Path | str) -> str:
- return canonical_storage_path(path)
-
-
-def _parse_datetime(value: Optional[str]) -> datetime:
- if value is None:
- return datetime.utcnow()
- try:
- return datetime.fromisoformat(value)
- except ValueError:
- return datetime.utcnow()
diff --git a/server/app/services/session_service.py b/server/app/services/session_service.py
deleted file mode 100644
index e84cbfb87fc4096de9b43d25042ad4e554fa875d..0000000000000000000000000000000000000000
--- a/server/app/services/session_service.py
+++ /dev/null
@@ -1,248 +0,0 @@
-from __future__ import annotations
-
-import json
-from datetime import datetime
-from pathlib import Path
-from typing import Any, Iterable, Optional, List
-from uuid import UUID
-import logging
-
-from ..core.config import get_settings
-from ..models import Session, SessionStatus
-from ..services.diagnostics_service import get_diagnostics_service
-from ..utils.paths import resolve_storage_path, to_storage_relative
-
-logger = logging.getLogger(__name__)
-
-
-class SessionService:
- """Simple in-memory + file-backed session registry."""
-
- def __init__(self) -> None:
- settings = get_settings()
- storage_root = Path(settings.storage_dir) / "manifests"
- storage_root.mkdir(parents=True, exist_ok=True)
- self._storage_root = storage_root
- self._sessions: dict[str, Session] = {}
- self._diagnostics = get_diagnostics_service()
- logger.info("SessionService initialized with storage %s", self._storage_root)
- self._manifest_index: dict[str, float] = {}
- self._load_existing_sessions()
- logger.info("SessionService startup loaded %d session(s)", len(self._sessions))
-
- def list_sessions(self) -> Iterable[Session]:
- self._load_existing_sessions()
- return self._sessions.values()
-
- def get_session(self, session_id: str) -> Optional[Session]:
- return self._sessions.get(session_id)
-
- def create_session(
- self,
- *,
- name: str,
- source_doc: Path,
- target_standard: str,
- destination_standard: str,
- standards_docs: Optional[List[Path]] = None,
- metadata: Optional[dict[str, Any]] = None,
- ) -> Session:
- source_doc = resolve_storage_path(source_doc)
- normalised_docs = [resolve_storage_path(path) for path in (standards_docs or [])]
- session = Session(
- name=name,
- source_doc=source_doc,
- target_standard=target_standard,
- destination_standard=destination_standard,
- standards_docs=normalised_docs,
- metadata=metadata or {},
- )
- session.logs.append("Session created.")
- session.logs.append(f"Attached {len(session.standards_docs)} standards PDF(s).")
- self.save_session(session)
- self._diagnostics.record_event(
- node_id="sessions",
- event_type="session.created",
- message=f"Session `{session.name}` created",
- metadata={"session_id": str(session.id)},
- )
- return session
-
- def update_status(
- self, session_id: str, status: SessionStatus, *, error: Optional[str] = None
- ) -> Optional[Session]:
- session = self._sessions.get(session_id)
- if not session:
- return None
- session.update_status(status, error=error)
- session.logs.append(f"Status changed to {session.status.value}.")
- self.save_session(session)
- self._diagnostics.record_event(
- node_id="sessions",
- event_type="session.status",
- message=f"Session `{session.name}` status -> {session.status.value}",
- metadata={"session_id": str(session.id), "status": session.status.value},
- )
- return session
-
- def save_session(self, session: Session) -> None:
- self._sessions[str(session.id)] = session
- self._write_manifest(session)
-
- def store_stage_output(self, session: Session, key: str, value: Any) -> None:
- session.metadata[key] = value
- self.save_session(session)
-
- def _write_manifest(self, session: Session) -> None:
- manifest_path = self._storage_root / f"{session.id}.json"
- metadata = self._normalise_metadata(session.metadata)
- session.metadata = metadata
- payload = {
- "id": str(session.id),
- "name": session.name,
- "status": session.status.value,
- "source_doc": to_storage_relative(session.source_doc),
- "target_standard": session.target_standard,
- "destination_standard": session.destination_standard,
- "standards_docs": [to_storage_relative(path) for path in session.standards_docs],
- "created_at": session.created_at.isoformat(),
- "updated_at": session.updated_at.isoformat(),
- "metadata": metadata,
- "last_error": session.last_error,
- "logs": session.logs,
- }
- manifest_path.write_text(json.dumps(payload, indent=2), encoding="utf-8")
- self._diagnostics.record_event(
- node_id="manifests",
- event_type="manifest.write",
- message=f"Manifest updated for session `{session.name}`",
- metadata={"session_id": str(session.id), "path": str(manifest_path)},
- )
-
- def _load_existing_sessions(self) -> None:
- current_files = {path.stem: path for path in self._storage_root.glob("*.json")}
- # remove sessions whose manifest no longer exists
- removed_ids = set(self._sessions.keys()) - set(current_files.keys())
- for session_id in removed_ids:
- self._sessions.pop(session_id, None)
- self._manifest_index.pop(session_id, None)
- logger.info("SessionService removed missing session %s", session_id)
-
- total = 0
- skipped = 0
- for session_id, manifest_path in current_files.items():
- try:
- mtime = manifest_path.stat().st_mtime
- except OSError as exc:
- logger.warning("Unable to stat manifest %s: %s", manifest_path, exc)
- continue
-
- previous_mtime = self._manifest_index.get(session_id)
- if previous_mtime is not None and abs(previous_mtime - mtime) < 1e-6:
- continue # unchanged
- try:
- raw = manifest_path.read_text(encoding="utf-8")
- data = json.loads(raw)
- session = self._session_from_manifest(data)
- except Exception as exc: # noqa: BLE001
- skipped += 1
- logger.exception("Failed to load session manifest %s: %s", manifest_path, exc)
- continue
- self._sessions[str(session.id)] = session
- self._manifest_index[str(session.id)] = mtime
- total += 1
- logger.info(
- "SessionService manifest scan complete: %d refreshed, %d skipped (from %s)",
- total,
- skipped,
- self._storage_root,
- )
-
- def _session_from_manifest(self, manifest: dict[str, Any]) -> Session:
- created_at = datetime.fromisoformat(manifest["created_at"])
- updated_at = datetime.fromisoformat(manifest["updated_at"])
- session = Session(
- id=UUID(manifest["id"]),
- name=manifest["name"],
- source_doc=resolve_storage_path(manifest["source_doc"]),
- target_standard=manifest["target_standard"],
- destination_standard=manifest["destination_standard"],
- standards_docs=[resolve_storage_path(path) for path in manifest.get("standards_docs", [])],
- logs=list(manifest.get("logs", [])),
- created_at=created_at,
- updated_at=updated_at,
- status=SessionStatus(manifest["status"]),
- metadata=self._normalise_metadata(manifest.get("metadata", {})),
- last_error=manifest.get("last_error"),
- )
- return session
-
- @staticmethod
- def _normalise_metadata(metadata: Any) -> dict[str, Any]:
- if not isinstance(metadata, dict):
- return {}
- result: dict[str, Any] = dict(metadata)
-
- doc_parse = result.get("doc_parse")
- if isinstance(doc_parse, dict):
- doc_copy = dict(doc_parse)
- path_value = doc_copy.get("path")
- if path_value:
- doc_copy["path"] = to_storage_relative(path_value)
- result["doc_parse"] = doc_copy
-
- standards_parse = result.get("standards_parse")
- if isinstance(standards_parse, list):
- normalised_chunks: list[Any] = []
- for entry in standards_parse:
- if isinstance(entry, dict):
- entry_copy = dict(entry)
- path_value = entry_copy.get("path")
- if path_value:
- entry_copy["path"] = to_storage_relative(path_value)
- normalised_chunks.append(entry_copy)
- else:
- normalised_chunks.append(entry)
- result["standards_parse"] = normalised_chunks
-
- preset_payloads = result.get("preset_standards_payload")
- if isinstance(preset_payloads, list):
- normalised_payloads: list[Any] = []
- for entry in preset_payloads:
- if isinstance(entry, dict):
- entry_copy = dict(entry)
- path_value = entry_copy.get("path")
- if path_value:
- entry_copy["path"] = to_storage_relative(path_value)
- normalised_payloads.append(entry_copy)
- else:
- normalised_payloads.append(entry)
- result["preset_standards_payload"] = normalised_payloads
-
- doc_paths = result.get("preset_standards_doc_paths")
- if isinstance(doc_paths, list):
- result["preset_standards_doc_paths"] = [to_storage_relative(path) for path in doc_paths]
-
- presets_meta = result.get("presets")
- if isinstance(presets_meta, list):
- normalised_presets: list[Any] = []
- for entry in presets_meta:
- if isinstance(entry, dict):
- entry_copy = dict(entry)
- documents = entry_copy.get("documents")
- if isinstance(documents, list):
- entry_copy["documents"] = [to_storage_relative(path) for path in documents]
- normalised_presets.append(entry_copy)
- else:
- normalised_presets.append(entry)
- result["presets"] = normalised_presets
-
- progress = result.get("standards_ingest_progress")
- if isinstance(progress, dict):
- progress_copy = dict(progress)
- current_file = progress_copy.get("current_file")
- if current_file:
- progress_copy["current_file"] = to_storage_relative(current_file)
- result["standards_ingest_progress"] = progress_copy
-
- return result
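The `_normalise_metadata` helper above rewrites every absolute path buried in a session manifest into the storage-relative form before it is persisted. A minimal sketch of the intended effect (assuming `storage_dir` resolves to `C:/app/storage`; the sample values are hypothetical):

```python
# Sketch only: what _normalise_metadata does to nested path fields.
metadata = {
    "doc_parse": {"path": "C:/app/storage/originals/report.docx", "paragraphs": []},
    "standards_parse": [{"path": "C:/app/storage/standards/as3600.pdf", "chunks": []}],
}

normalised = SessionService._normalise_metadata(metadata)
print(normalised["doc_parse"]["path"])           # storage/originals/report.docx
print(normalised["standards_parse"][0]["path"])  # storage/standards/as3600.pdf
# (separators follow the host OS; non-dict entries pass through untouched)
```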
diff --git a/server/app/utils/__init__.py b/server/app/utils/__init__.py
deleted file mode 100644
index 5ad3fd7b869271412ba7ac7ab8e5d3f02db61fe7..0000000000000000000000000000000000000000
--- a/server/app/utils/__init__.py
+++ /dev/null
@@ -1,9 +0,0 @@
-from .files import save_upload
-from .paths import canonical_storage_path, resolve_storage_path, to_storage_relative
-
-__all__ = [
- "save_upload",
- "canonical_storage_path",
- "resolve_storage_path",
- "to_storage_relative",
-]
diff --git a/server/app/utils/files.py b/server/app/utils/files.py
deleted file mode 100644
index 4470a7c8535066927d1306a80d2a06bdf78c8968..0000000000000000000000000000000000000000
--- a/server/app/utils/files.py
+++ /dev/null
@@ -1,32 +0,0 @@
-from __future__ import annotations
-
-from pathlib import Path
-from typing import BinaryIO
-
-from ..core.config import get_settings
-from ..services.diagnostics_service import get_diagnostics_service
-
-
-def save_upload(filename: str, fileobj: BinaryIO, *, subdir: str = "originals") -> Path:
- """Persist an uploaded file to the storage directory."""
- settings = get_settings()
- upload_dir = Path(settings.storage_dir) / subdir
- upload_dir.mkdir(parents=True, exist_ok=True)
- path = upload_dir / filename
- if path.exists():
- stem = path.stem
- suffix = path.suffix
- counter = 1
- while path.exists():
- path = upload_dir / f"{stem}_{counter}{suffix}"
- counter += 1
- with path.open("wb") as buffer:
- buffer.write(fileobj.read())
- diagnostics = get_diagnostics_service()
- diagnostics.record_event(
- node_id="uploads",
- event_type="file.saved",
- message=f"Stored upload `{path.name}`",
- metadata={"path": str(path)},
- )
- return path
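`save_upload` never overwrites an existing file: on a name collision it appends `_1`, `_2`, ... before the suffix. A hedged usage sketch (assumes the configured `storage_dir` points at `./storage`):

```python
# Hypothetical usage of save_upload's collision handling.
from io import BytesIO

save_upload("report.docx", BytesIO(b"v1"))  # -> storage/originals/report.docx
save_upload("report.docx", BytesIO(b"v2"))  # -> storage/originals/report_1.docx
save_upload("report.docx", BytesIO(b"v3"))  # -> storage/originals/report_2.docx
```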
diff --git a/server/app/utils/paths.py b/server/app/utils/paths.py
deleted file mode 100644
index c67eab778d0bae7d17f8019f20e56f45a34fbd43..0000000000000000000000000000000000000000
--- a/server/app/utils/paths.py
+++ /dev/null
@@ -1,37 +0,0 @@
-from __future__ import annotations
-
-from pathlib import Path
-
-from ..core.config import get_settings
-
-
-def resolve_storage_path(path: Path | str) -> Path:
- """Return an absolute Path for any file under the storage tree."""
- candidate = Path(path).expanduser()
- if candidate.is_absolute():
- return candidate.resolve()
-
- storage_dir = get_settings().storage_dir
- parts = list(candidate.parts)
- if parts and parts[0].lower() == "storage":
- relative = Path(*parts[1:]) if len(parts) > 1 else Path()
- return (storage_dir / relative).resolve()
-
- base_dir = Path(__file__).resolve().parents[3]
- return (base_dir / candidate).resolve()
-
-
-def to_storage_relative(path: Path | str) -> str:
- """Render a path relative to the storage root for manifest readability."""
- storage_dir = get_settings().storage_dir
- resolved = resolve_storage_path(path)
- try:
- rel = resolved.relative_to(storage_dir)
- return str(Path("storage") / rel)
- except ValueError:
- return str(resolved)
-
-
-def canonical_storage_path(path: Path | str) -> str:
- """Return a canonical string (POSIX) for comparing storage paths."""
- return resolve_storage_path(path).as_posix()
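Together these helpers give every manifest path one absolute form for file I/O, one `storage/...` form for readability, and one POSIX string for comparisons. A round-trip sketch (assuming `storage_dir` is `/srv/rightcodes/storage`):

```python
p = resolve_storage_path("storage/standards/as3600.pdf")
# -> /srv/rightcodes/storage/standards/as3600.pdf (absolute, for file I/O)

rel = to_storage_relative(p)
# -> storage/standards/as3600.pdf (compact form written into manifests)

canonical_storage_path(rel) == canonical_storage_path(p)
# -> True: both normalise to the same POSIX string, so lookups match
```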
diff --git a/server/app/workflows/__init__.py b/server/app/workflows/__init__.py
deleted file mode 100644
index 32f12719b4eb63587f3d67954e44b6a115b448c1..0000000000000000000000000000000000000000
--- a/server/app/workflows/__init__.py
+++ /dev/null
@@ -1,8 +0,0 @@
-from .pipeline import (
- PipelineOrchestrator,
- PipelineStage,
- PipelineStatus,
- register_default_stages,
-)
-
-__all__ = ["PipelineOrchestrator", "PipelineStage", "PipelineStatus", "register_default_stages"]
diff --git a/server/app/workflows/embeddings.py b/server/app/workflows/embeddings.py
deleted file mode 100644
index 3c5c65004214ef4b7a34047252cad022ba1781fc..0000000000000000000000000000000000000000
--- a/server/app/workflows/embeddings.py
+++ /dev/null
@@ -1,95 +0,0 @@
-from __future__ import annotations
-
-import asyncio
-from pathlib import Path
-from typing import Any, Dict, List
-
-from agents.shared.embeddings import embed_texts
-from common.embedding_store import EmbeddingStore, get_session_embedding_path
-from ..services.diagnostics_service import get_diagnostics_service
-
-EMBEDDING_BATCH_SIZE = 16
-EMBEDDING_SNIPPET_CHARS = 1500
-
-
-def index_standards_embeddings(session_id: str, standards_payload: List[Dict[str, Any]]) -> None:
- diagnostics = get_diagnostics_service()
- diagnostics.record_event(
- node_id="embeddings",
- event_type="embeddings.start",
- message="Building embeddings index",
- metadata={"session_id": session_id},
- )
- try:
- asyncio.run(_async_index(session_id, standards_payload))
- except Exception as exc:
- diagnostics.record_event(
- node_id="embeddings",
- event_type="embeddings.failed",
- message="Embedding indexing failed",
- metadata={"session_id": session_id, "error": str(exc)},
- )
- raise
- else:
- diagnostics.record_event(
- node_id="embeddings",
- event_type="embeddings.complete",
- message="Embeddings index saved",
- metadata={"session_id": session_id},
- )
-
-
-async def _async_index(session_id: str, standards_payload: List[Dict[str, Any]]) -> None:
- store = EmbeddingStore(get_session_embedding_path(session_id))
- store.clear()
-
- buffer_texts: List[str] = []
- buffer_meta: List[Dict[str, Any]] = []
-
- for entry in standards_payload:
- path = entry.get("path", "")
- chunks = entry.get("chunks", [])
- for chunk in chunks:
- text = chunk.get("text", "")
- if not text:
- continue
- snippet = _build_snippet(chunk, max_chars=EMBEDDING_SNIPPET_CHARS)
- buffer_texts.append(snippet)
- buffer_meta.append(
- {
- "path": path,
- "document": Path(path).name if path else "unknown",
- "heading": chunk.get("heading"),
- "clauses": chunk.get("clause_numbers", []),
- "page_number": chunk.get("page_number"),
- "chunk_index": chunk.get("chunk_index"),
- "snippet": text[:600],
- }
- )
- if len(buffer_texts) >= EMBEDDING_BATCH_SIZE:
- vectors = await embed_texts(buffer_texts)
- store.extend(vectors, buffer_meta)
- buffer_texts, buffer_meta = [], []
-
- if buffer_texts:
- vectors = await embed_texts(buffer_texts)
- store.extend(vectors, buffer_meta)
-
- store.save()
-
-
-def load_embedding_store(session_id: str) -> EmbeddingStore:
- return EmbeddingStore(get_session_embedding_path(session_id))
-
-
-def _build_snippet(chunk: Dict[str, Any], max_chars: int) -> str:
- heading = chunk.get("heading")
- clauses = chunk.get("clause_numbers") or []
- text = chunk.get("text", "")
- snippet = []
- if heading:
- snippet.append(str(heading))
- if clauses:
- snippet.append("Clauses: " + ", ".join(clauses[:10]))
- snippet.append(text[:max_chars])
- return "\n".join(snippet)
diff --git a/server/app/workflows/pipeline.py b/server/app/workflows/pipeline.py
deleted file mode 100644
index 8e906de0bd8010c3ff517f6d493f86b3f8446b76..0000000000000000000000000000000000000000
--- a/server/app/workflows/pipeline.py
+++ /dev/null
@@ -1,493 +0,0 @@
-from __future__ import annotations
-
-import asyncio
-import logging
-from datetime import datetime
-from enum import Enum
-from pathlib import Path
-from typing import Any, Awaitable, Callable, Iterable, Optional, Union
-
-from agents.extraction.agent import ExtractionAgent
-from agents.standards_mapping.agent import StandardsMappingAgent
-from agents.rewrite.agent import RewriteAgent
-from agents.validation.agent import ValidationAgent
-from agents.export.agent import ExportAgent
-from agents.shared.base import AgentContext
-from .embeddings import index_standards_embeddings
-from .serializers import serialize_docx_result, serialize_pdf_result
-from workers.docx_processing import parse_docx
-from workers.pdf_processing import parse_pdf
-
-from ..models import Session, SessionStatus
-from ..services.diagnostics_service import get_diagnostics_service
-from ..services.file_cache import FileCache
-from ..services.session_service import SessionService
-from ..utils.paths import (
- canonical_storage_path,
- resolve_storage_path,
- to_storage_relative,
-)
-
-logger = logging.getLogger(__name__)
-
-
-class PipelineStage(str, Enum):
- INGEST = "ingest"
- EXTRACT = "extract"
- MAP = "map"
- REWRITE = "rewrite"
- VALIDATE = "validate"
- EXPORT = "export"
-
-
-class PipelineStatus(str, Enum):
- PENDING = "pending"
- RUNNING = "running"
- COMPLETED = "completed"
- FAILED = "failed"
-
-
-StageHandler = Callable[[Session], Union[Awaitable[None], None]]
-
-
-class PipelineOrchestrator:
- """Pipeline orchestrator that runs each stage sequentially."""
-
- def __init__(self, session_service: SessionService) -> None:
- self._session_service = session_service
- self._stages: dict[PipelineStage, StageHandler] = {}
- self._diagnostics = get_diagnostics_service()
-
- def register_stage(self, stage: PipelineStage, handler: StageHandler) -> None:
- self._stages[stage] = handler
-
- def run(self, session: Session, stages: Optional[Iterable[PipelineStage]] = None) -> None:
- logger.info("Starting pipeline for session %s", session.id)
- self._diagnostics.record_event(
- node_id="pipeline",
- event_type="pipeline.start",
- message=f"Pipeline started for session `{session.name}`",
- metadata={"session_id": str(session.id)},
- )
- session.logs.append(f"[{datetime.utcnow().isoformat()}] Pipeline initiated.")
- self._session_service.save_session(session)
- session = self._session_service.update_status(str(session.id), SessionStatus.PROCESSING) or session
- selected_stages = list(stages or self._stages.keys())
- total_stages = len(selected_stages)
- if total_stages:
- self._session_service.store_stage_output(
- session,
- "pipeline_progress",
- {
- "total": total_stages,
- "current_index": 0,
- "stage": None,
- "status": PipelineStatus.PENDING.value,
- },
- )
- try:
- for index, stage in enumerate(selected_stages, start=1):
- handler = self._stages.get(stage)
- if not handler:
- logger.warning("No handler registered for stage %s", stage)
- continue
-
- logger.debug("Running stage %s for session %s", stage, session.id)
- session.logs.append(f"[{datetime.utcnow().isoformat()}] Stage `{stage.value}` started.")
- self._diagnostics.record_event(
- node_id="pipeline",
- event_type="stage.start",
- message=f"Stage `{stage.value}` started for session `{session.name}`",
- metadata={"session_id": str(session.id), "stage": stage.value},
- )
- if total_stages:
- self._session_service.store_stage_output(
- session,
- "pipeline_progress",
- {
- "total": total_stages,
- "current_index": index,
- "stage": stage.value,
- "status": PipelineStatus.RUNNING.value,
- },
- )
- self._session_service.save_session(session)
-
- result = handler(session)
- if asyncio.iscoroutine(result):
- asyncio.run(result)
-
- session.logs.append(f"[{datetime.utcnow().isoformat()}] Stage `{stage.value}` completed.")
- self._diagnostics.record_event(
- node_id="pipeline",
- event_type="stage.complete",
- message=f"Stage `{stage.value}` completed for session `{session.name}`",
- metadata={"session_id": str(session.id), "stage": stage.value},
- )
- if total_stages:
- self._session_service.store_stage_output(
- session,
- "pipeline_progress",
- {
- "total": total_stages,
- "current_index": index,
- "stage": stage.value,
- "status": PipelineStatus.COMPLETED.value,
- },
- )
- self._session_service.save_session(session)
-
- session.logs.append(f"[{datetime.utcnow().isoformat()}] Pipeline completed; awaiting review.")
- self._session_service.save_session(session)
- self._session_service.update_status(str(session.id), SessionStatus.REVIEW)
- self._diagnostics.record_event(
- node_id="pipeline",
- event_type="pipeline.complete",
- message=f"Pipeline completed for session `{session.name}`",
- metadata={"session_id": str(session.id)},
- )
- if total_stages:
- self._session_service.store_stage_output(
- session,
- "pipeline_progress",
- {
- "total": total_stages,
- "current_index": total_stages,
- "stage": None,
- "status": PipelineStatus.COMPLETED.value,
- },
- )
- logger.info("Pipeline complete for session %s", session.id)
- except Exception as exc: # noqa: BLE001
- logger.exception("Pipeline failed for session %s", session.id)
- session.logs.append(f"[{datetime.utcnow().isoformat()}] Pipeline failed: {exc}.")
- self._session_service.save_session(session)
- self._session_service.update_status(
- str(session.id),
- SessionStatus.FAILED,
- error=str(exc),
- )
- self._diagnostics.record_event(
- node_id="pipeline",
- event_type="pipeline.failed",
- message=f"Pipeline failed for session `{session.name}`",
- metadata={"session_id": str(session.id), "error": str(exc)},
- )
- if total_stages:
- self._session_service.store_stage_output(
- session,
- "pipeline_progress",
- {
- "total": total_stages,
- "current_index": min(len(selected_stages), total_stages),
- "stage": None,
- "status": PipelineStatus.FAILED.value,
- },
- )
-
-
-def register_default_stages(
- orchestrator: PipelineOrchestrator,
- *,
- file_cache: FileCache,
-) -> PipelineOrchestrator:
- """Register functional ingestion, extraction, mapping, rewrite, and export stages."""
-
- session_service = orchestrator._session_service # noqa: SLF001
- diagnostics = get_diagnostics_service()
-
- def _path_key_variants(raw_path: str) -> set[str]:
- variants = {raw_path}
- try:
- path_obj = Path(raw_path)
- variants.add(str(path_obj))
- variants.add(path_obj.as_posix())
- variants.add(canonical_storage_path(path_obj))
- except Exception: # noqa: BLE001
- pass
- return {variant for variant in variants if variant}
-
- def ingest_stage(session: Session) -> None:
- doc_path = resolve_storage_path(session.source_doc)
- doc_cache_key = file_cache.compute_key(doc_path, "doc-parse")
- cached_doc_payload = file_cache.load("doc-parse", doc_cache_key)
- if cached_doc_payload:
- doc_payload = dict(cached_doc_payload)
- doc_payload["path"] = to_storage_relative(doc_path)
- session.logs.append(
- f"[{datetime.utcnow().isoformat()}] Loaded cached report parse "
- f"({len(doc_payload.get('paragraphs', []))} paragraphs, "
- f"{len(doc_payload.get('tables', []))} tables)."
- )
- diagnostics.record_event(
- node_id="pipeline",
- event_type="doc.cache",
- message=f"Loaded cached report parse for session `{session.name}`",
- metadata={
- "session_id": str(session.id),
- "path": to_storage_relative(doc_path),
- },
- )
- else:
- doc_result = parse_docx(doc_path)
- doc_payload = serialize_docx_result(doc_result)
- file_cache.store("doc-parse", doc_cache_key, doc_payload)
- session.logs.append(
- f"[{datetime.utcnow().isoformat()}] Parsed report ({len(doc_payload['paragraphs'])} paragraphs, "
- f"{len(doc_payload['tables'])} tables)."
- )
- diagnostics.record_event(
- node_id="pipeline",
- event_type="doc.parse",
- message=f"Parsed report for session `{session.name}`",
- metadata={
- "session_id": str(session.id),
- "path": to_storage_relative(doc_path),
- },
- )
- session_service.store_stage_output(session, "doc_parse", doc_payload)
-
- standards_payload: list[dict[str, Any]] = []
- total_standards = len(session.standards_docs)
- cached_count = 0
- parsed_count = 0
-
- raw_preset_payloads = session.metadata.get("preset_standards_payload") or []
- preset_payload_map: dict[str, dict[str, Any]] = {}
- for entry in raw_preset_payloads:
- if not isinstance(entry, dict):
- continue
- path_str = entry.get("path")
- if not path_str:
- continue
- normalised_entry = dict(entry)
- normalised_entry["path"] = to_storage_relative(path_str)
- canonical_key = canonical_storage_path(path_str)
- preset_payload_map[canonical_key] = normalised_entry
- for key in _path_key_variants(path_str):
- preset_payload_map.setdefault(key, normalised_entry)
- if total_standards:
- session_service.store_stage_output(
- session,
- "standards_ingest_progress",
- {
- "total": total_standards,
- "processed": 0,
- "cached_count": 0,
- "parsed_count": 0,
- },
- )
-
- for index, path in enumerate(session.standards_docs, start=1):
- resolved_path = resolve_storage_path(path)
- path_str = to_storage_relative(resolved_path)
- payload: Optional[dict[str, Any]] = None
- canonical = canonical_storage_path(resolved_path)
- preset_payload = (
- preset_payload_map.get(canonical)
- or preset_payload_map.get(path_str)
- or preset_payload_map.get(Path(path_str).as_posix())
- )
- cache_key = file_cache.compute_key(
- resolved_path,
- "standards-parse",
- extra="max_chunk_chars=1200",
- )
-
- if preset_payload:
- payload = dict(preset_payload)
- payload["path"] = path_str
- cached_count += 1
- file_cache.store("standards-parse", cache_key, payload)
- diagnostics.record_event(
- node_id="presets",
- event_type="preset.cache",
- message=f"Used cached preset parse for `{Path(path_str).name}`",
- metadata={
- "session_id": str(session.id),
- "path": path_str,
- },
- )
- else:
- cached_payload = file_cache.load("standards-parse", cache_key)
- if cached_payload:
- payload = dict(cached_payload)
- payload["path"] = to_storage_relative(cached_payload.get("path", resolved_path))
- cached_count += 1
- diagnostics.record_event(
- node_id="pipeline",
- event_type="standards.cache",
- message=f"Loaded cached standards parse for `{Path(path_str).name}`",
- metadata={
- "session_id": str(session.id),
- "path": path_str,
- },
- )
- else:
- result = parse_pdf(resolved_path)
- payload = serialize_pdf_result(result)
- file_cache.store("standards-parse", cache_key, payload)
- parsed_count += 1
- diagnostics.record_event(
- node_id="pipeline",
- event_type="standards.parse",
- message=f"Parsed standards document `{Path(path_str).name}`",
- metadata={
- "session_id": str(session.id),
- "path": path_str,
- },
- )
- payload["path"] = path_str
- standards_payload.append(payload)
-
- if total_standards:
- session_service.store_stage_output(
- session,
- "standards_ingest_progress",
- {
- "total": total_standards,
- "processed": index,
- "current_file": path_str,
- "cached_count": cached_count,
- "parsed_count": parsed_count,
- },
- )
-
- session.logs.append(
- f"[{datetime.utcnow().isoformat()}] Ingested {len(standards_payload)} standards PDF(s) "
- f"(parsed {parsed_count}, cached {cached_count})."
- )
- diagnostics.record_event(
- node_id="pipeline",
- event_type="ingest.complete",
- message=f"Ingested {len(standards_payload)} standards for session `{session.name}`",
- metadata={
- "session_id": str(session.id),
- "parsed": parsed_count,
- "cached": cached_count,
- },
- )
- session_service.store_stage_output(session, "standards_parse", standards_payload)
- if total_standards:
- session_service.store_stage_output(
- session,
- "standards_ingest_progress",
- {
- "total": total_standards,
- "processed": total_standards,
- "cached_count": cached_count,
- "parsed_count": parsed_count,
- "completed": True,
- },
- )
- try:
- index_standards_embeddings(str(session.id), standards_payload)
- except Exception as exc: # noqa: BLE001
- logger.warning("Embedding indexing failed for session %s: %s", session.id, exc)
- session.logs.append(f"[{datetime.utcnow().isoformat()}] Embedding indexing failed: {exc}.")
-
-
- async def extraction_stage(session: Session) -> None:
- doc_payload = session.metadata.get("doc_parse", {}) or {}
- paragraphs = doc_payload.get("paragraphs", [])
- tables = doc_payload.get("tables", [])
- metadata = {
- "paragraph_count": len(paragraphs),
- "table_count": len(tables),
- }
-
- agent = ExtractionAgent()
- context = AgentContext(
- session_id=str(session.id),
- payload={
- "paragraphs": paragraphs,
- "tables": tables,
- "metadata": metadata,
- },
- )
- result = await agent.run(context)
- session_service.store_stage_output(session, "extraction_result", result)
-
- async def mapping_stage(session: Session) -> None:
- extraction_result = session.metadata.get("extraction_result") or {}
- standards_payload = session.metadata.get("standards_parse") or []
-
- standards_chunks: list[dict[str, Any]] = []
- for entry in standards_payload:
- chunks = entry.get("chunks", [])
- standards_chunks.extend(chunks)
-
- agent = StandardsMappingAgent()
- context = AgentContext(
- session_id=str(session.id),
- payload={
- "extraction_result": extraction_result,
- "standards_chunks": standards_chunks,
- "target_metadata": {
- "target_standard": session.destination_standard,
- "standards_count": len(standards_payload),
- },
- },
- )
- result = await agent.run(context)
- session_service.store_stage_output(session, "mapping_result", result)
-
- async def rewrite_stage(session: Session) -> None:
- mapping_result = session.metadata.get("mapping_result") or {}
- extraction_result = session.metadata.get("extraction_result") or {}
- doc_payload = session.metadata.get("doc_parse") or {}
-
- agent = RewriteAgent()
- context = AgentContext(
- session_id=str(session.id),
- payload={
- "mapping_result": mapping_result,
- "document_sections": extraction_result.get("sections", []),
- "document_paragraphs": doc_payload.get("paragraphs", []),
- "document_tables": doc_payload.get("tables", []),
- "target_voice": "Professional engineering tone",
- "constraints": [
- "Do not alter calculations or numeric values.",
- "Preserve paragraph numbering and formatting markers.",
- ],
- },
- )
- result = await agent.run(context)
- session_service.store_stage_output(session, "rewrite_plan", result)
-
- async def validate_stage(session: Session) -> None:
- # Placeholder validation using existing agent scaffold.
- agent = ValidationAgent()
- context = AgentContext(
- session_id=str(session.id),
- payload={
- "extraction_result": session.metadata.get("extraction_result"),
- "mapping_result": session.metadata.get("mapping_result"),
- "rewrite_plan": session.metadata.get("rewrite_plan"),
- },
- )
- result = await agent.run(context)
- session_service.store_stage_output(session, "validation_report", result)
-
- async def export_stage(session: Session) -> None:
- agent = ExportAgent()
- context = AgentContext(
- session_id=str(session.id),
- payload={
- "rewrite_plan": session.metadata.get("rewrite_plan"),
- "original_document": session.metadata.get("doc_parse"),
- "original_path": str(session.source_doc),
- },
- )
- result = await agent.run(context)
- session_service.store_stage_output(session, "export_manifest", result)
-
- orchestrator.register_stage(PipelineStage.INGEST, ingest_stage)
- orchestrator.register_stage(PipelineStage.EXTRACT, extraction_stage)
- orchestrator.register_stage(PipelineStage.MAP, mapping_stage)
- orchestrator.register_stage(PipelineStage.REWRITE, rewrite_stage)
- orchestrator.register_stage(PipelineStage.VALIDATE, validate_stage)
- orchestrator.register_stage(PipelineStage.EXPORT, export_stage)
- return orchestrator
-
-
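Wiring the orchestrator is a two-step affair: register a handler per `PipelineStage`, then `run` with an optional stage subset. A hypothetical sketch (assumes a `session_service`, `file_cache`, and `Session` already exist); sync handlers run inline, while coroutines are detected with `asyncio.iscoroutine` and executed via `asyncio.run`:

```python
orchestrator = PipelineOrchestrator(session_service)
register_default_stages(orchestrator, file_cache=file_cache)

def audit_stage(session: Session) -> None:
    # replaces the default validation handler for this run
    session.logs.append("custom audit complete")

orchestrator.register_stage(PipelineStage.VALIDATE, audit_stage)
orchestrator.run(session, stages=[PipelineStage.INGEST, PipelineStage.VALIDATE])
```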
diff --git a/server/app/workflows/serializers.py b/server/app/workflows/serializers.py
deleted file mode 100644
index 7be884f4d1cdfbc577cdad2400f23694210de4d0..0000000000000000000000000000000000000000
--- a/server/app/workflows/serializers.py
+++ /dev/null
@@ -1,83 +0,0 @@
-from __future__ import annotations
-
-from typing import Any, Dict
-
-from workers.docx_processing.parser import DocxParseResult
-from workers.pdf_processing.parser import PdfParseResult
-
-from ..utils.paths import to_storage_relative
-
-
-def serialize_docx_result(result: DocxParseResult) -> dict[str, Any]:
- return {
- "path": to_storage_relative(result.path),
- "paragraphs": [
- {
- "index": block.index,
- "text": block.text,
- "style": block.style,
- "heading_level": block.heading_level,
- "references": block.references,
- }
- for block in result.paragraphs
- ],
- "tables": [
- {
- "index": block.index,
- "rows": block.rows,
- "references": block.references,
- }
- for block in result.tables
- ],
- "summary": {
- "paragraph_count": len(result.paragraphs),
- "table_count": len(result.tables),
- "reference_count": len(
- {
- ref
- for block in result.paragraphs
- for ref in block.references
- }.union(
- {
- ref
- for block in result.tables
- for ref in block.references
- }
- )
- ),
- },
- }
-
-
-def serialize_pdf_result(result: PdfParseResult) -> dict[str, Any]:
- payload: Dict[str, Any] = {
- "path": to_storage_relative(result.path),
- "chunks": [
- {
- "page_number": chunk.page_number,
- "chunk_index": chunk.chunk_index,
- "text": chunk.text,
- "heading": chunk.heading,
- "clause_numbers": chunk.clause_numbers,
- "references": chunk.references,
- "is_ocr": chunk.is_ocr,
- }
- for chunk in result.chunks
- ],
- "summary": {
- "chunk_count": len(result.chunks),
- "reference_count": len(
- {
- ref
- for chunk in result.chunks
- for ref in chunk.references
- }
- ),
- },
- }
- if result.ocr_pages:
- payload["summary"]["ocr_pages"] = result.ocr_pages
- payload["summary"]["ocr_chunk_count"] = len(
- [chunk for chunk in result.chunks if chunk.is_ocr]
- )
- return payload
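Both serializers report `reference_count` as the size of the deduplicated union across all blocks, so a code cited in several places counts once. A worked example with made-up references:

```python
paragraph_refs = [["AS 3600", "AS 1170.0"], ["AS 3600"]]
table_refs = [["AS 1170.0", "NZS 3101"]]

unique = {r for refs in paragraph_refs for r in refs} | \
         {r for refs in table_refs for r in refs}
print(len(unique))  # 3, not 5: duplicates collapse in the set union
```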
diff --git a/server/requirements.txt b/server/requirements.txt
index 366b5a4633d074239cc0ae2f579c6feb4c1d00a0..1868f3c534668bb3db9f448867dc6fe41d1157cc 100644
--- a/server/requirements.txt
+++ b/server/requirements.txt
@@ -1,12 +1,2 @@
fastapi>=0.115.0,<1.0.0
uvicorn[standard]>=0.30.0,<0.32.0
-pydantic-settings>=2.4.0,<3.0.0
-python-multipart>=0.0.9,<0.1.0
-aiofiles>=24.0.0,<25.0.0
-httpx>=0.27.0,<0.28.0
-python-docx>=1.1.0,<1.2.0
-pdfplumber>=0.11.4,<0.12.0
-openai>=1.45.0,<2.0.0
-numpy>=1.26.0,<2.0.0
-pillow>=10.4.0,<11.0.0
-pytesseract>=0.3.10,<0.4.0
diff --git a/start-rightcodes.bat b/start-rightcodes.bat
deleted file mode 100644
index 41a7bd8913eab9a422b2cb1bb954d19521b7386d..0000000000000000000000000000000000000000
--- a/start-rightcodes.bat
+++ /dev/null
@@ -1,63 +0,0 @@
-@echo off
-setlocal EnableExtensions EnableDelayedExpansion
-
-set "SCRIPT_DIR=%~dp0"
-set "EXIT_CODE=1"
-set "PYTHON_EXE="
-set "NPM_EXE="
-
-for %%P in (python.exe py.exe python) do (
- where %%P >nul 2>&1
- if not errorlevel 1 (
- set "PYTHON_EXE=%%P"
- goto :found_python
- )
-)
-
-echo [ERROR] Python 3.8+ is required but was not found in PATH.
-echo Download and install it from https://www.python.org/downloads/ then reopen this window.
-goto :abort
-
-:found_python
-for %%N in (npm.cmd npm.exe npm) do (
- where %%N >nul 2>&1
- if not errorlevel 1 (
- set "NPM_EXE=%%N"
- goto :found_npm
- )
-)
-
-echo [ERROR] Node.js (npm) is required but was not found in PATH.
-echo Download it from https://nodejs.org/en/download/ (npm ships with Node.js) then reopen this window.
-goto :abort
-
-:found_npm
-"%PYTHON_EXE%" -c "import sys; sys.exit(0 if sys.version_info >= (3, 8) else 1)" >nul 2>&1
-if errorlevel 1 (
- for /f "tokens=1-3" %%A in ('"%PYTHON_EXE%" --version 2^>^&1') do set "PY_VER=%%A %%B %%C"
- echo [ERROR] Detected !PY_VER!. Python 3.8 or newer is required.
- goto :abort
-)
-
-echo [INFO] Using Python: %PYTHON_EXE%
-echo [INFO] Using npm: %NPM_EXE%
-echo [INFO] First launch downloads dependencies; ensure this machine has internet access.
-
-pushd "%SCRIPT_DIR%"
-"%PYTHON_EXE%" "%SCRIPT_DIR%start-rightcodes.py" %*
-set "EXIT_CODE=%ERRORLEVEL%"
-popd
-
-if "%EXIT_CODE%"=="0" (
- endlocal
- exit /b 0
-)
-
-echo.
-echo Launcher exited with error code %EXIT_CODE%.
-echo Review the console output above for details.
-
-:abort
-pause
-endlocal
-exit /b %EXIT_CODE%
diff --git a/start-rightcodes.html b/start-rightcodes.html
deleted file mode 100644
index 87c79d038a433d82c4f039762c0ec725ac4801d9..0000000000000000000000000000000000000000
--- a/start-rightcodes.html
+++ /dev/null
@@ -1,326 +0,0 @@
-<!-- RightCodes Launcher control panel (326-line page; markup omitted, visible text below) -->
-<!-- Page title: RightCodes Launcher -->
-<!-- Start Services: "Enter your OpenAI API key to launch both the backend (uvicorn) and the frontend (npm run dev) in dedicated consoles." -->
-<!-- Environment Status panel, initial text: "Loading status..." -->
-<!-- Setup Log panel, initial text: "Collecting setup log..." -->
diff --git a/start-rightcodes.py b/start-rightcodes.py
deleted file mode 100644
index 013de0f54993ba006318d82e1e80afa0f3d4b2a7..0000000000000000000000000000000000000000
--- a/start-rightcodes.py
+++ /dev/null
@@ -1,619 +0,0 @@
-#!/usr/bin/env python3
-"""RightCodes launcher with a lightweight HTML control panel."""
-
-from __future__ import annotations
-
-import argparse
-import atexit
-import datetime as dt
-import http.server
-import json
-import os
-import pathlib
-import shutil
-import subprocess
-import sys
-import signal
-import threading
-import time
-import urllib.parse
-import webbrowser
-from http import HTTPStatus
-from typing import Dict, List, Optional
-
-
-REPO_DIR = pathlib.Path(__file__).resolve().parent
-FRONTEND_DIR = REPO_DIR / "frontend"
-VENV_DIR = REPO_DIR / ".venv"
-BACKEND_REQUIREMENTS = REPO_DIR / "server" / "requirements.txt"
-LAUNCHER_HTML_PATH = REPO_DIR / "start-rightcodes.html"
-TIME_FORMAT = "%Y-%m-%d %H:%M:%S"
-
-
-def _is_windows() -> bool:
- return os.name == "nt"
-
-
-def _venv_python() -> pathlib.Path:
- if _is_windows():
- return VENV_DIR / "Scripts" / "python.exe"
- return VENV_DIR / "bin" / "python"
-
-
-def _activate_snippet() -> str:
- if _is_windows():
- return f'call "{VENV_DIR / "Scripts" / "activate.bat"}"'
- return f'source "{VENV_DIR / "bin" / "activate"}"'
-
-
-def _timestamp() -> str:
- return dt.datetime.now().strftime(TIME_FORMAT)
-
-
-class LauncherState:
- """Mutable state container shared between the launcher runtime and HTTP handler."""
-
- def __init__(self) -> None:
- self.lock = threading.Lock()
- self.backend_proc: Optional[subprocess.Popen] = None
- self.frontend_proc: Optional[subprocess.Popen] = None
- self.openai_key: Optional[str] = None
- self.messages: List[str] = []
- self.errors: List[str] = []
- self.prepared: bool = False
- self.ui_opened: bool = False
- self.dashboard_opened: bool = False
- self.python_path: Optional[str] = None
- self.npm_path: Optional[str] = None
-
- def add_message(self, message: str) -> None:
- entry = f"[{_timestamp()}] {message}"
- print(entry)
- with self.lock:
- self.messages.append(entry)
-
- def add_error(self, message: str) -> None:
- entry = f"[{_timestamp()}] ERROR: {message}"
- print(entry)
- with self.lock:
- self.errors.append(entry)
-
- def record_backend(self, proc: subprocess.Popen) -> None:
- with self.lock:
- self.backend_proc = proc
-
- def record_frontend(self, proc: subprocess.Popen) -> None:
- with self.lock:
- self.frontend_proc = proc
-
- def set_openai_key(self, value: Optional[str]) -> None:
- with self.lock:
- self.openai_key = value
-
- def set_prepared(self, value: bool) -> None:
- with self.lock:
- self.prepared = value
-
- def mark_ui_opened(self) -> bool:
- with self.lock:
- if self.ui_opened:
- return False
- self.ui_opened = True
- return True
-
- def mark_dashboard_opened(self) -> bool:
- with self.lock:
- if self.dashboard_opened:
- return False
- self.dashboard_opened = True
- return True
-
- def detach_processes(self) -> Dict[str, Optional[subprocess.Popen]]:
- with self.lock:
- backend = self.backend_proc
- frontend = self.frontend_proc
- self.backend_proc = None
- self.frontend_proc = None
- self.openai_key = None
- return {"backend": backend, "frontend": frontend}
-
- def get_status(self) -> Dict[str, object]:
- with self.lock:
- backend_info = _process_info(self.backend_proc)
- frontend_info = _process_info(self.frontend_proc)
- return {
- "prepared": self.prepared,
- "openai_key_set": self.openai_key is not None,
- "backend": backend_info,
- "frontend": frontend_info,
- "messages": list(self.messages),
- "errors": list(self.errors),
- }
-
-
-def _process_info(proc: Optional[subprocess.Popen]) -> Dict[str, object]:
- if proc is None:
- return {"running": False, "pid": None, "returncode": None}
- running = proc.poll() is None
- return {
- "running": running,
- "pid": proc.pid,
- "returncode": None if running else proc.returncode,
- }
-
-
-def run_command(
- cmd: List[str],
- *,
- cwd: Optional[str],
- state: LauncherState,
- description: Optional[str] = None,
-) -> None:
- state.add_message(description or f"Executing: {' '.join(cmd)}")
- try:
- subprocess.run(cmd, cwd=cwd, check=True)
- except subprocess.CalledProcessError as exc:
- state.add_error(f"Command failed (exit {exc.returncode}): {' '.join(exc.cmd)}")
- raise
-
-
-def ensure_environment(state: LauncherState) -> None:
- state.add_message("Validating prerequisite tools...")
- npm_path = shutil.which("npm")
- if npm_path is None:
- raise RuntimeError(
- "npm was not found in PATH. Install Node.js (includes npm) and ensure it is available."
- )
- state.add_message(f"npm detected at {npm_path}")
- state.npm_path = npm_path
-
- python_path = _venv_python()
- if not python_path.exists():
- state.add_message("Creating Python virtual environment (.venv)...")
- try:
- subprocess.run(
- [sys.executable, "-m", "venv", str(VENV_DIR)],
- cwd=str(REPO_DIR),
- check=True,
- )
- except subprocess.CalledProcessError as exc:
- raise RuntimeError(
- f"Virtual environment creation failed (exit {exc.returncode})."
- ) from exc
- python_path = _venv_python()
- else:
- state.add_message("Virtual environment already exists.")
-
- if not python_path.exists():
- raise RuntimeError("Virtual environment is missing expected python executable.")
-
- python_executable = str(python_path)
- state.python_path = python_executable
-
- try:
- run_command(
- [python_executable, "-m", "pip", "install", "--upgrade", "pip"],
- cwd=str(REPO_DIR),
- state=state,
- description="Upgrading pip inside the virtual environment...",
- )
- run_command(
- [python_executable, "-m", "pip", "install", "-r", str(BACKEND_REQUIREMENTS)],
- cwd=str(REPO_DIR),
- state=state,
- description="Installing backend Python dependencies...",
- )
- except subprocess.CalledProcessError as exc:
- raise RuntimeError("Failed to install backend dependencies.") from exc
-
- frontend_modules = FRONTEND_DIR / "node_modules"
- if not frontend_modules.exists():
- try:
- run_command(
- ["npm", "install"],
- cwd=str(FRONTEND_DIR),
- state=state,
- description="Installing frontend dependencies via npm install...",
- )
- except subprocess.CalledProcessError as exc:
- raise RuntimeError("Failed to install frontend dependencies via npm.") from exc
- else:
- state.add_message("Frontend dependencies already installed.")
-
- state.set_prepared(True)
- state.add_message("Environment preparation complete.")
-
-
-def launch_backend(state: LauncherState, key: str) -> subprocess.Popen:
- state.add_message("Starting backend server (uvicorn)...")
- env = os.environ.copy()
- env["RIGHTCODES_OPENAI_API_KEY"] = key
- env["RIGHTCODES_OPENAI_API_KEY_SOURCE"] = "launcher"
-
- python_executable = state.python_path or str(_venv_python())
- python_path = pathlib.Path(python_executable)
- if not python_path.exists():
- raise RuntimeError(
- "Python executable inside the virtual environment was not found. "
- "Re-run the launcher setup before starting services."
- )
- state.python_path = python_executable
- try:
- kwargs: Dict[str, object] = {"env": env, "cwd": str(REPO_DIR)}
- if _is_windows():
- kwargs["creationflags"] = (
- subprocess.CREATE_NEW_CONSOLE | subprocess.CREATE_NEW_PROCESS_GROUP # type: ignore[attr-defined]
- )
- else:
- kwargs["start_new_session"] = True
-
- proc = subprocess.Popen(
- [
- python_executable,
- "-m",
- "uvicorn",
- "server.app.main:app",
- "--reload",
- "--port",
- "8000",
- ],
- **kwargs,
- )
- except OSError as exc:
- raise RuntimeError(f"Failed to launch backend server: {exc}") from exc
-
- state.add_message(f"Backend console launched (PID {proc.pid}).")
- return proc
-
-
-def launch_frontend(state: LauncherState, key: str) -> subprocess.Popen:
- state.add_message("Starting frontend dev server (npm run dev)...")
- env = os.environ.copy()
- env["RIGHTCODES_OPENAI_API_KEY"] = key
- env["RIGHTCODES_OPENAI_API_KEY_SOURCE"] = "launcher"
-
- npm_path = state.npm_path or shutil.which("npm")
- if npm_path is None:
- raise RuntimeError("npm was not found; cannot launch frontend.")
-
- try:
- kwargs: Dict[str, object] = {"env": env, "cwd": str(FRONTEND_DIR)}
- if _is_windows():
- kwargs["creationflags"] = (
- subprocess.CREATE_NEW_CONSOLE | subprocess.CREATE_NEW_PROCESS_GROUP # type: ignore[attr-defined]
- )
- else:
- kwargs["start_new_session"] = True
-
- proc = subprocess.Popen([npm_path, "run", "dev"], **kwargs)
- except OSError as exc:
- raise RuntimeError(f"Failed to launch frontend dev server: {exc}") from exc
-
- state.add_message(f"Frontend console launched (PID {proc.pid}).")
- return proc
-
-
-def stop_process(proc: Optional[subprocess.Popen], name: str, state: LauncherState) -> bool:
- if not proc:
- return False
- if proc.poll() is not None:
- state.add_message(f"{name.capitalize()} already stopped (exit {proc.returncode}).")
- return False
- pid = proc.pid
- state.add_message(f"Stopping {name} (PID {pid})...")
-
- if _is_windows():
- try:
- result = subprocess.run(
- ["taskkill", "/PID", str(pid), "/T", "/F"],
- check=False,
- stdout=subprocess.DEVNULL,
- stderr=subprocess.DEVNULL,
- )
- if result.returncode not in (0, 128, 255):
- state.add_error(
- f"taskkill exited with code {result.returncode} while stopping {name} (PID {pid})."
- )
- except FileNotFoundError:
- state.add_error("taskkill command not found; falling back to direct termination.")
- except Exception as exc: # pragma: no cover - defensive
- state.add_error(f"taskkill failed for {name} (PID {pid}): {exc}")
-
- else:
- try:
- os.killpg(pid, signal.SIGTERM)
- except ProcessLookupError:
- pass
- except Exception as exc:
- state.add_error(f"Failed to send SIGTERM to {name} (PID {pid}): {exc}")
-
- if proc.poll() is None:
- try:
- proc.terminate()
- except Exception as exc: # pragma: no cover - defensive
- state.add_error(f"Failed to send terminate() to {name} (PID {pid}): {exc}")
-
- try:
- proc.wait(10)
- except subprocess.TimeoutExpired:
- if _is_windows():
- state.add_message(f"{name.capitalize()} unresponsive; forcing termination.")
- try:
- proc.kill()
- except Exception as exc: # pragma: no cover - defensive
- state.add_error(f"Failed to terminate {name} (PID {pid}) via kill(): {exc}")
- try:
- proc.wait(5)
- except subprocess.TimeoutExpired:
- state.add_error(f"{name.capitalize()} did not exit after kill(); giving up.")
- else:
- state.add_message(f"{name.capitalize()} unresponsive; sending SIGKILL.")
- try:
- os.killpg(pid, signal.SIGKILL)
- except ProcessLookupError:
- pass
- except Exception as exc: # pragma: no cover - defensive
- state.add_error(f"Failed to send SIGKILL to {name} (PID {pid}): {exc}")
- try:
- proc.wait(5)
- except subprocess.TimeoutExpired:
- state.add_error(f"{name.capitalize()} did not exit after SIGKILL; giving up.")
-
- state.add_message(f"{name.capitalize()} stopped.")
- return True
-
-
-def start_services(state: LauncherState, key: str) -> List[str]:
- cleaned_key = key.strip()
- if cleaned_key.startswith('"') and cleaned_key.endswith('"') and len(cleaned_key) > 1:
- cleaned_key = cleaned_key[1:-1]
- if not cleaned_key:
- raise ValueError("OpenAI API key is required.")
-
- state.set_openai_key(cleaned_key)
- messages: List[str] = []
-
- with state.lock:
- backend_proc = state.backend_proc
- frontend_proc = state.frontend_proc
- backend_running = backend_proc and backend_proc.poll() is None
- frontend_running = frontend_proc and frontend_proc.poll() is None
-
- if backend_running:
- msg = f"Backend already running (PID {backend_proc.pid})."
- state.add_message(msg)
- messages.append(msg)
- else:
- backend_proc = launch_backend(state, cleaned_key)
- state.record_backend(backend_proc)
- messages.append(f"Backend launching (PID {backend_proc.pid}).")
-
- if frontend_running:
- msg = f"Frontend already running (PID {frontend_proc.pid})."
- state.add_message(msg)
- messages.append(msg)
- else:
- frontend_proc = launch_frontend(state, cleaned_key)
- state.record_frontend(frontend_proc)
- messages.append(f"Frontend launching (PID {frontend_proc.pid}).")
-
- open_dashboard(state)
- return messages
-
-
-def stop_services(state: LauncherState) -> List[str]:
- processes = state.detach_processes()
- messages: List[str] = []
- if stop_process(processes["backend"], "backend", state):
- messages.append("Backend stopped.")
- if stop_process(processes["frontend"], "frontend", state):
- messages.append("Frontend stopped.")
- if not messages:
- messages.append("No services were running.")
- return messages
-
-
-def open_dashboard(state: LauncherState) -> None:
- if not state.mark_dashboard_opened():
- return
-
- def _open() -> None:
- time.sleep(2)
- webbrowser.open("http://localhost:5173")
-
- state.add_message("Opening RightCodes dashboard in browser (http://localhost:5173)...")
- threading.Thread(target=_open, daemon=True).start()
-
-
-def open_launcher_ui(host: str, port: int, state: LauncherState) -> None:
- if not state.mark_ui_opened():
- return
-
- url = f"http://{host}:{port}/"
- state.add_message(f"Opening launcher UI in browser ({url})...")
-
- def _open() -> None:
- time.sleep(1)
- webbrowser.open(url)
-
- threading.Thread(target=_open, daemon=True).start()
-
-
-def cleanup_processes(state: LauncherState) -> None:
- processes = state.detach_processes()
- for name in ("backend", "frontend"):
- proc = processes.get(name)
- if proc and proc.poll() is None:
- try:
- stop_process(proc, name, state)
- except Exception as exc: # pragma: no cover - best-effort cleanup
- state.add_error(f"Failed to stop {name} during cleanup: {exc}")
-
-
-def make_handler(state: LauncherState, html_content: str):
- class LauncherHandler(http.server.BaseHTTPRequestHandler):
- def log_message(self, format: str, *args) -> None: # noqa: D401 - silence default logging
- return
-
- def _send_html(self, content: str, status: HTTPStatus = HTTPStatus.OK) -> None:
- encoded = content.encode("utf-8")
- self.send_response(status)
- self.send_header("Content-Type", "text/html; charset=utf-8")
- self.send_header("Content-Length", str(len(encoded)))
- self.end_headers()
- self.wfile.write(encoded)
-
- def _send_json(self, payload: Dict[str, object], status: HTTPStatus = HTTPStatus.OK) -> None:
- encoded = json.dumps(payload).encode("utf-8")
- self.send_response(status)
- self.send_header("Content-Type", "application/json; charset=utf-8")
- self.send_header("Cache-Control", "no-store")
- self.send_header("Content-Length", str(len(encoded)))
- self.end_headers()
- self.wfile.write(encoded)
-
- def do_GET(self) -> None: # noqa: D401
- path = urllib.parse.urlsplit(self.path).path
- if path in ("/", "/index.html"):
- self._send_html(html_content)
- elif path == "/status":
- self._send_json(state.get_status())
- else:
- self.send_error(HTTPStatus.NOT_FOUND, "Not Found")
-
- def do_POST(self) -> None: # noqa: D401
- path = urllib.parse.urlsplit(self.path).path
- if path == "/start":
- self._handle_start()
- elif path == "/stop":
- self._handle_stop()
- elif path == "/shutdown":
- self._handle_shutdown()
- else:
- self.send_error(HTTPStatus.NOT_FOUND, "Not Found")
-
- def _read_body(self) -> str:
- length = int(self.headers.get("Content-Length", "0") or "0")
- return self.rfile.read(length).decode("utf-8")
-
- def _handle_start(self) -> None:
- if not state.prepared:
- self._send_json(
- {
- "ok": False,
- "error": "Environment setup did not complete successfully. Review the log above and restart the launcher once issues are resolved.",
- "status": state.get_status(),
- },
- status=HTTPStatus.SERVICE_UNAVAILABLE,
- )
- return
-
- body = self._read_body()
- content_type = self.headers.get("Content-Type", "")
- key: str
- if "application/json" in content_type:
- data = json.loads(body or "{}")
- key = str(data.get("openai_key", "") or "")
- else:
- data = urllib.parse.parse_qs(body)
- key = data.get("openai_key", [""])[0]
-
- try:
- messages = start_services(state, key)
- except ValueError as exc:
- self._send_json({"ok": False, "error": str(exc)}, status=HTTPStatus.BAD_REQUEST)
- return
- except RuntimeError as exc:
- state.add_error(str(exc))
- self._send_json({"ok": False, "error": str(exc)}, status=HTTPStatus.INTERNAL_SERVER_ERROR)
- return
- except Exception as exc: # pragma: no cover - unexpected failures
- state.add_error(f"Unexpected error while starting services: {exc}")
- self._send_json(
- {"ok": False, "error": "Unexpected error starting services. Check the console for details."},
- status=HTTPStatus.INTERNAL_SERVER_ERROR,
- )
- return
-
- self._send_json({"ok": True, "message": messages, "status": state.get_status()})
-
- def _handle_stop(self) -> None:
- messages = stop_services(state)
- self._send_json({"ok": True, "message": messages, "status": state.get_status()})
-
- def _handle_shutdown(self) -> None:
- messages = stop_services(state)
- shutdown_note = "Launcher shutting down. You can close this tab."
- state.add_message(shutdown_note)
- payload = {
- "ok": True,
- "message": [shutdown_note, *messages] if messages else [shutdown_note],
- "status": state.get_status(),
- }
- self._send_json(payload)
-
- def _shutdown() -> None:
- time.sleep(0.5)
- try:
- cleanup_processes(state)
- finally:
- try:
- self.server.shutdown()
- except Exception as exc: # pragma: no cover - best effort
- state.add_error(f"Failed to shut down launcher server: {exc}")
-
- threading.Thread(target=_shutdown, daemon=True).start()
-
- return LauncherHandler
-
-
-def parse_args() -> argparse.Namespace:
- parser = argparse.ArgumentParser(description="RightCodes launcher with HTML control panel.")
- parser.add_argument("--host", default="127.0.0.1", help="Host/interface for the launcher UI (default: 127.0.0.1).")
- parser.add_argument("--port", type=int, default=8765, help="Port for the launcher UI (default: 8765).")
- parser.add_argument(
- "--no-browser",
- action="store_true",
- help="Do not automatically open the launcher UI in the default browser.",
- )
- return parser.parse_args()
-
-
-def main() -> int:
- args = parse_args()
- if not LAUNCHER_HTML_PATH.exists():
- print(f"[ERROR] Launcher HTML file not found: {LAUNCHER_HTML_PATH}")
- return 1
-
- html_content = LAUNCHER_HTML_PATH.read_text(encoding="utf-8")
- state = LauncherState()
-
- try:
- ensure_environment(state)
- except Exception as exc:
- state.add_error(str(exc))
- state.add_message(
- "Environment preparation failed. Resolve the error(s) above and restart the launcher."
- )
-
- handler_class = make_handler(state, html_content)
- server = http.server.ThreadingHTTPServer((args.host, args.port), handler_class)
- state.add_message(f"Launcher UI available at http://{args.host}:{args.port}/")
-
- atexit.register(cleanup_processes, state)
- if not args.no_browser:
- open_launcher_ui(args.host, args.port, state)
-
- try:
- server.serve_forever()
- except KeyboardInterrupt:
- state.add_message("Launcher interrupted; shutting down.")
- finally:
- server.server_close()
- cleanup_processes(state)
-
- return 0
-
-
-if __name__ == "__main__":
- sys.exit(main())
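The launcher's process management reduces to one cross-platform pattern: spawn each service as its own console/process group, then tear down the whole tree (`taskkill /T /F` on Windows, `killpg` elsewhere). A condensed sketch of that pattern, independent of the launcher state machinery:

```python
import os
import signal
import subprocess

def spawn(cmd: list[str], **kwargs) -> subprocess.Popen:
    if os.name == "nt":
        kwargs["creationflags"] = (subprocess.CREATE_NEW_CONSOLE
                                   | subprocess.CREATE_NEW_PROCESS_GROUP)
    else:
        kwargs["start_new_session"] = True  # child leads its own process group
    return subprocess.Popen(cmd, **kwargs)

def stop(proc: subprocess.Popen) -> None:
    if os.name == "nt":
        # /T kills the child's whole tree, /F forces unresponsive processes
        subprocess.run(["taskkill", "/PID", str(proc.pid), "/T", "/F"], check=False)
    else:
        os.killpg(proc.pid, signal.SIGTERM)  # pgid == pid (start_new_session)
    proc.wait(timeout=10)
```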
diff --git a/temp_patch.txt b/temp_patch.txt
deleted file mode 100644
index 8b137891791fe96927ad78e64b0aad7bded08bdc..0000000000000000000000000000000000000000
--- a/temp_patch.txt
+++ /dev/null
@@ -1 +0,0 @@
-
diff --git a/tools/build_offline_package.py b/tools/build_offline_package.py
deleted file mode 100644
index 7fe0e417ad13505052e7631584c5a1154e352704..0000000000000000000000000000000000000000
--- a/tools/build_offline_package.py
+++ /dev/null
@@ -1,152 +0,0 @@
-#!/usr/bin/env python3
-"""Create an offline-ready bundle of the RightCodes workspace.
-
-The bundle includes:
- * The full repository contents (excluding transient caches like dist/ and .git/)
- * The populated Python virtual environment (.venv/)
- * The frontend node_modules/ directory
-
-Usage:
- python tools/build_offline_package.py [--output dist/rightcodes-offline.zip]
-
-Run this after the launcher has successfully installed dependencies so the
-virtualenv and node_modules folders exist. Distribute the resulting archive
-to machines that need an offline copy (same OS/architecture as the machine
-that produced the bundle).
-"""
-
-from __future__ import annotations
-
-import argparse
-import os
-import shutil
-import sys
-import tempfile
-from pathlib import Path
-from typing import Iterable
-from zipfile import ZIP_DEFLATED, ZipFile
-
-
-REPO_ROOT = Path(__file__).resolve().parents[1]
-DIST_DIR = REPO_ROOT / "dist"
-DEFAULT_OUTPUT = DIST_DIR / "rightcodes-offline.zip"
-
-MANDATORY_PATHS = [
- REPO_ROOT / ".venv",
- REPO_ROOT / "frontend" / "node_modules",
-]
-
-EXCLUDED_NAMES = {
- ".git",
- ".mypy_cache",
- "__pycache__",
- ".pytest_cache",
- "dist",
- ".idea",
- ".vscode",
-}
-
-EXCLUDED_SUFFIXES = {".pyc", ".pyo", ".log"}  # keep .pyd: compiled extension modules are needed offline
-
-
-def parse_args() -> argparse.Namespace:
- parser = argparse.ArgumentParser(description=__doc__)
- parser.add_argument(
- "--output",
- type=Path,
- default=DEFAULT_OUTPUT,
- help=f"Path to the archive to create (default: {DEFAULT_OUTPUT})",
- )
- parser.add_argument(
- "--force",
- action="store_true",
- help="Overwrite the output archive if it already exists.",
- )
- parser.add_argument(
- "--python-runtime",
- type=Path,
- help="Path to a portable Python runtime directory to embed (optional).",
- )
- parser.add_argument(
- "--node-runtime",
- type=Path,
- help="Path to a portable Node.js runtime directory to embed (optional).",
- )
- return parser.parse_args()
-
-
-def ensure_prerequisites(paths: Iterable[Path]) -> None:
- missing = [str(p) for p in paths if not p.exists()]
- if missing:
- joined = "\n - ".join(missing)
- raise SystemExit(
- "Cannot create offline bundle. Ensure the launcher has run once so the following paths exist:\n"
- f" - {joined}"
- )
-
-
-def main() -> int:
- args = parse_args()
-
- ensure_prerequisites(MANDATORY_PATHS)
-
- if args.python_runtime and not args.python_runtime.exists():
- raise SystemExit(f"Portable Python runtime not found: {args.python_runtime}")
- if args.node_runtime and not args.node_runtime.exists():
- raise SystemExit(f"Portable Node.js runtime not found: {args.node_runtime}")
-
- output_path = args.output
- output_path.parent.mkdir(parents=True, exist_ok=True)
-
- if output_path.exists():
- if args.force:
- output_path.unlink()
- else:
- raise SystemExit(f"Output archive already exists: {output_path} (use --force to overwrite).")
-
- print(f"[INFO] Creating offline bundle at {output_path}")
- with tempfile.TemporaryDirectory() as tmpdir:
- staging_root = Path(tmpdir) / REPO_ROOT.name
- shutil.copytree(REPO_ROOT, staging_root, dirs_exist_ok=True)
- for excluded in EXCLUDED_NAMES:
- for candidate in staging_root.rglob(excluded):
- # npm packages legitimately ship dist/ folders inside node_modules;
- # prune only the transient dist/ at the repo root
- if excluded == "dist" and candidate.parent != staging_root:
- continue
- if candidate.is_dir():
- shutil.rmtree(candidate, ignore_errors=True)
- else:
- candidate.unlink(missing_ok=True)
- for suffix in EXCLUDED_SUFFIXES:
- for candidate in staging_root.rglob(f"*{suffix}"):
- candidate.unlink(missing_ok=True)
-
- portable_root = staging_root / "portable"
- if args.python_runtime:
- destination = portable_root / "python"
- print(f"[INFO] Embedding portable Python runtime from {args.python_runtime} -> {destination}")
- destination.parent.mkdir(parents=True, exist_ok=True)
- if destination.exists():
- shutil.rmtree(destination, ignore_errors=True)
- shutil.copytree(args.python_runtime, destination)
- if args.node_runtime:
- destination = portable_root / "node"
- print(f"[INFO] Embedding portable Node.js runtime from {args.node_runtime} -> {destination}")
- destination.parent.mkdir(parents=True, exist_ok=True)
- if destination.exists():
- shutil.rmtree(destination, ignore_errors=True)
- shutil.copytree(args.node_runtime, destination)
-
- with ZipFile(output_path, mode="w", compression=ZIP_DEFLATED, compresslevel=9) as zip_file:
- for item in sorted(staging_root.rglob("*")):
- arcname = item.relative_to(staging_root).as_posix()
- if item.is_dir():
- zip_file.writestr(arcname + "/", "")
- continue
- zip_file.write(item, arcname=arcname)
-
- print("[INFO] Offline bundle created successfully.")
- print("[INFO] Distribute the archive, extract it on the target machine, then launch via start-rightcodes.ps1/.bat.")
- return 0
-
-
-if __name__ == "__main__":
- os.umask(0o022)
- sys.exit(main())
diff --git a/workers/README.md b/workers/README.md
deleted file mode 100644
index 2a95907c0ff6140909b8f1b64bc85d9238dace76..0000000000000000000000000000000000000000
--- a/workers/README.md
+++ /dev/null
@@ -1,12 +0,0 @@
-# Worker Pipelines
-
-Async and background jobs that ingest user artefacts, parse documents, and execute the staged processing pipelines without blocking the API.
-
-## Structure
-
-- `ingestion/` - Session bootstrap jobs: file validation, antivirus scanning, metadata capture.
-- `docx_processing/` - Word-to-markdown and JSON conversion tasks plus style map extraction (see `parse_docx` for paragraph/table summarisation).
-- `pdf_processing/` - OCR, text extraction, and structure inference for standards PDFs (see `parse_pdf` for page chunking with code detection and OCR supplements).
-- `pipelines/` - Composed job graphs representing the multi-agent workflow.
-- `queue/` - Task queue adapters (Celery, RQ, or custom) and worker entrypoints.
-- `tests/` - Worker-level unit tests and pipeline simulation suites.
diff --git a/workers/__init__.py b/workers/__init__.py
deleted file mode 100644
index 426e8155df14d11aa5255535b8d6f56a86a6144c..0000000000000000000000000000000000000000
--- a/workers/__init__.py
+++ /dev/null
@@ -1,3 +0,0 @@
-from .pipelines.bootstrap import bootstrap_pipeline_runner
-
-__all__ = ["bootstrap_pipeline_runner"]
diff --git a/workers/docx_processing/__init__.py b/workers/docx_processing/__init__.py
deleted file mode 100644
index fbc23d3dd35e1252b787d6e81de82621e3446919..0000000000000000000000000000000000000000
--- a/workers/docx_processing/__init__.py
+++ /dev/null
@@ -1,8 +0,0 @@
-from .parser import DocxParseResult, TableBlock, ParagraphBlock, parse_docx
-
-__all__ = [
- "DocxParseResult",
- "TableBlock",
- "ParagraphBlock",
- "parse_docx",
-]
diff --git a/workers/docx_processing/parser.py b/workers/docx_processing/parser.py
deleted file mode 100644
index 1232bfeb29601f4f770321876b9fab2fd7594620..0000000000000000000000000000000000000000
--- a/workers/docx_processing/parser.py
+++ /dev/null
@@ -1,104 +0,0 @@
-from __future__ import annotations
-
-import re
-from dataclasses import dataclass, field
-from pathlib import Path
-from typing import Iterable, List, Optional, Sequence
-
-from docx import Document
-from docx.table import _Cell as DocxCell # type: ignore[attr-defined]
-from docx.text.paragraph import Paragraph as DocxParagraph
-
-REFERENCE_PATTERN = re.compile(r"(?P[A-Z]{2,}[\s-]?\d{2,}(?:\.\d+)*)")
-HEADING_PATTERN = re.compile(r"Heading\s+(\d+)", re.IGNORECASE)
-
-
-@dataclass
-class ParagraphBlock:
- index: int
- text: str
- style: Optional[str] = None
- heading_level: Optional[int] = None
- references: list[str] = field(default_factory=list)
-
-
-@dataclass
-class TableBlock:
- index: int
- rows: list[list[str]]
- references: list[str] = field(default_factory=list)
-
-
-@dataclass
-class DocxParseResult:
- path: Path
- paragraphs: list[ParagraphBlock] = field(default_factory=list)
- tables: list[TableBlock] = field(default_factory=list)
-
-
-def parse_docx(path: Path) -> DocxParseResult:
- if not path.exists():
- raise FileNotFoundError(path)
-
- document = Document(path)
- result = DocxParseResult(path=path)
-
- for idx, paragraph in enumerate(_iter_paragraphs(document.paragraphs)):
- text = paragraph.text.strip()
- if not text:
- continue
- references = _extract_references(text)
- style_name = paragraph.style.name if paragraph.style else None
- heading_level = _resolve_heading_level(style_name)
- result.paragraphs.append(
- ParagraphBlock(
- index=idx,
- text=text,
- style=style_name,
- heading_level=heading_level,
- references=references,
- )
- )
-
- for idx, docx_table in enumerate(document.tables):
- rows: list[list[str]] = []
- references: list[str] = []
- for row in docx_table.rows:
- cells_text = [_normalise_text(cell) for cell in row.cells]
- rows.append(cells_text)
- for value in cells_text:
- references.extend(_extract_references(value))
- result.tables.append(
- TableBlock(
- index=idx,
- rows=rows,
- references=sorted(set(references)),
- )
- )
-
- return result
-
-
-def _resolve_heading_level(style_name: Optional[str]) -> Optional[int]:
- if not style_name:
- return None
- match = HEADING_PATTERN.search(style_name)
- if match:
- try:
- return int(match.group(1))
- except ValueError:
- return None
- return None
-
-
-def _iter_paragraphs(paragraphs: Sequence[DocxParagraph]) -> Iterable[DocxParagraph]:
-    yield from paragraphs
-
-
-def _normalise_text(cell: DocxCell) -> str:
-    value = cell.text.strip()
-    return re.sub(r"\s+", " ", value)
-
-
-def _extract_references(value: str) -> list[str]:
-    return sorted({match.group("code") for match in REFERENCE_PATTERN.finditer(value)})
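For reference, here is a minimal sketch of how the removed `parse_docx` helper was driven. The import assumes the deleted `workers.docx_processing` package is still on the path, and `report.docx` is a hypothetical input file:

```python
from pathlib import Path

from workers.docx_processing import parse_docx

result = parse_docx(Path("report.docx"))  # raises FileNotFoundError if absent
for block in result.paragraphs:
    # Each ParagraphBlock carries the style-derived heading level plus any
    # standards codes matched by REFERENCE_PATTERN (e.g. "ISO 9001.2").
    print(block.index, block.heading_level, block.references)
for table in result.tables:
    print(table.index, len(table.rows), table.references)
```
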
diff --git a/workers/pdf_processing/__init__.py b/workers/pdf_processing/__init__.py
deleted file mode 100644
index e916881832d43bcb9220d6f061e4cf6506e554d6..0000000000000000000000000000000000000000
--- a/workers/pdf_processing/__init__.py
+++ /dev/null
@@ -1,7 +0,0 @@
-from .parser import PdfParseResult, PdfPageChunk, parse_pdf
-
-__all__ = [
-    "PdfParseResult",
-    "PdfPageChunk",
-    "parse_pdf",
-]
diff --git a/workers/pdf_processing/parser.py b/workers/pdf_processing/parser.py
deleted file mode 100644
index 70922588add7d3a8734fac37975f5c9ce4deca5c..0000000000000000000000000000000000000000
--- a/workers/pdf_processing/parser.py
+++ /dev/null
@@ -1,175 +0,0 @@
-from __future__ import annotations
-
-import difflib
-import logging
-import re
-from dataclasses import dataclass, field
-from pathlib import Path
-from typing import Iterable, List, Optional
-
-import pdfplumber
-
-try:
-    import pytesseract
-except ImportError:  # pragma: no cover - optional dependency
-    pytesseract = None
-
-logger = logging.getLogger(__name__)
-
-OCR_AVAILABLE = pytesseract is not None
-
-REFERENCE_PATTERN = re.compile(
-    r"(?P<code>[A-Z]{2,}[\s-]?\d{2,}(?:\.\d+)*)"
-)
-CLAUSE_PATTERN = re.compile(r"\b\d+(?:\.\d+){1,3}\b")
-
-
-@dataclass
-class PdfPageChunk:
-    page_number: int
-    chunk_index: int
-    text: str
-    heading: Optional[str] = None
-    clause_numbers: list[str] = field(default_factory=list)
-    references: list[str] = field(default_factory=list)
-    is_ocr: bool = False
-
-
-@dataclass
-class PdfParseResult:
-    path: Path
-    chunks: list[PdfPageChunk] = field(default_factory=list)
-    ocr_pages: list[int] = field(default_factory=list)
-
-
-def parse_pdf(path: Path, *, max_chunk_chars: int = 1200) -> PdfParseResult:
-    if not path.exists():
-        raise FileNotFoundError(path)
-
-    result = PdfParseResult(path=path)
-
-    with pdfplumber.open(path) as pdf:
-        for page_number, page in enumerate(pdf.pages, start=1):
-            base_text = page.extract_text() or ""
-            ocr_text = ""
-            ocr_additions: list[str] = []
-
-            if OCR_AVAILABLE:
-                try:
-                    image = page.to_image(resolution=300).original
-                    ocr_text = pytesseract.image_to_string(image) or ""
-                    ocr_additions = _extract_ocr_additions(base_text, ocr_text)
-                except Exception as exc:  # noqa: BLE001
-                    logger.warning("OCR failed for %s page %s: %s", path, page_number, exc)
-                    ocr_text = ocr_text or ""
-                    ocr_additions = []
-
-            chunk_counter = 0
-            base_text_stripped = base_text.strip()
-            if base_text_stripped:
-                for chunk in _chunk_text(base_text_stripped, max_chunk_chars):
-                    references = _extract_references(chunk)
-                    heading = _detect_heading(chunk)
-                    clause_numbers = sorted({match.group(0) for match in CLAUSE_PATTERN.finditer(chunk)})
-                    result.chunks.append(
-                        PdfPageChunk(
-                            page_number=page_number,
-                            chunk_index=chunk_counter,
-                            text=chunk,
-                            heading=heading,
-                            clause_numbers=clause_numbers,
-                            references=references,
-                            is_ocr=False,
-                        )
-                    )
-                    chunk_counter += 1
-
-            additions_text = "\n".join(ocr_additions).strip()
-            if additions_text:
-                if page_number not in result.ocr_pages:
-                    result.ocr_pages.append(page_number)
-                for chunk in _chunk_text(additions_text, max_chunk_chars):
-                    references = _extract_references(chunk)
-                    heading = _detect_heading(chunk)
-                    clause_numbers = sorted({match.group(0) for match in CLAUSE_PATTERN.finditer(chunk)})
-                    result.chunks.append(
-                        PdfPageChunk(
-                            page_number=page_number,
-                            chunk_index=chunk_counter,
-                            text=chunk,
-                            heading=heading,
-                            clause_numbers=clause_numbers,
-                            references=references,
-                            is_ocr=True,
-                        )
-                    )
-                    chunk_counter += 1
-
-            if not base_text_stripped and not additions_text:
-                logger.debug("Skipping empty page %s in %s", page_number, path)
-
-    return result
-
-
-def _chunk_text(text: str, max_chunk_chars: int) -> Iterable[str]:
-    cleaned = re.sub(r"\s+", " ", text).strip()
-    if len(cleaned) <= max_chunk_chars:
-        yield cleaned
-        return
-
-    sentences = re.split(r"(?<=[\.\?!])\s+", cleaned)
-    buffer: List[str] = []
-    buffer_len = 0
-    for sentence in sentences:
-        sentence_len = len(sentence)
-        if buffer_len + sentence_len + 1 > max_chunk_chars and buffer:
-            yield " ".join(buffer).strip()
-            buffer = [sentence]
-            buffer_len = sentence_len
-        else:
-            buffer.append(sentence)
-            buffer_len += sentence_len + 1
-    if buffer:
-        yield " ".join(buffer).strip()
-
-
-def _extract_references(value: str) -> list[str]:
-    return sorted({match.group("code") for match in REFERENCE_PATTERN.finditer(value)})
-
-
-def _detect_heading(text: str) -> Optional[str]:
-    lines = text.splitlines()
-    for line in lines:
-        stripped = line.strip()
-        if not stripped:
-            continue
-        if stripped.isupper() or stripped.endswith(":") or _looks_like_clause(stripped):
-            return stripped[:200]
-        if len(stripped.split()) <= 8 and stripped == stripped.title():
-            return stripped[:200]
-        break
-    return None
-
-
-def _looks_like_clause(line: str) -> bool:
-    return bool(re.match(r"^\d+(\.\d+)*", line))
-
-
-def _extract_ocr_additions(base_text: str, ocr_text: str) -> list[str]:
- """Return OCR lines not already present in the base extraction."""
- if not ocr_text.strip():
- return []
- ocr_lines = _normalise_lines(ocr_text)
- if not base_text.strip():
- return ocr_lines
- base_lines = _normalise_lines(base_text)
- matcher = difflib.SequenceMatcher(a=base_lines, b=ocr_lines)
- additions: list[str] = []
- for tag, _i1, _i2, j1, j2 in matcher.get_opcodes():
- if tag in ("replace", "insert"):
- additions.extend(ocr_lines[j1:j2])
- return additions
-
-
-def _normalise_lines(text: str) -> list[str]:
-    return [line.strip() for line in text.splitlines() if line.strip()]
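Likewise, a minimal sketch of driving the removed `parse_pdf` entry point; the file name and chunk size are illustrative, and OCR chunks appear only when `pytesseract` is installed:

```python
from pathlib import Path

from workers.pdf_processing import parse_pdf

result = parse_pdf(Path("standard.pdf"), max_chunk_chars=800)
for chunk in result.chunks:
    # OCR chunks hold only the lines that pdfplumber's text layer missed.
    tag = "ocr" if chunk.is_ocr else "text"
    print(f"p{chunk.page_number}/{chunk.chunk_index} [{tag}]", chunk.clause_numbers)
print("pages that needed OCR:", result.ocr_pages)
```
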
diff --git a/workers/pipelines/bootstrap.py b/workers/pipelines/bootstrap.py
deleted file mode 100644
index d8973f06f8aaedd3d38e6aeb34eb939158eae063..0000000000000000000000000000000000000000
--- a/workers/pipelines/bootstrap.py
+++ /dev/null
@@ -1,39 +0,0 @@
-from __future__ import annotations
-
-import asyncio
-from typing import Awaitable, Callable
-
-from agents import (
-    ExportAgent,
-    ExtractionAgent,
-    OrchestratorAgent,
-    RewriteAgent,
-    StandardsMappingAgent,
-    ValidationAgent,
-)
-from agents.shared.base import AgentContext
-
-AgentFactory = Callable[[], Awaitable[dict]]
-
-
-async def bootstrap_pipeline_runner(session_id: str) -> None:
-    """Demonstrate how the agent chain could be invoked asynchronously."""
-    context = AgentContext(session_id=session_id, payload={})
-    orchestrator = OrchestratorAgent()
-    extraction = ExtractionAgent()
-    standards = StandardsMappingAgent()
-    rewrite = RewriteAgent()
-    validation = ValidationAgent()
-    export = ExportAgent()
-
-    await orchestrator.emit_debug("Bootstrapping placeholder pipeline.")
-
-    stage_outputs = []
-    stage_outputs.append(await orchestrator.run(context))
-    stage_outputs.append(await extraction.run(context))
-    stage_outputs.append(await standards.run(context))
-    stage_outputs.append(await rewrite.run(context))
-    stage_outputs.append(await validation.run(context))
-    stage_outputs.append(await export.run(context))
-
-    await orchestrator.emit_debug(f"Pipeline completed with {len(stage_outputs)} stage outputs.")
diff --git a/workers/queue/runner.py b/workers/queue/runner.py
deleted file mode 100644
index de33fcba1f45c3d1e8dc42226d67fb30dbf439af..0000000000000000000000000000000000000000
--- a/workers/queue/runner.py
+++ /dev/null
@@ -1,20 +0,0 @@
-import asyncio
-from typing import Callable
-
-from .types import TaskHandler
-
-
-class InMemoryQueue:
-    """Simple FIFO queue that simulates background processing."""
-
-    def __init__(self) -> None:
-        self._handlers: dict[str, TaskHandler] = {}
-
-    def register(self, name: str, handler: TaskHandler) -> None:
-        self._handlers[name] = handler
-
-    async def enqueue(self, name: str, **payload) -> None:
-        handler = self._handlers.get(name)
-        if not handler:
-            raise ValueError(f"Handler not registered for task `{name}`")
-        await handler(**payload)
diff --git a/workers/queue/types.py b/workers/queue/types.py
deleted file mode 100644
index e8c9e4137ab95581ed61654ae6438a795b6ac293..0000000000000000000000000000000000000000
--- a/workers/queue/types.py
+++ /dev/null
@@ -1,3 +0,0 @@
-from typing import Awaitable, Callable
-
-TaskHandler = Callable[..., Awaitable[None]]
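Finally, a sketch of how `InMemoryQueue` and `TaskHandler` fit together; the task name and handler body are hypothetical:

```python
import asyncio

from workers.queue.runner import InMemoryQueue


async def parse_document(*, path: str) -> None:
    print(f"parsing {path}")


async def main() -> None:
    queue = InMemoryQueue()
    queue.register("parse_document", parse_document)  # any async callable matching TaskHandler
    # enqueue() awaits the handler inline, so background processing is
    # simulated rather than real; unregistered names raise ValueError.
    await queue.enqueue("parse_document", path="report.docx")


asyncio.run(main())
```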