AIencoder committed on
Commit
12ba16b
·
verified ·
1 Parent(s): 5bba5ed

Upload folder using huggingface_hub

README.md CHANGED
@@ -1,12 +1,86 @@
  ---
- title: PROJECT CHIMERA
- emoji: 👁
- colorFrom: blue
- colorTo: purple
- sdk: gradio
- sdk_version: 6.3.0
  app_file: app.py
- pinned: false
  ---

- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference

  ---
+ title: PROJECT-CHIMERA
  app_file: app.py
+ sdk: gradio
+ sdk_version: 5.42.0
  ---
+ # Project Chimera: A Functional, Multi-Persona AI Agent
+
+ ## 1. Project Objective
+
+ Project Chimera has evolved from a conceptual framework into a functional, multi-persona AI agent. It uses a modular architecture to delegate tasks to specialized "personas" powered by the Google Gemini API. The result is a system that can provide expert-level analysis across different domains by adopting the correct context for each query.
+
+ ---
+
+ ## 2. Core Architecture: The Functional AI Framework
+
+ ### A. The Causal Reasoning Core (CRC)
+ The heart of Chimera. The CRC analyzes user prompts and uses a **Capability-Based Routing** system to delegate each task to the most appropriate AI persona.
+
+ ### B. Specialized Cognitive Modules (SCMs)
+ These are no longer placeholders. Each SCM is a functional "AI Persona" that crafts a highly specialized, role-playing prompt to send to the Gemini API. This ensures the response is not generic but tailored to the specific domain. Current functional modules include:
+ - **Abstract Symbology Module (ASM):** Acts as an expert code analyst and mathematician.
+ - **Sensory Fusion Engine (SFE):** Acts as a data scientist, analyzing and finding patterns in data descriptions.
+ - **Creative Synthesis Module (CSM):** Acts as a creative director and author, generating novel ideas and stories.
+
+ ### C. The Metacognitive Layer
+ The "overseer" layer monitors the system's operations, logging the delegation process and system status.
+
+ ---
+
+ ## 3. Setup and Configuration
+
+ ### 1. Prerequisites
+ - Python 3.x
+ - A Google Gemini API key. Get one from [Google AI Studio](https://aistudio.google.com/app/apikey).
+
+ ### 2. Install Dependencies
+ Open your terminal and run:
+ ```bash
+ pip install google-generativeai
+ ```
+
+ ### 3. Configure Your API Key
+ 1. Open the `config.py` file.
+ 2. Find the line `API_KEY = "YOUR_API_KEY_HERE"`.
+ 3. Replace `YOUR_API_KEY_HERE` with your actual, secret API key.
+
+ **IMPORTANT:** Do NOT commit your real API key to a public GitHub repository.
+
+ ### 4. How to Run
+ Run the main entry point from your terminal:
+
+ ```bash
+ python main.py
+ ```
+
+ The script will initialize the system and run a series of demonstration tasks, printing the live, AI-generated responses to your console.
+
+ ***
+
+ ### `config.py`
+
+ ```python
+ # config.py
+ # System-wide configuration settings for Project Chimera.
+
+ # --- Gemini API Configuration ---
+ # IMPORTANT: Replace "YOUR_API_KEY_HERE" with your actual Google Gemini API key.
+ # This key should be kept secret and should not be committed to public version control.
+ API_KEY = "YOUR_API_KEY_HERE"
+
+ # --- System Configuration ---
+
+ # Set the operational mode for the AI:
+ # 'development' - Enables verbose logging and debug information.
+ # 'production'  - Optimized for performance with minimal console output.
+ OPERATION_MODE = 'development'
+
+ # Configuration for the Metacognitive Layer
+ METACOGNITIVE_CONFIG = {
+     "monitoring_interval_seconds": 10,
+ }
+
+ # Configuration for the Gemini Model
+ GEMINI_MODEL_CONFIG = {
+     "temperature": 0.4,
+     "top_p": 1,
+     "top_k": 32,
+     "max_output_tokens": 4096,
+ }
+ ```
app.py ADDED
@@ -0,0 +1,70 @@
+ import gradio as gr
+ from src.chimera_core import Chimera
+
+ # Initialize the core; defer startup errors so the UI can still load and report them.
+ try:
+     chimera = Chimera()
+ except Exception as e:
+     chimera = None
+     print(f"Startup Error: {e}")
+
+ def chat_logic(message, history, mode_selection):
+     if not chimera:
+         # With gr.Chatbot(type="messages"), history entries are role/content dicts.
+         return history + [{"role": "assistant", "content": "❌ Error: API Key missing. Check Settings -> Secrets."}], ""
+
+     # Map friendly names to internal codes
+     role_map = {
+         "Auto (Router)": "Auto",
+         "👨‍💻 ASM (Coder)": "ASM",
+         "🔬 SFE (Scientist)": "SFE",
+         "🎨 CSM (Writer)": "CSM"
+     }
+     selected_role = role_map.get(mode_selection, "Auto")
+
+     # Get response
+     response_text, active_module = chimera.process_request(message, history, selected_role)
+
+     # Format the output with the active module tag
+     final_response = f"**[{active_module} Active]**\n\n{response_text}"
+
+     # Append the turn in the messages format the Chatbot component expects.
+     history.append({"role": "user", "content": message})
+     history.append({"role": "assistant", "content": final_response})
+     return history, ""
+
+ # --- Sci-Fi Theme CSS ---
+ custom_css = """
+ body {background-color: #0b0f19; color: #c9d1d9;}
+ .gradio-container {font-family: 'IBM Plex Mono', monospace;}
+ header {display: none !important;}
+ #chatbot {
+     height: 600px;
+     border: 1px solid #30363d;
+     background-color: #0d1117;
+     border-radius: 12px;
+ }
+ .feedback {font-size: 12px; color: #8b949e;}
+ """
+
+ with gr.Blocks(css=custom_css, title="Project Chimera") as demo:
+     gr.Markdown("# 🦁 Project Chimera")
+     gr.Markdown("*> Multi-Persona Routing System // Online*")
+
+     with gr.Row():
+         with gr.Column(scale=4):
+             chatbot = gr.Chatbot(elem_id="chatbot", type="messages")
+
+         with gr.Column(scale=1):
+             gr.Markdown("### ⚙️ System Controls")
+             mode = gr.Dropdown(
+                 choices=["Auto (Router)", "👨‍💻 ASM (Coder)", "🔬 SFE (Scientist)", "🎨 CSM (Writer)"],
+                 value="Auto (Router)",
+                 label="Persona Mode",
+                 interactive=True
+             )
+             gr.Markdown("*Select 'Auto' to let the AI decide, or force a specific module.*")
+
+     msg = gr.Textbox(placeholder="Enter command or query...", autofocus=True, label="User Input")
+
+     msg.submit(chat_logic, [msg, chatbot, mode], [chatbot, msg])
+
+ if __name__ == "__main__":
+     demo.launch()
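Because `chatbot` is created with `type="messages"`, the history that `chat_logic` receives and returns must be a list of `{"role": ..., "content": ...}` dicts rather than `(user, bot)` tuples. A minimal, Gradio-free sketch of that contract (the `append_turn` helper is illustrative, not part of the repo):

```python
# Sketch of the message-format history that gr.Chatbot(type="messages") expects.
# No Gradio import needed: the contract is just a list of role/content dicts.

def append_turn(history, user_message, bot_response, active_module):
    """Appends one user/assistant exchange in Gradio 'messages' format."""
    final_response = f"**[{active_module} Active]**\n\n{bot_response}"
    history.append({"role": "user", "content": user_message})
    history.append({"role": "assistant", "content": final_response})
    return history

history = []
history = append_turn(history, "Review my code", "Looks fine.", "ASM")
print(history[0]["role"], history[1]["role"])  # user assistant
```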
requirements.txt ADDED
@@ -0,0 +1,2 @@
+ google-genai
+ gradio
src/__pycache__/chimera_core.cpython-313.pyc ADDED
Binary file (5.46 kB). View file
 
src/__pycache__/chimera_core.cpython-314.pyc ADDED
Binary file (4.37 kB). View file
 
src/chimera_core.py ADDED
@@ -0,0 +1,80 @@
+ from google import genai
+ import os
+
+ class Chimera:
+     def __init__(self):
+         # On Hugging Face, use os.getenv to read the secret key safely.
+         self.api_key = os.getenv("GEMINI_API_KEY")
+
+         # Fallback for local testing if the env var is missing.
+         if not self.api_key:
+             try:
+                 import config
+                 self.api_key = config.API_KEY
+             except ImportError:
+                 pass
+
+         if not self.api_key:
+             raise ValueError("❌ CRITICAL: API Key missing! Set GEMINI_API_KEY in Secrets.")
+
+         self.client = genai.Client(api_key=self.api_key)
+         self.model_name = "gemini-2.5-flash"
+         print(f"🦁 Chimera Core: ONLINE [{self.model_name}]")
+
+     def _route_task(self, prompt):
+         """The Router: classifies the task automatically."""
+         routing_prompt = f"""
+         Classify this user task into one of these roles:
+         [ASM] - Abstract Symbology Module (Code, Math)
+         [SFE] - Sensory Fusion Engine (Data, Science)
+         [CSM] - Creative Synthesis Module (Story, Art)
+         [CHAT] - General conversation.
+
+         User Task: "{prompt}"
+         Reply ONLY with the tag (e.g., [ASM]).
+         """
+         try:
+             response = self.client.models.generate_content(
+                 model=self.model_name,
+                 contents=routing_prompt
+             )
+             tag = response.text.strip().replace("[", "").replace("]", "")
+             return tag
+         except Exception:
+             # Fall back to general chat if routing fails.
+             return "CHAT"
+
+     def process_request(self, user_message, history, manual_role="Auto"):
+         """Main processor with manual persona override."""
+         # 1. Determine the role (auto or manual).
+         if manual_role and manual_role != "Auto":
+             role = manual_role  # User forced this role
+             print(f"👉 Manual Override: [{role}]")
+         else:
+             role = self._route_task(user_message)
+             print(f"👉 Auto-Routing to: [{role}]")
+
+         # 2. Assign the persona.
+         if role == "ASM":
+             system_instruction = "You are the ASM (Abstract Symbology Module). You are an expert Software Architect and Mathematician. Your responses must be code-centric, precise, and optimized."
+         elif role == "SFE":
+             system_instruction = "You are the SFE (Sensory Fusion Engine). You are a Lead Data Scientist. Analyze this request using logic, empirical data, and patterns."
+         elif role == "CSM":
+             system_instruction = "You are the CSM (Creative Synthesis Module). You are a Visionary Author. Respond with creativity, rich imagery, and narrative flair."
+         else:
+             system_instruction = "You are Project Chimera, an advanced multi-persona AI system."
+
+         # 3. Generate.
+         full_prompt = f"System Instruction: {system_instruction}\n\nUser Message: {user_message}"
+
+         try:
+             response = self.client.models.generate_content(
+                 model=self.model_name,
+                 contents=full_prompt
+             )
+             return response.text, role
+         except Exception as e:
+             return f"❌ System Error: {str(e)}", "ERR"
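The override-then-route flow in `process_request` can be exercised without any API call by isolating the role selection. In this sketch, `route_stub` is a keyword stand-in for the Gemini-based `_route_task` classifier; everything here is illustrative:

```python
# Stand-alone sketch of Chimera's role selection: a manual override wins,
# otherwise a router (stubbed here with keywords) classifies the task.

PERSONAS = {
    "ASM": "expert Software Architect and Mathematician",
    "SFE": "Lead Data Scientist",
    "CSM": "Visionary Author",
}

def route_stub(prompt):
    """Keyword stand-in for the Gemini-based _route_task classifier."""
    text = prompt.lower()
    if any(k in text for k in ("code", "function", "math")):
        return "ASM"
    if any(k in text for k in ("data", "dataset", "correlation")):
        return "SFE"
    if any(k in text for k in ("story", "poem", "design")):
        return "CSM"
    return "CHAT"

def select_role(user_message, manual_role="Auto"):
    """Mirrors step 1 of process_request: manual override beats auto-routing."""
    if manual_role and manual_role != "Auto":
        return manual_role  # user forced this persona
    return route_stub(user_message)

print(select_role("Fix this function", "Auto"))  # ASM
print(select_role("Fix this function", "CSM"))   # CSM (manual override)
```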
src/modules/__init__.py ADDED
@@ -0,0 +1,8 @@
+ # src/modules/__init__.py
+ # This file makes the 'modules' directory a Python package and allows for easy imports.
+
+ from .base_module import SpecializedCognitiveModule
+ from .sfe import SensoryFusionEngine
+ from .asm import AbstractSymbologyModule
+ from .pse import PredictiveSimulationEngine
+ from .csm import CreativeSynthesisModule
src/modules/__pycache__/__init__.cpython-313.pyc ADDED
Binary file (511 Bytes). View file
 
src/modules/__pycache__/__init__.cpython-314.pyc ADDED
Binary file (528 Bytes). View file
 
src/modules/__pycache__/asm.cpython-313.pyc ADDED
Binary file (2.57 kB). View file
 
src/modules/__pycache__/base_module.cpython-313.pyc ADDED
Binary file (2.14 kB). View file
 
src/modules/__pycache__/base_module.cpython-314.pyc ADDED
Binary file (2.21 kB). View file
 
src/modules/__pycache__/csm.cpython-313.pyc ADDED
Binary file (2.35 kB). View file
 
src/modules/__pycache__/pse.cpython-313.pyc ADDED
Binary file (1.84 kB). View file
 
src/modules/__pycache__/sfe.cpython-313.pyc ADDED
Binary file (2.7 kB). View file
 
src/modules/asm.py ADDED
@@ -0,0 +1,40 @@
+ # src/modules/asm.py
+ # Functional implementation of the Abstract Symbology Module.
+
+ from .base_module import SpecializedCognitiveModule
+
+ class AbstractSymbologyModule(SpecializedCognitiveModule):
+     """
+     SCM specializing in formal logic, mathematics, and programming languages.
+     It acts as an expert code analyst.
+     """
+     def __init__(self):
+         super().__init__("ASM", "Abstract Symbology Module (Code, Math, Logic)")
+
+     def get_capabilities(self):
+         """Reports this module's skills to the CRC."""
+         return {'code', 'python', 'function', 'analyze', 'vulnerabilities', 'optimizations', 'math', 'logic'}
+
+     def construct_prompt(self, user_query):
+         """Constructs a prompt that tells Gemini to act as a code analyst."""
+         # Extract the core request from the user's query.
+         core_request = user_query.replace("Please analyze the following Python code for bugs and suggest improvements:", "").strip()
+
+         prompt = f"""
+ **Persona:** You are an elite software engineer and cybersecurity analyst. Your task is to perform a rigorous code review.
+
+ **Context:** You have been given the following code snippet to analyze.
+
+ **Code to Analyze:**
+ ```python
+ {core_request}
+ ```
+
+ **Instructions:**
+ 1. Identify any potential bugs, errors, or edge cases (e.g., infinite recursion, type errors, logical flaws).
+ 2. Identify any potential security vulnerabilities (e.g., injection attacks, unsafe operations).
+ 3. Suggest improvements for performance, readability, and adherence to best practices.
+ 4. Provide a corrected or refactored version of the code if necessary.
+ 5. Format your response clearly with sections for Bugs, Vulnerabilities, and Suggestions.
+ """
+         return prompt
src/modules/base_module.py ADDED
@@ -0,0 +1,35 @@
+ # src/modules/base_module.py
+ # Defines the abstract base class for all Specialized Cognitive Modules (SCMs).
+
+ from abc import ABC, abstractmethod
+ from src.utils.logger import log
+ from src.utils.gemini_client import gemini_client  # Import the shared client
+
+ class SpecializedCognitiveModule(ABC):
+     """
+     Abstract base class for all expert modules. Requires capabilities and
+     an execution method that calls the Gemini API.
+     """
+     def __init__(self, name, description):
+         self.name = name
+         self.description = description
+         log("SCM_Loader", f"Loaded: {self.name} ({self.description})")
+
+     @abstractmethod
+     def get_capabilities(self):
+         """Returns a set of keywords representing the module's skills."""
+         pass
+
+     @abstractmethod
+     def construct_prompt(self, user_query):
+         """Constructs a detailed, role-playing prompt for the Gemini API."""
+         pass
+
+     def execute(self, user_query):
+         """
+         Constructs the prompt and sends it to the Gemini client for processing.
+         """
+         specialized_prompt = self.construct_prompt(user_query)
+         log(self.name, "Executing task via Gemini API...")
+         response = gemini_client.generate(specialized_prompt)
+         return response
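The SCM contract above (subclasses supply capabilities and a prompt; the base class handles execution) can be demonstrated end-to-end with a stub in place of `gemini_client`. Everything in this sketch is illustrative and not part of the repo:

```python
from abc import ABC, abstractmethod

# Minimal stand-in for the shared gemini_client used by execute().
class StubClient:
    def generate(self, prompt):
        return f"(model output for {len(prompt)}-char prompt)"

stub_client = StubClient()

class SCM(ABC):
    """Condensed version of SpecializedCognitiveModule for illustration."""
    def __init__(self, name):
        self.name = name

    @abstractmethod
    def get_capabilities(self):
        ...

    @abstractmethod
    def construct_prompt(self, user_query):
        ...

    def execute(self, user_query):
        # Template method: subclasses shape the prompt, the base class sends it.
        return stub_client.generate(self.construct_prompt(user_query))

class EchoModule(SCM):
    def get_capabilities(self):
        return {"echo"}

    def construct_prompt(self, user_query):
        return f"**Persona:** echo bot.\nUser: {user_query}"

mod = EchoModule("ECHO")
print(mod.execute("hi"))
```

The same template-method shape is what lets the CRC treat every module uniformly: it only ever calls `execute`.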
src/modules/csm.py ADDED
@@ -0,0 +1,36 @@
+ # src/modules/csm.py
+ # Functional implementation of the Creative Synthesis Module.
+
+ from .base_module import SpecializedCognitiveModule
+
+ class CreativeSynthesisModule(SpecializedCognitiveModule):
+     """
+     SCM specializing in generating novel, creative, and artistic concepts.
+     It acts as a creative writer and concept artist.
+     """
+     def __init__(self):
+         super().__init__("CSM", "Creative Synthesis Module")
+
+     def get_capabilities(self):
+         """Reports this module's skills to the CRC."""
+         return {'creative', 'generate', 'concepts', 'ideas', 'art', 'design', 'story', 'visualization'}
+
+     def construct_prompt(self, user_query):
+         """Constructs a prompt that tells Gemini to act as a creative writer."""
+         core_request = user_query.replace("Generate a short story concept about", "").strip()
+
+         prompt = f"""
+ **Persona:** You are a visionary science fiction author and world-builder.
+
+ **Context:** You have been tasked with creating a compelling, original story concept based on a core idea.
+
+ **Core Idea:** "{core_request}"
+
+ **Instructions:**
+ 1. Generate a unique title for the story.
+ 2. Write a compelling logline (a one- or two-sentence summary).
+ 3. Outline the main character(s), including their primary motivation and conflict.
+ 4. Describe the central plot, including an inciting incident, rising action, and a potential climax.
+ 5. Suggest a unique theme or question that the story explores.
+ """
+         return prompt
src/modules/pse.py ADDED
@@ -0,0 +1,25 @@
+ # src/modules/pse.py
+ # Placeholder implementation of the Predictive Simulation Engine.
+
+ from .base_module import SpecializedCognitiveModule
+ from src.utils.logger import log
+ import time
+
+ class PredictiveSimulationEngine(SpecializedCognitiveModule):
+     """
+     SCM specializing in running high-fidelity, predictive simulations.
+     """
+     def __init__(self):
+         super().__init__("PSE", "Predictive Simulation Engine")
+
+     def get_capabilities(self):
+         """Reports this module's skills to the CRC."""
+         return {'simulation', 'forecast', 'predict', 'model', 'what-if'}
+
+     def construct_prompt(self, user_query):
+         """Satisfies the SCM abstract interface; PSE does not call the API yet."""
+         return user_query
+
+     def execute(self, sub_task, data=None):
+         """Executes simulation tasks (currently a canned placeholder)."""
+         log(self.name, "Initializing predictive model and running simulation...")
+         time.sleep(2.5)  # Simulate a complex simulation run
+         result = "Simulation complete. Forecast indicates a 15% market shift with 85% confidence."
+         log(self.name, f"Sub-task finished. Result: '{result}'")
+         return result
src/modules/sfe.py ADDED
@@ -0,0 +1,36 @@
+ # src/modules/sfe.py
+ # Functional implementation of the Sensory Fusion Engine.
+
+ from .base_module import SpecializedCognitiveModule
+
+ class SensoryFusionEngine(SpecializedCognitiveModule):
+     """
+     SCM specializing in processing and finding correlations in data descriptions.
+     It acts as a data scientist.
+     """
+     def __init__(self):
+         super().__init__("SFE", "Sensory Fusion Engine (Data Ingestion & Correlation)")
+
+     def get_capabilities(self):
+         """Reports this module's skills to the CRC."""
+         return {'data', 'dataset', 'process', 'correlations', 'sales', 'traffic', 'marketing', 'find'}
+
+     def construct_prompt(self, user_query):
+         """Constructs a prompt that tells Gemini to act as a data scientist."""
+         core_request = user_query.replace("Process the following data description and identify potential causal links:", "").strip()
+
+         prompt = f"""
+ **Persona:** You are a senior data scientist with expertise in business intelligence and statistical analysis.
+
+ **Context:** You have been given a high-level summary of a dataset and a key business problem. You do not have the raw data, so you must reason based on the description provided.
+
+ **Data Description:**
+ "{core_request}"
+
+ **Instructions:**
+ 1. Formulate three distinct, plausible hypotheses that could explain the described situation (e.g., the sales dip).
+ 2. For each hypothesis, explain the potential causal link between the variables (e.g., "Hypothesis 1: The sales dip could be caused by a reduction in marketing spend, leading to lower website traffic and thus fewer conversions.").
+ 3. Suggest what specific data points or charts you would need to see from the raw data to prove or disprove each of your hypotheses.
+ 4. Provide a concluding summary of the most likely cause.
+ """
+         return prompt
src/simulation_enviroment ADDED
@@ -0,0 +1,70 @@
+ # src/simulation_environment.py
+ # A conceptual framework for the "Digital Crucible" simulation environment.
+ # Project Chimera would be trained within a far more advanced version of this.
+
+ from src.utils.logger import log
+
+ class PhysicsEngine:
+     """
+     A simplified physics engine for the simulation (a unit time step is assumed).
+     """
+     def __init__(self):
+         self.gravity = -9.81  # m/s^2
+
+     def apply_force(self, obj, force):
+         """Applies a force to an object."""
+         # F = ma -> a = F/m
+         acceleration = force / obj.mass
+         obj.velocity += acceleration
+         log("PhysicsEngine", f"Applied force to '{obj.name}'. New velocity: {obj.velocity:.2f} m/s")
+
+     def update_position(self, obj):
+         """Updates an object's position based on its velocity."""
+         obj.position += obj.velocity
+         log("PhysicsEngine", f"Updated position for '{obj.name}'. Current position: {obj.position:.2f} m")
+
+ class SimulatedObject:
+     """Represents a generic object within the simulation."""
+     def __init__(self, name, mass, initial_position=0, initial_velocity=0):
+         self.name = name
+         self.mass = mass
+         self.position = initial_position
+         self.velocity = initial_velocity
+         log("SimulatedObject", f"Created '{name}' with mass {mass}kg.")
+
+ class Environment:
+     """
+     The main simulation environment class. Chimera would interact with this
+     world to learn cause and effect.
+     """
+     def __init__(self):
+         log("DigitalCrucible", "--- Environment Initializing ---", header=True)
+         self.physics = PhysicsEngine()
+         self.objects = []
+         self.time_step = 0
+         log("DigitalCrucible", "--- Environment Ready ---")
+
+     def add_object(self, obj):
+         """Adds an object to the simulation."""
+         self.objects.append(obj)
+
+     def run_step(self, chimera_action=None):
+         """Runs a single step of the simulation."""
+         self.time_step += 1
+         log("DigitalCrucible", f"--- Simulation Step {self.time_step} ---", header=True)
+
+         if chimera_action:
+             target_obj_name = chimera_action.get("target")
+             force_to_apply = chimera_action.get("force")
+             target = next((obj for obj in self.objects if obj.name == target_obj_name), None)
+
+             if target and force_to_apply:
+                 log("ChimeraAction", f"AI attempts to apply {force_to_apply}N of force to '{target.name}'.")
+                 self.physics.apply_force(target, force_to_apply)
+
+         # Update all objects under gravity
+         for obj in self.objects:
+             gravity_force = self.physics.gravity * obj.mass
+             self.physics.apply_force(obj, gravity_force)
+             self.physics.update_position(obj)
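The engine above folds the time step into the unit update (velocity and position each change once per `run_step`). A self-contained sketch of one gravity-only step, with an explicit `dt` written out for clarity (the `dt` parameter is an illustrative addition, not in the repo code):

```python
# One explicit-Euler step for a falling object, mirroring the PhysicsEngine
# update but with the time step dt made explicit.

GRAVITY = -9.81  # m/s^2

def step(position, velocity, mass, dt=1.0, extra_force=0.0):
    """Advances one step: force -> acceleration -> velocity -> position."""
    force = GRAVITY * mass + extra_force
    acceleration = force / mass  # a = F/m
    velocity += acceleration * dt
    position += velocity * dt
    return position, velocity

pos, vel = 100.0, 0.0
for _ in range(3):
    pos, vel = step(pos, vel, mass=2.0)
print(round(pos, 2), round(vel, 2))  # 41.14 -29.43
```

With `dt=1.0` this reproduces the `run_step` arithmetic exactly; a smaller `dt` would make the integration more accurate.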
src/utils/__init__.py ADDED
@@ -0,0 +1,2 @@
+ # src/utils/__init__.py
+ # This file makes the 'utils' directory a Python package.
src/utils/__pycache__/__init__.cpython-313.pyc ADDED
Binary file (235 Bytes). View file
 
src/utils/__pycache__/__init__.cpython-314.pyc ADDED
Binary file (252 Bytes). View file
 
src/utils/__pycache__/gemini_client.cpython-313.pyc ADDED
Binary file (2.84 kB). View file
 
src/utils/__pycache__/gemini_client.cpython-314.pyc ADDED
Binary file (2.94 kB). View file
 
src/utils/__pycache__/logger.cpython-313.pyc ADDED
Binary file (1.21 kB). View file
 
src/utils/__pycache__/logger.cpython-314.pyc ADDED
Binary file (1.24 kB). View file
 
src/utils/gemini_client.py ADDED
@@ -0,0 +1,51 @@
+ # src/utils/gemini_client.py
+ # A centralized client for interacting with the Google Gemini API.
+
+ import google.generativeai as genai
+ import config
+ from .logger import log
+
+ class GeminiClient:
+     """Handles all communication with the Gemini API."""
+
+     def __init__(self):
+         log("GeminiClient", "Initializing Gemini client...")
+         try:
+             genai.configure(api_key=config.API_KEY)
+             self.model = genai.GenerativeModel('gemini-1.5-pro')
+             self.generation_config = config.GEMINI_MODEL_CONFIG
+             log("GeminiClient", "Gemini client configured successfully.")
+         except Exception as e:
+             log("GeminiClient", f"FATAL: Failed to configure Gemini API. Check your API key. Error: {e}", level="ERROR")
+             self.model = None
+
+     def generate(self, specialized_prompt):
+         """
+         Sends a prompt to the Gemini API and returns the response.
+
+         Args:
+             specialized_prompt (str): A detailed, role-playing prompt crafted by an SCM.
+
+         Returns:
+             str: The text response from the Gemini API, or an error message.
+         """
+         if not self.model:
+             return "Error: Gemini client is not initialized. Please check your API key."
+
+         try:
+             log("GeminiClient", "Sending request to Gemini API...")
+             response = self.model.generate_content(
+                 specialized_prompt,
+                 generation_config=self.generation_config
+             )
+             log("GeminiClient", "Response received successfully.")
+             return response.text
+         except Exception as e:
+             log("GeminiClient", f"API call failed. Error: {e}", level="ERROR")
+             # Check for specific quota errors
+             if "quota" in str(e).lower():
+                 return "API Error: You have exceeded your usage quota. Please check your plan and billing details."
+             return f"API Error: {e}"
+
+ # Create a single, shared instance of the client to be used by all modules
+ gemini_client = GeminiClient()
src/utils/logger.py ADDED
@@ -0,0 +1,20 @@
+ # src/utils/logger.py
+ # A simple, centralized logging utility for Project Chimera.
+
+ import datetime
+ import config
+
+ def log(source, message, level="INFO", header=False):
+     """
+     A centralized logging function.
+     - source: The component of the system generating the log (e.g., "CRC", "SFE").
+     - message: The log message.
+     - level: The severity of the message (e.g., "INFO", "WARN", "ERROR").
+     - header: If True, prints a decorative header for major events.
+     """
+     if config.OPERATION_MODE == 'development' or level != "INFO":
+         timestamp = datetime.datetime.now().strftime("%H:%M:%S")
+         if header:
+             print(f"\n{'='*20} {message} {'='*20}")
+         else:
+             print(f"[{timestamp} | {source:<20} | {level:<5}] {message}")
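The non-header log line format can be checked in isolation. The snippet below reproduces the same f-string without importing `config` (the `format_log_line` helper is illustrative):

```python
import datetime

def format_log_line(source, message, level="INFO"):
    """Reproduces the non-header log line format from src/utils/logger.py."""
    timestamp = datetime.datetime.now().strftime("%H:%M:%S")
    # {source:<20} and {level:<5} left-pad the columns so log lines align.
    return f"[{timestamp} | {source:<20} | {level:<5}] {message}"

line = format_log_line("CRC", "Routing task...")
print(line)  # e.g. [12:00:00 | CRC                  | INFO ] Routing task...
```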