Aravindhan11 committed
Commit 8d9eecc · verified · 1 Parent(s): f9bc79b

Upload 11 files
Files changed (11)
  1. .gitignore +6 -0
  2. Dockerfile +10 -10
  3. README.md +53 -19
  4. builtin_commands.py +170 -0
  5. colors.py +17 -0
  6. display.py +38 -0
  7. docker-compose.yml +13 -0
  8. external_commands.py +14 -0
  9. gemini_client.py +65 -0
  10. llm_file_generator.py +202 -0
  11. main.py +35 -0
.gitignore ADDED
@@ -0,0 +1,6 @@
+ __pycache__/
+ *.pyc
+ *.pyo
+ *.pyd
+ *.swp
+ .env
Dockerfile CHANGED
@@ -1,20 +1,20 @@
- FROM python:3.13.5-slim
+ FROM python:3.11-slim

  WORKDIR /app

+ COPY . /app
+
  RUN apt-get update && apt-get install -y \
- build-essential \
- curl \
  git \
+ curl \
  && rm -rf /var/lib/apt/lists/*

- COPY requirements.txt ./
- COPY src/ ./src/
-
- RUN pip3 install -r requirements.txt
-
- EXPOSE 8501
-
- HEALTHCHECK CMD curl --fail http://localhost:8501/_stcore/health
-
- ENTRYPOINT ["streamlit", "run", "src/streamlit_app.py", "--server.port=8501", "--server.address=0.0.0.0"]
+
+ RUN pip install --no-cache-dir \
+ grpcio==1.60.1 \
+ google-generativeai \
+ python-dotenv
+
+ ENV PYTHONUNBUFFERED=1
+
+ CMD ["python", "main.py"]
README.md CHANGED
@@ -1,19 +1,53 @@
- ---
- title: AI Code Terminal
- emoji: 🚀
- colorFrom: red
- colorTo: red
- sdk: docker
- app_port: 8501
- tags:
- - streamlit
- pinned: false
- short_description: Streamlit template space
- ---
-
- # Welcome to Streamlit!
-
- Edit `/src/streamlit_app.py` to customize this app to your heart's desire. :heart:
-
- If you have any questions, checkout our [documentation](https://docs.streamlit.io) and [community
- forums](https://discuss.streamlit.io).
+ AI Coding Terminal
+ A custom AI-powered coding shell that builds, edits, and manages projects directly from natural-language commands. Designed for developers who want to interact with code through conversational prompts, integrating LLMs, Docker, and advanced terminal features.
+
+ Features
+ Custom Shell & Cursor: Fully interactive terminal with a custom cursor and intelligent command handling.
+ AI-Powered Code Generation: Integrates the Gemini API to interpret natural language and generate or edit Python projects.
+ Project Management: Build, edit, and organize projects from a single terminal interface.
+ Docker Integration: Run isolated environments for safe, reproducible code execution.
+ Extensible & Modular: Easily add new commands, AI models, or workflows.
+ Cross-Platform: Works on Linux, macOS, and Windows environments that support Python and Docker.
+
+ Tech Stack
+ Languages: Python
+ APIs: Gemini LLM API
+ Containerization: Docker
+ Terminal Control: Custom shell implementation with advanced cursor management
+ Other: Optional extensions for code linting, testing, and deployment
+
+ Installation
+ Clone the repository:
+ git clone https://github.com/Aravindh-dev12/Codic--AI-Coding-Terminal.git
+ cd ai-coding-terminal
+ Install dependencies:
+ pip install -r requirements.txt
+ Configure your Gemini API key:
+ export GEMINI_API_KEY="your_api_key_here"
+ Run the Docker setup (optional, for isolated environments):
+ docker-compose up
+
+ Usage
+ Start the AI coding terminal:
+ python main.py
+ Example commands:
+ create project my_app → Creates a new Python project
+ add function for user authentication → Adds Python functions based on natural language
+ refactor code in utils.py → Refactors existing files with AI assistance
builtin_commands.py ADDED
@@ -0,0 +1,170 @@
+ import os
+ import getpass
+ import socket
+ from datetime import datetime
+ import re
+ from colors import *
+ from display import clear_and_setup, show_ascii_art
+ from gemini_client import ask_gemini, ask_gemini_with_file_generation
+
+ def my_ls():
+     try:
+         files = os.listdir('.')
+         print(f"{CYAN}📁 Files and Directories:{RESET}")
+         for item in sorted(files):
+             if os.path.isdir(item):
+                 print(f"  {BLUE}{BOLD}📁 {item}/{RESET}")
+             else:
+                 print(f"  {GREEN}📄 {item}{RESET}")
+     except PermissionError:
+         print(f"{RED}❌ Permission denied{RESET}")
+
+ def my_pwd():
+     print(f"{YELLOW}📍 Current directory: {BOLD}{os.getcwd()}{RESET}")
+
+ def my_date():
+     now = datetime.now()
+     print(f"{MAGENTA}🕒 {now.strftime('%A, %B %d, %Y - %I:%M:%S %p')}{RESET}")
+
+ def my_whoami():
+     username = getpass.getuser()
+     hostname = socket.gethostname()
+     print(f"{CYAN}👤 You are: {BOLD}{username}{RESET} on Acidop shell")
+
+ def my_help():
+     print(f"{BOLD}{CYAN}🛠️ codic Built-in Commands:{RESET}")
+     print(f"{GREEN}  ls        {RESET}- List files and directories (custom)")
+     print(f"{GREEN}  pwd       {RESET}- Show current directory (custom)")
+     print(f"{GREEN}  cd <dir>  {RESET}- Change directory")
+     print(f"{GREEN}  date      {RESET}- Show current date and time (custom)")
+     print(f"{GREEN}  whoami    {RESET}- Show current user info (custom)")
+     print(f"{GREEN}  clear     {RESET}- Clear the screen")
+     print(f"{GREEN}  help      {RESET}- Show this help message")
+     print(f"{GREEN}  ascii     {RESET}- Show cool ASCII art")
+     print(f"{GREEN}  exit      {RESET}- Exit the shell")
+     print(f"{GREEN}  ai        {RESET}- Enter AI mode with Gemini")
+     print(f"{GREEN}  ai gen    {RESET}- Enter AI mode with automatic file generation")
+     print(f"{YELLOW}  <command> {RESET}- Run any system command")
+
+ def handle_builtin_commands(command_parts):
+     cmd = command_parts[0].lower()
+
+     if cmd == "ls":
+         my_ls()
+         return True
+     elif cmd == "pwd":
+         my_pwd()
+         return True
+     elif cmd == "cd":
+         try:
+             if len(command_parts) == 1:
+                 os.chdir(os.path.expanduser("~"))
+             else:
+                 os.chdir(command_parts[1])
+             print(f"{GREEN}✅ Changed to: {os.getcwd()}{RESET}")
+         except FileNotFoundError:
+             print(f"{RED}❌ Directory not found: {command_parts[1]}{RESET}")
+         except PermissionError:
+             print(f"{RED}❌ Permission denied{RESET}")
+         return True
+     elif cmd == "date":
+         my_date()
+         return True
+     elif cmd == "whoami":
+         my_whoami()
+         return True
+     elif cmd == "clear":
+         clear_and_setup()
+         return True
+     elif cmd == "help":
+         my_help()
+         return True
+     elif cmd == "ascii":
+         show_ascii_art()
+         return True
+     elif cmd == "ai":
+         if len(command_parts) > 1 and command_parts[1].lower() == "gen":
+             print(f"{YELLOW}🤖 Entering AI Gen Mode (with file generation).{RESET}")
+             print(f"{CYAN}💡 Tip: Ask me to 'create a Flask app' or 'build a React component'{RESET}")
+             print(f"{GREEN}📍 Current directory: {os.getcwd()}{RESET}")
+             print(f"{YELLOW}Type 'exit' to leave.{RESET}\n")
+
+             while True:
+                 query = input(f"{MAGENTA}ai-gen> {RESET}").strip()
+                 if query.lower() in ["exit", "quit"]:
+                     print(f"{YELLOW}👋 Leaving AI Gen mode.{RESET}")
+                     break
+                 if not query:
+                     continue
+
+                 print(f"{YELLOW}🤖 Thinking and generating...{RESET}\n")
+
+                 print(f"{CYAN}📁 Where should I create the files?{RESET}")
+                 print(f"{YELLOW}  Examples:{RESET}")
+                 print(f"{YELLOW}  - Press Enter (current directory){RESET}")
+                 print(f"{YELLOW}  - my_project (creates folder in current dir){RESET}")
+                 print(f"{YELLOW}  - ~/Desktop/my_app (absolute path){RESET}")
+                 print(f"{YELLOW}  - ../parent_folder/project (relative path){RESET}")
+                 project_dir = input(f"{CYAN}  Path: {RESET}").strip()
+
+                 if not project_dir:
+                     project_dir = "."
+                     print(f"{GREEN}  ✓ Using current directory: {os.getcwd()}{RESET}")
+                 else:
+                     project_dir = os.path.expanduser(project_dir)
+
+                     if not os.path.isabs(project_dir):
+                         abs_path = os.path.abspath(project_dir)
+                         print(f"{GREEN}  ✓ Will create in: {abs_path}{RESET}")
+                     else:
+                         print(f"{GREEN}  ✓ Will create in: {project_dir}{RESET}")
+
+                 if not os.path.exists(project_dir):
+                     confirm = input(f"{YELLOW}  Directory doesn't exist. Create it? (Y/n): {RESET}").strip().lower()
+                     if confirm and confirm != 'y':
+                         print(f"{RED}  ✗ Cancelled{RESET}\n")
+                         continue
+                     try:
+                         os.makedirs(project_dir, exist_ok=True)
+                         print(f"{GREEN}  ✓ Created directory{RESET}")
+                     except Exception as e:
+                         print(f"{RED}  ✗ Failed to create directory: {e}{RESET}\n")
+                         continue
+
+                 print()
+                 response, files_generated = ask_gemini_with_file_generation(query, project_dir)
+
+                 clean_response = re.sub(r'<file path="[^"]+">.*?</file>', '', response, flags=re.DOTALL)
+                 clean_response = re.sub(r'<command>.*?</command>', '', clean_response, flags=re.DOTALL)
+                 clean_response = clean_response.strip()
+
+                 if clean_response:
+                     print(f"\n{GREEN}{clean_response}{RESET}\n")
+
+                 if not files_generated and '<file' not in response:
+                     print(f"{YELLOW}💡 Tip: Ask me to create or generate files!{RESET}\n")
+
+         elif len(command_parts) == 1:
+             print(f"{YELLOW}🤖 Entering AI mode. Type 'exit' to leave.{RESET}")
+             print(f"{CYAN}💡 Use 'ai gen' for file generation mode{RESET}\n")
+
+             while True:
+                 query = input(f"{CYAN}ai> {RESET}").strip()
+                 if query.lower() in ["exit", "quit"]:
+                     print(f"{YELLOW}👋 Leaving AI mode.{RESET}")
+                     break
+                 if not query:
+                     continue
+                 print(f"{YELLOW}🤖 Thinking...{RESET}")
+                 answer = ask_gemini(query)
+                 print(f"{GREEN}{answer}{RESET}\n")
+         else:
+             query = " ".join(command_parts[1:])
+             print(f"{YELLOW}🤖 Thinking...{RESET}")
+             answer = ask_gemini(query)
+             print(f"{GREEN}{answer}{RESET}")
+
+         return True
+
+     return False
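A reviewer's note on the dispatch above: the if/elif chain works, but as built-ins grow, a table-driven dispatch keeps `handle_builtin_commands` flat while preserving the same True/False contract. A minimal sketch of that alternative (handler names here are stand-ins, not the functions in this commit):

```python
# Hypothetical table-driven dispatch; handlers for commands that take
# arguments (cd, ai) would still receive the full command_parts list.
def fake_pwd(command_parts):
    return "pwd"  # stand-in for the real my_pwd()

def fake_date(command_parts):
    return "date"  # stand-in for the real my_date()

BUILTINS = {"pwd": fake_pwd, "date": fake_date}

def handle_builtin(command_parts):
    handler = BUILTINS.get(command_parts[0].lower())
    if handler is None:
        return False  # not a built-in; caller falls through to external commands
    handler(command_parts)
    return True
```

New built-ins then become one dictionary entry instead of another elif branch.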
colors.py ADDED
@@ -0,0 +1,17 @@
+ # colors.py
+
+ # Text Colors
+ RED = '\033[31m'
+ GREEN = '\033[32m'
+ YELLOW = '\033[33m'
+ BLUE = '\033[34m'
+ MAGENTA = '\033[35m'
+ CYAN = '\033[36m'
+ WHITE = '\033[37m'
+
+ # Styles
+ BOLD = '\033[1m'
+ RESET = '\033[0m'
+
+ # Background
+ BG_BLACK = '\033[40m'
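These constants are standard ANSI SGR escape sequences; styling a string is plain concatenation, with RESET closing whatever BOLD and the color codes opened. A quick self-contained check of how the rest of the code composes them:

```python
# Same values as colors.py defines
RED = '\033[31m'
BOLD = '\033[1m'
RESET = '\033[0m'

# A styled message is just the escape codes wrapped around the text;
# the terminal interprets ESC[1m ESC[31m ... ESC[0m
msg = f"{BOLD}{RED}error{RESET}"
```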
display.py ADDED
@@ -0,0 +1,38 @@
+ # display.py
+ import os
+ import getpass
+ import socket
+ from datetime import datetime
+ from colors import *
+
+ def clear_and_setup():
+     """Clear terminal and set up background"""
+     os.system('clear')
+     print(BG_BLACK, end='')
+     print(f"{CYAN}{BOLD}")
+     print("   ╔═══════════════════════════════════════╗")
+     print("   ║                 Codic                 ║")
+     print("   ║           Your Custom Shell           ║")
+     print("   ╚═══════════════════════════════════════╝")
+     print(f"{RESET}\n")
+
+ def get_colored_prompt():
+     """Create a colorful prompt with the current path"""
+     username = getpass.getuser()
+     hostname = socket.gethostname()
+     current_path = os.getcwd()
+
+     if len(current_path) > 25:
+         current_path = "..." + current_path[-22:]
+
+     return f"{GREEN}{BOLD}{username}{RESET}@{BLUE}{hostname}{RESET}:{YELLOW}{BOLD}{current_path}{RESET}> AcidopShell$ "
+
+ def show_ascii_art():
+     """Show cool ASCII art as background"""
+     art = f"""{BLUE}
+ ⠀⠀⠀⠀⠀⠀⠀⠀⣀⣀⣶⣿⣿⣿⣿⣿⣶⣀⣀⠀⠀⠀⠀⠀⠀⠀⠀
+ ⠀⠀⠀⠀⠀⣠⣾⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣷⣄⠀⠀⠀⠀⠀
+ ⠀⠀⠀⢀⣾⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣷⡀⠀⠀⠀
+ ⠀⠀⢠⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⣿⡄⠀⠀
+ {RESET}"""
+     print(art)
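The prompt truncation in `get_colored_prompt` keeps the path segment at a fixed width of 25 characters. The rule in isolation, as a sketch mirroring the code above (the helper name is hypothetical):

```python
def shorten_path(current_path: str) -> str:
    # Paths longer than 25 chars keep only their last 22 chars, prefixed "...",
    # so the truncated form is always exactly 25 chars wide.
    if len(current_path) > 25:
        return "..." + current_path[-22:]
    return current_path
```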
docker-compose.yml ADDED
@@ -0,0 +1,13 @@
+ version: '3.8'
+
+ services:
+   acidop-shell:
+     build: .
+     container_name: acidop-shell
+     stdin_open: true
+     tty: true
+     env_file:
+       - .env
+     volumes:
+       - .:/app
+     working_dir: /app
external_commands.py ADDED
@@ -0,0 +1,14 @@
+ # external_commands.py
+ import subprocess
+ from colors import *
+
+ def run_external_command(command_parts):
+     """Run external system commands"""
+     try:
+         result = subprocess.run(command_parts)
+         if result.returncode != 0:
+             print(f"{RED}⚠️ Command exited with code: {result.returncode}{RESET}")
+     except FileNotFoundError:
+         print(f"{RED}❌ Command not found: {command_parts[0]}{RESET}")
+     except KeyboardInterrupt:
+         print(f"{YELLOW}⏸️ Command interrupted{RESET}")
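`run_external_command` leans on `subprocess.run` returning a `CompletedProcess` whose `returncode` mirrors the child's exit status, and raising `FileNotFoundError` for a missing binary rather than setting a code. A small demonstration of the first behavior, using the current Python interpreter as the child process:

```python
import subprocess
import sys

# The child exits with status 3; run() does not raise on non-zero
# exit codes, it just records the code on the CompletedProcess.
result = subprocess.run([sys.executable, "-c", "import sys; sys.exit(3)"])
exit_code = result.returncode
```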
gemini_client.py ADDED
@@ -0,0 +1,65 @@
+ # gemini_client.py
+ import os
+ from dotenv import load_dotenv
+ import google.generativeai as genai
+ from llm_file_generator import file_generator
+
+ load_dotenv()
+
+ api_key = os.getenv("GEMINI_API_KEY")
+ if not api_key:
+     raise ValueError("❌ GEMINI_API_KEY missing. Ensure it's in .env and passed with --env-file")
+
+ genai.configure(api_key=api_key)
+
+ SYSTEM_INSTRUCTION = f"""You are AcidopShell AI Assistant - a helpful coding assistant integrated into a custom shell.
+ You can answer questions, explain concepts, and generate code files.
+
+ {file_generator.get_enhanced_prompt()}
+
+ Be concise but helpful. When generating files, always provide complete, working code.
+ You can provide explanations before or after the XML tags."""
+
+ model_with_files = genai.GenerativeModel(
+     "gemini-2.0-flash-exp",
+     system_instruction=SYSTEM_INSTRUCTION
+ )
+
+ model_regular = genai.GenerativeModel("gemini-2.0-flash-exp")
+
+
+ def ask_gemini(prompt: str):
+     """
+     Basic Gemini query - just returns the response text.
+     Use this for simple Q&A without file generation.
+     """
+     try:
+         response = model_regular.generate_content(prompt)
+         return response.text
+     except Exception as e:
+         return f"⚠️ Error: {e}"
+
+
+ def ask_gemini_with_file_generation(query: str, project_dir: str = "."):
+     """
+     Enhanced Gemini query with automatic file generation.
+
+     Args:
+         query: The user's prompt
+         project_dir: Directory where files should be created (default: current dir)
+
+     Returns:
+         tuple: (response_text, files_were_generated)
+             - response_text: The full response from Gemini
+             - files_were_generated: True if files were created, False otherwise
+     """
+     try:
+         response = model_with_files.generate_content(query)
+         response_text = response.text
+
+         files_generated = file_generator.process_response(response_text, project_dir)
+
+         return response_text, files_generated
+
+     except Exception as e:
+         return f"⚠️ Error: {e}", False
llm_file_generator.py ADDED
@@ -0,0 +1,202 @@
+ import re
+ import subprocess
+ from pathlib import Path
+ from colors import *
+
+ class LLMFileGenerator:
+     """Handles parsing LLM responses and generating real files"""
+
+     def __init__(self):
+         self.last_files = []
+         self.last_commands = []
+
+     def parse_response(self, response_text):
+         """
+         Parse an LLM response for file generation instructions.
+         Looks for XML tags like:
+             <file path="src/app.js">content</file>
+             <command>npm install</command>
+         """
+         files = []
+         commands = []
+
+         file_pattern = r'<file path="([^"]+)">\s*(.*?)\s*</file>'
+         for match in re.finditer(file_pattern, response_text, re.DOTALL):
+             path = match.group(1).strip()
+             content = match.group(2)
+             files.append({'path': path, 'content': content})
+
+         command_pattern = r'<command>(.*?)</command>'
+         for match in re.finditer(command_pattern, response_text, re.DOTALL):
+             commands.append(match.group(1).strip())
+
+         self.last_files = files
+         self.last_commands = commands
+
+         return files, commands
+
+     def has_file_instructions(self, response_text):
+         """Check if the response contains file generation instructions"""
+         return '<file path=' in response_text or '<command>' in response_text
+
+     def create_files(self, files, base_dir="."):
+         """Create actual files on disk"""
+         created_files = []
+
+         if not files:
+             return created_files
+
+         Path(base_dir).mkdir(parents=True, exist_ok=True)
+
+         print(f"\n{CYAN}📝 Creating {len(files)} file(s)...{RESET}")
+
+         for file_info in files:
+             try:
+                 file_path = Path(base_dir) / file_info['path']
+
+                 file_path.parent.mkdir(parents=True, exist_ok=True)
+
+                 with open(file_path, 'w', encoding='utf-8') as f:
+                     f.write(file_info['content'])
+
+                 created_files.append(str(file_path))
+                 print(f"{GREEN}✓ Created: {file_path}{RESET}")
+
+             except Exception as e:
+                 print(f"{RED}✗ Failed to create {file_info['path']}: {e}{RESET}")
+
+         return created_files
+
+     def run_commands(self, commands, cwd="."):
+         """Execute commands in the specified directory"""
+         if not commands:
+             return []
+
+         print(f"\n{YELLOW}⚙️ Running {len(commands)} command(s)...{RESET}")
+         results = []
+
+         for cmd in commands:
+             print(f"\n{CYAN}→ Running: {BOLD}{cmd}{RESET}")
+
+             try:
+                 result = subprocess.run(
+                     cmd,
+                     shell=True,
+                     cwd=cwd,
+                     capture_output=True,
+                     text=True,
+                     timeout=300
+                 )
+
+                 if result.returncode == 0:
+                     print(f"{GREEN}✓ Success{RESET}")
+                     if result.stdout.strip():
+                         print(result.stdout)
+                     results.append({
+                         'command': cmd,
+                         'success': True,
+                         'output': result.stdout
+                     })
+                 else:
+                     print(f"{RED}✗ Failed with code {result.returncode}{RESET}")
+                     if result.stderr:
+                         print(f"{RED}{result.stderr}{RESET}")
+                     results.append({
+                         'command': cmd,
+                         'success': False,
+                         'error': result.stderr
+                     })
+
+             except subprocess.TimeoutExpired:
+                 print(f"{RED}✗ Command timed out (5 min limit){RESET}")
+                 results.append({
+                     'command': cmd,
+                     'success': False,
+                     'error': 'Timeout'
+                 })
+             except Exception as e:
+                 print(f"{RED}✗ Error: {e}{RESET}")
+                 results.append({
+                     'command': cmd,
+                     'success': False,
+                     'error': str(e)
+                 })
+
+         return results
+
+     def process_response(self, response_text, project_dir="."):
+         """
+         Main method: parse the response, generate files, and run commands.
+         Returns True if files were generated, False otherwise.
+         """
+         if not self.has_file_instructions(response_text):
+             return False
+
+         files, commands = self.parse_response(response_text)
+
+         if not files and not commands:
+             return False
+
+         created_files = self.create_files(files, base_dir=project_dir)
+
+         if commands:
+             results = self.run_commands(commands, cwd=project_dir)
+
+             failed = [r for r in results if not r['success']]
+             if failed:
+                 print(f"\n{YELLOW}⚠️ {len(failed)} command(s) failed{RESET}")
+
+         if created_files:
+             print(f"\n{GREEN}✅ Generated {len(created_files)} file(s) in '{project_dir}/'{RESET}")
+             print(f"{CYAN}📂 Files created:{RESET}")
+             for f in created_files:
+                 print(f"  {GREEN}- {f}{RESET}")
+
+         return True
+
+     def get_enhanced_prompt(self):
+         """
+         Returns a system prompt addition that teaches Gemini how to generate files.
+         Add this to your Gemini system instructions.
+         """
+         return """
+ When the user asks you to create, generate, or build files or a project, respond with file generation instructions using this XML format:
+
+ <file path="relative/path/to/file.ext">
+ FILE_CONTENT_HERE
+ </file>
+
+ <command>command to run after files are created</command>
+
+ Example:
+ <file path="src/server.js">
+ const express = require('express');
+ const app = express();
+
+ app.get('/hello', (req, res) => {
+     res.json({ message: 'Hello!' });
+ });
+
+ app.listen(3000);
+ </file>
+
+ <file path="package.json">
+ {
+     "name": "my-app",
+     "version": "1.0.0",
+     "dependencies": {
+         "express": "^4.18.0"
+     }
+ }
+ </file>
+
+ <command>npm install</command>
+ <command>npm start</command>
+
+ Rules:
+ - Use relative paths from the project root
+ - Include ALL necessary files
+ - Provide complete, working code
+ - Commands run in order after all files are created
+ - You can still provide explanations before or after the XML tags
+ """
+
+ file_generator = LLMFileGenerator()
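The XML convention that `get_enhanced_prompt` teaches the model is the same one `parse_response` consumes, so the two must round-trip. A self-contained check of those two regexes on a sample response (the sample text is illustrative, not real model output):

```python
import re

sample = '''Here is a tiny app.
<file path="src/app.py">
print("hello")
</file>
<command>python src/app.py</command>'''

# Same patterns as LLMFileGenerator.parse_response: \s* trims the
# whitespace around file bodies, re.DOTALL lets .*? span newlines.
file_pattern = r'<file path="([^"]+)">\s*(.*?)\s*</file>'
files = [(m.group(1).strip(), m.group(2))
         for m in re.finditer(file_pattern, sample, re.DOTALL)]
commands = [m.group(1).strip()
            for m in re.finditer(r'<command>(.*?)</command>', sample, re.DOTALL)]
```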
main.py ADDED
@@ -0,0 +1,35 @@
+ from display import clear_and_setup, get_colored_prompt
+ from builtin_commands import handle_builtin_commands
+ from external_commands import run_external_command
+ from colors import *
+
+ def main():
+     clear_and_setup()
+
+     while True:
+         try:
+             prompt = get_colored_prompt()
+             line = input(prompt).strip()
+
+             if not line:
+                 continue
+
+             if line.lower() == "exit":
+                 print(f"{YELLOW}👋 Goodbye from AcidopShell!{RESET}")
+                 break
+
+             command_parts = line.split()
+
+             if handle_builtin_commands(command_parts):
+                 continue
+
+             run_external_command(command_parts)
+
+         except KeyboardInterrupt:
+             print(f"\n{YELLOW}💡 Use 'exit' to quit Codic!{RESET}")
+         except EOFError:
+             print(f"\n{YELLOW}👋 Goodbye from Codic!{RESET}")
+             break
+
+ if __name__ == "__main__":
+     main()