khagu committed
Commit e3804d7 · 0 Parent(s)

add all file
.dockerignore ADDED
@@ -0,0 +1,11 @@
+ __pycache__
+ *.pyc
+ *.pyo
+ *.pyd
+ .env
+ venv/
+ .env.*
+ .git
+ .gitignore
+ class_data.db
+ main.json
.env.example ADDED
@@ -0,0 +1,5 @@
+ DISCORD_TOKEN=your_bot_token_here
+ CR_USER_ID=123456789012345678
+ CR_ROLE_NAME=Class Representative
+ MISTRAL_API_KEY=your_mistral_key
+ ANNOUNCEMENT_CHANNEL_ID=123456789012345678
.gitignore ADDED
@@ -0,0 +1,26 @@
+ # Environment variables
+ .env
+
+ # Python
+ __pycache__/
+ *.pyc
+ *.pyo
+ *.pyd
+ .Python
+ env/
+ venv/
+ .venv/
+
+ # Database (will be recreated)
+ # class_data.db
+
+ # IDE
+ .vscode/
+ .idea/
+ *.swp
+ *~
+
+ # DisCloud
+ .discloud/
+ *.db
+ *.log
Dockerfile ADDED
@@ -0,0 +1,35 @@
+ # Use official Python image
+ FROM python:3.11-slim
+
+ # Set a working directory
+ WORKDIR /app
+
+ # Install system deps (if any)
+ RUN apt-get update && apt-get install -y --no-install-recommends \
+     build-essential \
+     && rm -rf /var/lib/apt/lists/*
+
+ # Copy requirements and install
+ COPY requirements.txt /app/requirements.txt
+ RUN pip install --no-cache-dir -r /app/requirements.txt
+
+ # Copy source
+ COPY . /app
+
+ # Make start script executable
+ RUN chmod +x /app/scripts/start.sh
+
+ # Create a non-root user with UID 1000 (standard for HF Spaces)
+ RUN useradd -m -u 1000 appuser
+
+ # Switch to the new user
+ USER appuser
+
+ # Set home to the user's home directory
+ ENV HOME=/home/appuser \
+     PATH=/home/appuser/.local/bin:$PATH
+
+ # Expose the keep-alive port used by HF Spaces
+ EXPOSE 7860
+
+ # Default command
+ CMD ["/app/scripts/start.sh"]
README.md ADDED
@@ -0,0 +1,144 @@
+ ---
+ title: Class Assistant Bot
+ emoji: 🤖
+ colorFrom: blue
+ colorTo: purple
+ sdk: docker
+ pinned: false
+ ---
+
+ # Class Assistant Bot
+
+ A Discord bot to manage class schedules, assignments, notes, and shared materials — with optional AI-powered answers.
+
+ This repository contains the Class Assistant Bot, which fetches and presents schedules, manages assignments and materials, and provides quick study help via an AI assistant. It supports local development with SQLite and production deployment with PostgreSQL (Supabase) on Hugging Face Spaces (Docker).
+
+ ## Quick highlights
+ - Run locally: `python3 src/main.py`
+ - Run and force-import the schedule from `main.json`: `python3 src/main.py --override`
+ - Logs: `bot.log` (contains INFO/DEBUG logs and message-level logs such as `message to bot` / `message by bot`)
+ - Config: environment variables (see below)
+
+ ## Table of contents
+ - [Requirements](#requirements)
+ - [Configuration](#configuration)
+ - [Running locally](#running-locally)
+ - [CLI flags](#cli-flags)
+ - [Commands overview](#commands-overview)
+ - [Deployment (Hugging Face Spaces + Supabase)](#deployment-hugging-face-spaces--supabase)
+ - [Troubleshooting](#troubleshooting)
+ - [Contributing](#contributing)
+
+ ## Requirements
+ - Python 3.11+ (the Dockerfile uses 3.11)
+ - Install dependencies:
+
+ ```bash
+ pip install -r requirements.txt
+ ```
+
+ ## Configuration
+ Create a `.env` file at the project root, or set these environment variables in your deployment environment.
+
+ Required environment variables
+ - `DISCORD_TOKEN` — your Discord bot token (required)
+
+ Optional/recommended
+ - `DATABASE_URL` — PostgreSQL connection URI (when provided, the bot uses PostgreSQL; otherwise it falls back to SQLite)
+ - `MISTRAL_API_KEY` — API key for the AI assistant integration
+ - `CR_USER_ID` — (optional) numeric Discord user ID of the Class Representative (CR); grants CR-only commands
+ - `CR_ROLE_NAME` — (optional) role name used to mark CRs (default: "Class Representative")
+ - `ANNOUNCEMENT_CHANNEL_ID` — (optional) channel ID for scheduled announcements
+
+ Example `.env` (do NOT commit this file):
+
+ ```env
+ DISCORD_TOKEN=your_discord_token_here
+ DATABASE_URL=postgresql://postgres:password@db.supabase.co:5432/postgres
+ MISTRAL_API_KEY=sk-xxxx
+ CR_USER_ID=123456789012345678
+ CR_ROLE_NAME=CR
+ ANNOUNCEMENT_CHANNEL_ID=987654321012345678
+ ```
+
+ The config loader is `configuration/config.py`, which uses `python-dotenv` to read `.env`.
+
+ ## Running locally
+
+ 1. Install dependencies: `pip install -r requirements.txt`
+ 2. Create `.env` with at least `DISCORD_TOKEN` set.
+ 3. Run the bot:
+
+ ```bash
+ python3 src/main.py
+ ```
+
+ If you want the bot to import the schedule from `main.json` and overwrite any existing schedule in the database, run:
+
+ ```bash
+ python3 src/main.py --override
+ ```
+
+ Notes
+ - The bot writes logs to `bot.log` in the project root. Check this file for detailed debug info.
+ - For development, the bot falls back to SQLite if `DATABASE_URL` is not set.
+
+ ## CLI flags
+ - `--override` — force import of `main.json` into the database (clears existing schedule rows first). If you omit `--override` and the database already has schedule entries, the import is skipped to avoid accidental overwrites.
+
+ ## Commands overview (quick)
+ Type `!bothelp` in Discord to get the dynamic help menu. Commonly used commands:
+
+ - `!schedule today` — Today's classes
+ - `!schedule tomorrow` — Tomorrow's classes
+ - `!schedule day <day>` — That day's classes
+ - `!schedule week` — Full week schedule
+ - `!assignment add Subject="Math" Topic="Algebra" Due="2025-12-01"` — Add assignment
+ - `!assignment list` — List pending assignments
+ - `!materials all` — List uploaded materials
+ - `!materials add Subject="Math" Link="https://..."` — (CR-only) add material
+ - `!materials delete Subject="Math" Link="https://..."` — (CR-only) delete material
+ - `!notes <Subject>` — View study notes
+ - `!ask <question>` — Ask the AI assistant
+
+ The bot also supports grouped commands (e.g., `!assignment add/list/delete`) — use `!bothelp <category>` or `!bothelp <command>` for detailed usage.
+
+ ## Deployment (Hugging Face Spaces + Supabase)
+
+ 1. Create a Supabase project and get your PostgreSQL connection string (set it as `DATABASE_URL` in HF Spaces).
+ 2. Create a new Space on Hugging Face and select **Docker** as the SDK.
+ 3. Add the necessary environment variables (secrets) in the Space settings:
+    - `DISCORD_TOKEN`
+    - `DATABASE_URL`
+    - `MISTRAL_API_KEY`
+    - `CR_USER_ID`
+    - `CR_ROLE_NAME`
+    - `ANNOUNCEMENT_CHANNEL_ID`
+ 4. The Space will build using the `Dockerfile` and start the bot.
+
+ Notes about persistence
+ - Use Supabase (PostgreSQL) in production for reliable persistence. If no `DATABASE_URL` is configured, the bot uses SQLite for local testing only.
+
+ ## Logging
+ - Console: INFO and above
+ - File: `bot.log` (DEBUG and above)
+
+ The bot logs message-level events such as `message to bot: ...` and `message by bot: ...`, which help debug command usage and AI responses.
+
+ ## Troubleshooting
+
+ - Bot fails to start / `ModuleNotFoundError: No module named 'discord'` — make sure you installed the dependencies in the right Python environment (virtualenv/venv).
+ - `Improper token` or bot offline — verify the `DISCORD_TOKEN` value.
+ - Schedule import overwritten unexpectedly — run without `--override` to preserve the DB; use `--override` only when intentional.
+ - Database connection issues — verify `DATABASE_URL` and check the Supabase project and Space secrets.
+
+ ## Contributing
+ - Bug reports and PRs are welcome. Please open issues for feature requests or to share deployment notes.
+
+ ## Useful files
+ - `src/` — bot source
+ - `src/cogs/` — command groups (schedule, assignments, AI, etc.)
+ - `main.json` — main schedule file used for bulk import
+ - `requirements.txt` — pip dependencies
+ - `Dockerfile`, `scripts/start.sh` — deployment helpers
+ - `guides/` — deployment and DB setup guides
configuration/config.py ADDED
@@ -0,0 +1,13 @@
+ import os
+ from dotenv import load_dotenv
+
+ # Ensure we load .env from the project root
+ ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), '..'))
+ DOTENV_PATH = os.path.join(ROOT, '.env')
+ load_dotenv(DOTENV_PATH)
+
+ TOKEN = os.getenv("DISCORD_TOKEN")
+ CR_USER_ID = int(os.getenv("CR_USER_ID", "0"))
+ CR_ROLE_NAME = os.getenv("CR_ROLE_NAME", "Class Representative")
+ MISTRAL_API_KEY = os.getenv("MISTRAL_API_KEY")
+ CHANNEL_ID = int(os.getenv("ANNOUNCEMENT_CHANNEL_ID", "0"))
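The loader above silently tolerates a missing `DISCORD_TOKEN` (it becomes `None`) and would crash on a non-numeric channel ID. A minimal fail-fast sketch using only the standard library; `require_env` and `int_env` are hypothetical helpers, not part of `config.py`:

```python
import os

def require_env(name: str) -> str:
    """Return a required environment variable, or fail loudly at startup."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

def int_env(name: str, default: int = 0) -> int:
    """Parse an integer env var, tolerating stray whitespace; fall back on bad input."""
    raw = os.getenv(name, "").strip()
    try:
        return int(raw)
    except ValueError:
        return default

# Illustrative values only
os.environ["DISCORD_TOKEN"] = "example-token"
os.environ["ANNOUNCEMENT_CHANNEL_ID"] = "123456789012345678"

print(require_env("DISCORD_TOKEN"))
print(int_env("ANNOUNCEMENT_CHANNEL_ID"))
```

Failing at import time with a clear message is usually easier to debug in Space logs than a bot that starts and then silently refuses to log in.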
database/__init__.py ADDED
@@ -0,0 +1 @@
+ from .database import init_db, get_db
database/database.py ADDED
@@ -0,0 +1,56 @@
+ import os
+ from sqlalchemy import create_engine, text
+ from sqlalchemy.orm import sessionmaker, scoped_session
+ from .models import Base
+
+ # Load the .env file to ensure DATABASE_URL is available
+ from dotenv import load_dotenv
+ load_dotenv()
+
+ # Database URL configuration
+ # Priority: DATABASE_URL env var (PostgreSQL) > SQLite fallback
+ DATABASE_URL = os.getenv('DATABASE_URL')
+
+ if DATABASE_URL:
+     # Production: use PostgreSQL from the environment variable.
+     # Handle postgres:// vs postgresql:// (some providers use the old scheme)
+     if DATABASE_URL.startswith('postgres://'):
+         DATABASE_URL = DATABASE_URL.replace('postgres://', 'postgresql://', 1)
+
+     engine = create_engine(DATABASE_URL)
+ else:
+     # Local development: use SQLite.
+     # Store the DB file in /data if available (writable volume), or the project root locally
+     if os.path.exists('/data') and os.access('/data', os.W_OK):
+         DB_NAME = '/data/class_data.db'
+     else:
+         ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), '..'))
+         DB_NAME = os.path.join(ROOT, 'class_data.db')
+
+     DATABASE_URL = f"sqlite:///{DB_NAME}"
+     engine = create_engine(DATABASE_URL, connect_args={"check_same_thread": False})
+
+ SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
+ ScopedSession = scoped_session(SessionLocal)
+
+ def init_db():
+     """Initialize the database by creating all tables defined in models.py."""
+     Base.metadata.create_all(bind=engine)
+
+     # SQLite-specific migration: add new columns if they don't exist.
+     # Only run this for SQLite databases.
+     if 'sqlite' in str(engine.url):
+         with engine.connect() as conn:
+             # Check the schedule table's columns
+             result = conn.execute(text("PRAGMA table_info(schedule)"))
+             columns = [row.name for row in result]
+
+             if 'instructor' not in columns:
+                 conn.execute(text("ALTER TABLE schedule ADD COLUMN instructor TEXT"))
+             if 'note' not in columns:
+                 conn.execute(text("ALTER TABLE schedule ADD COLUMN note TEXT"))
+             conn.commit()
+
+ def get_db():
+     """Return a new session; the caller is responsible for closing it."""
+     return SessionLocal()
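The `postgres://` to `postgresql://` rewrite can be isolated as a pure function, which makes it trivial to test on its own; `normalize_db_url` is an illustrative name, not a function that exists in `database.py`:

```python
def normalize_db_url(url: str) -> str:
    """Rewrite the legacy postgres:// scheme to postgresql://,
    which SQLAlchemy 2.x requires; leave other URLs untouched."""
    if url.startswith("postgres://"):
        return url.replace("postgres://", "postgresql://", 1)
    return url

print(normalize_db_url("postgres://user:pw@db.example.co:5432/postgres"))
print(normalize_db_url("sqlite:///class_data.db"))
```

The `1` argument to `replace` matters: it rewrites only the scheme prefix, never a `postgres://` substring that might appear later in the URL.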
database/models.py ADDED
@@ -0,0 +1,42 @@
+ from sqlalchemy import Column, Integer, String
+ from sqlalchemy.orm import declarative_base
+
+ Base = declarative_base()
+
+ class Schedule(Base):
+     __tablename__ = "schedule"
+     id = Column(Integer, primary_key=True, autoincrement=True)
+     day = Column(String)
+     time = Column(String)
+     subject = Column(String)
+     group_name = Column(String)
+     room = Column(String)
+     instructor = Column(String)
+     note = Column(String)
+
+ class Assignment(Base):
+     __tablename__ = "assignments"
+     id = Column(Integer, primary_key=True, autoincrement=True)
+     subject = Column(String)
+     topic = Column(String)
+     due_date = Column(String)
+
+ class Note(Base):
+     __tablename__ = "notes"
+     id = Column(Integer, primary_key=True, autoincrement=True)
+     subject = Column(String)
+     link = Column(String)
+
+ class Material(Base):
+     __tablename__ = "materials"
+     id = Column(Integer, primary_key=True, autoincrement=True)
+     subject = Column(String)
+     drive_link = Column(String)
+
+ class Assessment(Base):
+     __tablename__ = "assessments"
+     id = Column(Integer, primary_key=True, autoincrement=True)
+     subject = Column(String)
+     date = Column(String)
+     time = Column(String)
+     description = Column(String)
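The SQLite column migration that `database/database.py` runs against the `schedule` table can be sketched with the standard library alone; `add_column_if_missing` is a hypothetical helper that mirrors `init_db`'s PRAGMA check, shown here on an in-memory stand-in for `class_data.db`:

```python
import sqlite3

# In-memory stand-in; the real schema is created by SQLAlchemy's create_all()
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE schedule (id INTEGER PRIMARY KEY, day TEXT, time TEXT, subject TEXT)"
)

def add_column_if_missing(conn, table, column, coltype="TEXT"):
    """PRAGMA table_info lists existing columns (name is field 1 of each row);
    ALTER TABLE only runs when the column is absent, so the call is idempotent."""
    cols = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    if column not in cols:
        conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {coltype}")

add_column_if_missing(conn, "schedule", "instructor")
add_column_if_missing(conn, "schedule", "note")
add_column_if_missing(conn, "schedule", "note")  # second call is a no-op

cols = [row[1] for row in conn.execute("PRAGMA table_info(schedule)")]
print(cols)  # → ['id', 'day', 'time', 'subject', 'instructor', 'note']
```

Because the check runs on every startup, adding a column to the model plus one `add_column_if_missing` call keeps old local databases usable without a migration tool.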
guides/HUGGINGFACE_DEPLOYMENT_GUIDE.md ADDED
@@ -0,0 +1,67 @@
+ # Deploying Class Assistant Bot on Hugging Face Spaces
+
+ This guide explains how to deploy your Discord bot to [Hugging Face Spaces](https://huggingface.co/spaces) using Docker.
+
+ ## Why Hugging Face Spaces?
+ Hugging Face Spaces offers a simple way to host machine learning demos and apps. By using the Docker SDK, we can deploy our custom Discord bot environment. The free tier is generous and suitable for many bot applications.
+
+ ## Prerequisites
+ 1. A [Hugging Face](https://huggingface.co) account.
+ 2. A [GitHub](https://github.com) account with this code pushed to a repository (optional, but recommended for syncing).
+
+ ## Deployment Steps
+
+ ### 1. Create a New Space
+ 1. Log in to Hugging Face.
+ 2. Click on your profile picture and select **New Space**.
+ 3. **Space Name**: `class-assistant-bot` (or any name you prefer).
+ 4. **License**: MIT (or your choice).
+ 5. **SDK**: Select **Docker**.
+ 6. **Space Hardware**: **CPU Basic (Free)** is usually sufficient.
+ 7. **Visibility**: Public or Private (Private is recommended if you want to hide your source code and logs).
+
+ ### 2. Configure the Space
+ After creating the Space, you'll see a "Building" status. The build might fail initially if environment variables aren't set.
+
+ 1. Go to the **Settings** tab of your Space.
+ 2. Scroll down to the **Variables and secrets** section.
+ 3. Click **New secret** to add your sensitive environment variables.
+
+ ### 3. Set Environment Variables (Secrets)
+ Add the following secrets. You can find these values in your local `.env` file.
+
+ | Key | Value Description |
+ | :--- | :--- |
+ | `DATABASE_URL` | **Required**: your PostgreSQL connection string from Supabase. See [SUPABASE_SETUP.md](SUPABASE_SETUP.md) for setup instructions. |
+ | `DISCORD_TOKEN` | Your Discord bot token. |
+ | `MISTRAL_API_KEY` | Your Mistral AI API key. |
+ | `CR_USER_ID` | (Optional) The Discord user ID of the Class Representative. |
+ | `CR_ROLE_NAME` | (Optional) The role name for CR permissions (e.g., "CR"). |
+ | `ANNOUNCEMENT_CHANNEL_ID` | (Optional) Channel ID for daily reminders. |
+
+ > **Note**: Hugging Face Spaces exposes port `7860` by default. The `Dockerfile` in this repo is configured to expose this port, and the bot's keep-alive server listens on it.
+
+ ### 4. Sync with GitHub (Optional but Recommended)
+ If you want to push code from your local machine/GitHub to the Space:
+ 1. In the Space **Settings**, look for "Git" or "Repository".
+ 2. You can add your GitHub repository as a remote or use GitHub Actions to push to the Space.
+ 3. Alternatively, you can upload files manually via the **Files** tab, but using Git is better.
+
+ ### 5. Verify Deployment
+ 1. Go to the **App** tab.
+ 2. You should see a "Running" status.
+ 3. The app preview might show a JSON response: `{"status": "ok", "service": "class-assistant-bot"}`. This means the keep-alive server is running.
+ 4. Check your Discord server; the bot should be online!
+
+ ## Troubleshooting
+
+ * **Build Failed**: Check the "Logs" tab. Ensure `requirements.txt` is correct and the Dockerfile build finishes without errors.
+ * **Runtime Error**: If the build succeeds but the app crashes, check the "Logs".
+ * **Port Issues**: If the app is "Running" but the health check fails or the bot doesn't stay online, ensure the `PORT` environment variable is actually being used (it defaults to 7860 in the Dockerfile).
+ * **Database**: Ensure `DATABASE_URL` is correct. The bot relies on Supabase for persistence; local SQLite files in the Space are lost when the Space restarts.
+
+ ## Important Notes
+
+ ### Persistence
+ * **PostgreSQL (Supabase)**: Highly recommended. Data is stored externally and persists across restarts.
+ * **Local Storage**: Hugging Face Spaces are ephemeral. Any files written to disk (like a local SQLite DB) are lost when the Space restarts or rebuilds. **Always use Supabase for production.**
guides/SUPABASE_SETUP.md ADDED
@@ -0,0 +1,87 @@
+ # Supabase PostgreSQL Setup Guide
+
+ This guide shows you how to set up a **free PostgreSQL database** on Supabase for your Discord bot.
+
+ ## Why Supabase?
+ - **Free tier**: 500MB database storage
+ - **Always online**: Database never sleeps
+ - **PostgreSQL**: Industry-standard, reliable database
+ - **Easy setup**: Takes less than 5 minutes
+
+ ## Step 1: Create a Supabase Account
+
+ 1. Go to [supabase.com](https://supabase.com)
+ 2. Click **Start your project**
+ 3. Sign up with GitHub (recommended) or email
+
+ ## Step 2: Create a New Project
+
+ 1. Click **New Project**
+ 2. Fill in the details:
+    - **Name**: `class-assistant-bot` (or any name)
+    - **Database Password**: Create a strong password (save this!)
+    - **Region**: Choose the one closest to you (e.g., Singapore, Mumbai)
+    - **Pricing Plan**: Free
+ 3. Click **Create new project**
+ 4. Wait 1-2 minutes for the database to be provisioned
+
+ ## Step 3: Get Your Database Connection String
+
+ 1. In your Supabase project dashboard, click **Settings** (gear icon in sidebar)
+ 2. Click **Database** in the left menu
+ 3. Scroll down to **Connection string**
+ 4. Select the **URI** tab
+ 5. Copy the connection string. It looks like:
+    ```
+    postgresql://postgres:[YOUR-PASSWORD]@db.xxxxxxxxxxxxx.supabase.co:5432/postgres
+    ```
+ 6. **Important**: Replace `[YOUR-PASSWORD]` with the actual password you created in Step 2
+
+ ## Step 4: Add to Hugging Face Spaces Environment Variables
+
+ 1. Go to your Hugging Face Space settings.
+ 2. Scroll to **Variables and secrets**.
+ 3. Click **New secret**.
+ 4. Add:
+    - **Key**: `DATABASE_URL`
+    - **Value**: Paste your connection string from Step 3
+ 5. Click **Save**.
+
+ Hugging Face will automatically rebuild your Space with the new database connection.
+
+ ## Step 5: Verify It's Working
+
+ 1. Wait for the Space to finish building.
+ 2. Check the logs in the Space dashboard.
+ 3. You should see "Service is live" (or similar startup logs) and no database errors.
+ 4. Test your bot in Discord: try adding an assignment or schedule.
+
+ ## Troubleshooting
+
+ **Connection Error**:
+ - Double-check the password in your connection string
+ - Make sure there are no extra spaces
+
+ **Tables Not Created**:
+ - The bot automatically creates tables on first startup
+ - Check the Space logs for any errors
+
+ **Data Not Persisting**:
+ - Verify `DATABASE_URL` is set correctly in the Space secrets
+ - Check Supabase dashboard → Table Editor to see your data
+
+ ## Managing Your Database
+
+ You can view and manage your data directly in Supabase:
+ 1. Go to your Supabase project
+ 2. Click **Table Editor** in the sidebar
+ 3. You'll see all your tables: `schedule`, `assignments`, `notes`, etc.
+ 4. You can view, edit, or delete data directly here
+
+ ## Free Tier Limits
+
+ - **Storage**: 500MB (plenty for a Discord bot)
+ - **Bandwidth**: 2GB/month
+ - **API Requests**: Unlimited
+
+ Your bot will likely use less than 10MB of storage, so you're well within limits!
main.json ADDED
@@ -0,0 +1,90 @@
+ {
+   "metadata": {
+     "program": "Electronics, Communication and Information Engineering (BEI)",
+     "year_part": "II/I",
+     "class_start": "2082-08-15",
+     "class_end": "2082-12-03",
+     "default_lecture_room": "Lecture Room 4",
+     "note": "All other days begin at 10:15."
+   },
+   "GroupA": {
+     "sunday": [
+       {"time": "10:15\u201311:55", "subject": "Control System [L+R]", "instructor": "SKR", "room": "Lecture Room 4"},
+       {"time": "11:55\u20131:35", "subject": "Advanced Electronics [L]", "instructor": "JKM", "room": "Lecture Room 4"},
+       {"time": "1:35\u20132:25", "subject": "BREAK", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "2:25\u20134:55", "subject": "Microprocessor[Practical] A", "instructor": "Dr. RKS + KN", "room": "Electronics Lab 4"}
+     ],
+     "monday": [
+       {"time": "10:15\u201311:55", "subject": "Microprocessor [L]", "instructor": "Dr. RKS", "room": "Lecture Room 4"},
+       {"time": "11:55\u20131:35", "subject": "English", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "1:35\u20132:25", "subject": "English Lab Group A", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "3:15\u20134:05", "subject": "English(T)", "instructor": "None", "room": "Lecture Room 4"}
+     ],
+     "tuesday": [
+       {"time": "7:15\u20139:30", "subject": "Control System [P] ALT WEEK GROUP A/B ALT WEEK", "instructor": "YA + AD + AP", "room": "Lecture Room 4"},
+       {"time": "9:30\u201310:15", "subject": "BREAK", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "10:15\u201311:55", "subject": "Computer Graphics and Visualization [L+T]", "instructor": "JM", "room": "Lecture Room 4"},
+       {"time": "11:55\u20131:35", "subject": "Microprocessor[L+T]", "instructor": "Dr. RKS", "room": "Lecture Room 4"},
+       {"time": "1:35\u20132:25", "subject": "BREAK", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "2:25\u20133:15", "subject": "None", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "3:15\u20134:55", "subject": "Math III [L]", "instructor": "AKB", "room": "Lecture Room 4"}
+     ],
+     "wednesday": [
+       {"time": "10:15\u201311:55", "subject": "Control System [L]", "instructor": "SKR", "room": "Lecture Room 4"},
+       {"time": "11:55\u20131:35", "subject": "Computer Graphics and Visualization [L+T]", "instructor": "JM", "room": "Lecture Room 4"},
+       {"time": "1:35\u20132:25", "subject": "BREAK", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "2:25\u20134:55", "subject": "Computer Graphics and Visualization [Practical] Group A", "instructor": "JM + SST", "room": "Computer Lab 6"}
+     ],
+     "thursday": [
+       {"time": "12:45\u20131:35", "subject": "Math III [L]", "instructor": "VP", "room": "Lecture Room 4"},
+       {"time": "1:35\u20132:25", "subject": "BREAK", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "2:25\u20134:55", "subject": "Advance Electronics [P] GROUP A/B ALT WEEK", "instructor": "JKM + SJ", "room": "Electronics Lab 3"}
+     ],
+     "friday": [
+       {"time": "10:15\u201311:55", "subject": "Advanced Electronics [L]", "instructor": "NA", "room": "Lecture Room 4"},
+       {"time": "11:55\u201312:45", "subject": "MATH III [T]", "instructor": "GBJ", "room": "Lecture Room 4"},
+       {"time": "12:45\u20131:35", "subject": "Math III [L]", "instructor": "SG", "room": "Lecture Room 4"},
+       {"time": "1:35\u20132:25", "subject": "BREAK", "instructor": "None", "room": "Lecture Room 4"}
+     ]
+   },
+   "GroupB": {
+     "sunday": [
+       {"time": "10:15\u201311:55", "subject": "Control System [L+R]", "instructor": "SKR", "room": "Lecture Room 4"},
+       {"time": "11:55\u20131:35", "subject": "Advanced Electronics [L]", "instructor": "JKM", "room": "Lecture Room 4"},
+       {"time": "1:35\u20132:25", "subject": "BREAK", "instructor": "None", "room": "Lecture Room 4"}
+     ],
+     "monday": [
+       {"time": "10:15\u201311:55", "subject": "Microprocessor [L]", "instructor": "Dr. RKS", "room": "Lecture Room 4"},
+       {"time": "11:55\u20131:35", "subject": "English", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "2:25\u20133:15", "subject": "English Lab Group B", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "3:15\u20134:05", "subject": "English(T)", "instructor": "None", "room": "Lecture Room 4"}
+     ],
+     "tuesday": [
+       {"time": "7:15\u20139:30", "subject": "Control System [P] ALT WEEK GROUP A/B ALT WEEK", "instructor": "YA + AD + AP", "room": "Lecture Room 4"},
+       {"time": "9:30\u201310:15", "subject": "BREAK", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "10:15\u201311:55", "subject": "Computer Graphics and Visualization [L+T]", "instructor": "JM", "room": "Lecture Room 4"},
+       {"time": "11:55\u20131:35", "subject": "Microprocessor[L+T]", "instructor": "Dr. RKS", "room": "Lecture Room 4"},
+       {"time": "1:35\u20132:25", "subject": "BREAK", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "2:25\u20133:15", "subject": "None", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "3:15\u20134:55", "subject": "Math III [L]", "instructor": "AKB", "room": "Lecture Room 4"}
+     ],
+     "wednesday": [
+       {"time": "10:15\u201311:55", "subject": "Control System [L]", "instructor": "SKR", "room": "Lecture Room 4"},
+       {"time": "11:55\u20131:35", "subject": "Computer Graphics and Visualization [L+T]", "instructor": "JM", "room": "Lecture Room 4"},
+       {"time": "1:35\u20132:25", "subject": "BREAK", "instructor": "None", "room": "Lecture Room 4"}
+     ],
+     "thursday": [
+       {"time": "10:15\u201312:45", "subject": "Computer Graphics and Visualization [Practical] Group B", "instructor": "JM + SST", "room": "Computer Lab 5"},
+       {"time": "12:45\u20131:35", "subject": "Math III [L]", "instructor": "VP", "room": "Lecture Room 4"},
+       {"time": "1:35\u20132:25", "subject": "BREAK", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "2:25\u20134:55", "subject": "Advance Electronics [P] GROUP A/B ALT WEEK", "instructor": "JKM + SJ", "room": "Electronics Lab 3"}
+     ],
+     "friday": [
+       {"time": "10:15\u201311:55", "subject": "Advanced Electronics [L]", "instructor": "NA", "room": "Lecture Room 4"},
+       {"time": "11:55\u201312:45", "subject": "MATH III [T]", "instructor": "GBJ", "room": "Lecture Room 4"},
+       {"time": "12:45\u20131:35", "subject": "Math III [L]", "instructor": "SG", "room": "Lecture Room 4"},
+       {"time": "1:35\u20132:25", "subject": "BREAK", "instructor": "None", "room": "Lecture Room 4"},
+       {"time": "2:25\u20134:55", "subject": "Microprocessor[Practical] B", "instructor": "Dr. RKS + KN", "room": "Electronics Lab 4"}
+     ]
+   }
+ }
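The `--override` import expects every entry to carry `time`, `subject`, `instructor`, and `room`, matching the `Schedule` model's columns. A small standard-library sanity check for this format; the trimmed `sample` and the `validate` helper are illustrative, not part of the bot:

```python
import json

# A trimmed sample in the same shape as main.json: metadata plus
# per-group, per-day entry lists.
sample = json.loads("""
{
  "metadata": {"default_lecture_room": "Lecture Room 4"},
  "GroupA": {
    "sunday": [
      {"time": "10:15-11:55", "subject": "Control System [L+R]",
       "instructor": "SKR", "room": "Lecture Room 4"}
    ]
  }
}
""")

REQUIRED = {"time", "subject", "instructor", "room"}

def validate(data):
    """Yield (group, day, index) for every entry missing a required key."""
    for group, days in data.items():
        if group == "metadata":
            continue
        for day, entries in days.items():
            for i, entry in enumerate(entries):
                if not REQUIRED <= entry.keys():
                    yield (group, day, i)

print(list(validate(sample)))  # an empty list means every entry is complete
```

Running a check like this before a bulk import gives a precise location for any malformed entry instead of a mid-import database error.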
requirements.txt ADDED
@@ -0,0 +1,9 @@
+ discord.py>=2.3.0
+ python-dotenv>=1.0.0
+ sqlalchemy>=2.0.0
+ aiohttp>=3.9.0
+ mistralai>=0.1.0
+ requests
+ flask
+ psycopg2-binary
+ APScheduler==3.10.4
scripts/start.sh ADDED
@@ -0,0 +1,8 @@
+ #!/usr/bin/env bash
+ set -euo pipefail
+
+ # Run from the project root (/app) and ensure the Python path includes both the root and src
+ cd "$(dirname "$0")/.."
+ export PYTHONPATH="/app:/app/src"
+ cd /app/src
+ exec python main.py
src/cogs/ai.py ADDED
@@ -0,0 +1,35 @@
+ import discord
+ from discord.ext import commands
+ from database.database import get_db
+ from database.models import Assessment
+ from mistral_client import ask_mistral
+
+ class AI(commands.Cog):
+     def __init__(self, bot):
+         self.bot = bot
+
+     #==============================================================================================================
+     #===============================================>AI ASK<=======================================================
+     #==============================================================================================================
+     @commands.command()
+     async def ask(self, ctx, *, question: str = None):
+         if not question:
+             await ctx.send('Usage: `!ask <question>`')
+             return
+         # Build context from the local DB first
+         db = get_db()
+         try:
+             exams = db.query(Assessment).order_by(Assessment.date).all()
+             context = "Upcoming exams: " + "; ".join([f"{e.subject} on {e.date}" for e in exams[:5]])
+         finally:
+             db.close()
+
+         # Then ask Mistral with that context
+         ai_answer = await ask_mistral(question, context)
+         if ai_answer:
+             await ctx.send(f"🤖 **AI Answer:**\n{ai_answer}")
+         else:
+             await ctx.send("I couldn't find an answer. Try asking the CR or teacher!")
+
+ async def setup(bot):
+     await bot.add_cog(AI(bot))
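The `ask` command above prepends a short exam summary (capped at five rows) to every model call. A standalone sketch of just that formatting step, with plain `(subject, date)` tuples standing in for the SQLAlchemy `Assessment` rows:

```python
# Illustration only: mirrors the context string built in src/cogs/ai.py,
# with (subject, date) tuples standing in for Assessment ORM objects.
def build_exam_context(exams, limit=5):
    """Join the first `limit` exams into the context string passed to the model."""
    return "Upcoming exams: " + "; ".join(
        f"{subject} on {date}" for subject, date in exams[:limit]
    )

exams = [("Math", "2025-12-01"), ("Physics", "2025-12-03")]
print(build_exam_context(exams))
# → Upcoming exams: Math on 2025-12-01; Physics on 2025-12-03
```

Note that with no exams in the DB the string degrades to the bare `"Upcoming exams: "` prefix, which still gets sent to the model.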
src/cogs/assignments.py ADDED
@@ -0,0 +1,249 @@
+ import discord
+ import io
+ from discord.ext import commands
+ import re
+ from database.database import get_db
+ from database.models import Assignment, Note, Material
+ from utils import is_cr
+
+ class Assignments(commands.Cog):
+     def __init__(self, bot):
+         self.bot = bot
+
+     #==============================================================================================================
+     #============================================>ASSIGNMENTS<=====================================================
+     #==============================================================================================================
+     @commands.group(invoke_without_command=True)
+     async def assignment(self, ctx):
+         await ctx.send("Usage: `!assignment add/list/delete`")
+
+     @assignment.command()
+     async def add(self, ctx, *, args: str = None):
+         try:
+             if not args:
+                 await ctx.send('Usage: `!assignment add Subject="Math" Topic="Algebra" Due="2025-12-01"`')
+                 return
+             # Robust parsing with regex: captures Key="Value" or Key='Value'
+             matches = re.findall(r'(\w+)="([^"]*)"|(\w+)=\'([^\']*)\'', args)
+             parts = {}
+             for match in matches:
+                 key = match[0] or match[2]
+                 value = match[1] or match[3]
+                 parts[key] = value
+
+             subject = parts.get("Subject", "").strip()
+             topic = parts.get("Topic", "").strip()
+             due = parts.get("Due", "").strip()
+
+             if not all([subject, topic, due]):
+                 raise ValueError("Missing fields")
+
+             db = get_db()
+             try:
+                 new_assignment = Assignment(subject=subject.title(), topic=topic, due_date=due)
+                 db.add(new_assignment)
+                 db.commit()
+                 await ctx.send(f"Assignment added: **{topic}** for **{subject.title()}**, due **{due}**")
+             finally:
+                 db.close()
+         except Exception:
+             await ctx.send('Usage: `!assignment add Subject="Math" Topic="Algebra" Due="2025-12-01"`\n\n**Examples:**\n- `Subject="Control system" Topic="Chapter 1" Due="2025-12-01"`\n- `Subject="Math" Topic="Algebra" Due="2025-12-01"`')
+
+     @assignment.command()
+     async def list(self, ctx):
+         db = get_db()
+         try:
+             assignments = db.query(Assignment).order_by(Assignment.due_date).all()
+
+             if not assignments:
+                 await ctx.send("No assignments pending.")
+                 return
+
+             msg = "**Pending Assignments**\n"
+             for a in assignments:
+                 msg += f"{a.id}. **{a.subject}** - {a.topic} (Due: {a.due_date})\n"
+             await ctx.send(msg)
+         finally:
+             db.close()
+
+     @assignment.command()
+     @is_cr()
+     async def delete(self, ctx, index: int):
+         db = get_db()
+         try:
+             assignment = db.query(Assignment).filter(Assignment.id == index).first()
+             if assignment:
+                 db.delete(assignment)
+                 db.commit()
+                 await ctx.send(f"Deleted assignment: {assignment.topic} ({assignment.subject})")
+             else:
+                 await ctx.send("Assignment not found.")
+         finally:
+             db.close()
+
+     # NOTES (view-only) & MATERIALS (similar structure)
+     @commands.group(invoke_without_command=True)
+     async def notes(self, ctx, subject=None):
+         """View study notes for a subject. Adding/removing notes is disabled.
+
+         Usage: `!notes <Subject>`
+         """
+         if subject:
+             db = get_db()
+             try:
+                 links = db.query(Note).filter(Note.subject == subject.title()).all()
+                 if links:
+                     msg = f"**Study Notes for {subject.title()}**\n"
+                     for l in links:
+                         msg += f"• {l.link}\n"
+                     await ctx.send(msg)
+                 else:
+                     await ctx.send(f"No notes found for **{subject.title()}**")
+             finally:
+                 db.close()
+         else:
+             await ctx.send("Usage: `!notes <Subject>`")
+
+     # MATERIALS (Google Drive)
+     @commands.group(invoke_without_command=True)
+     async def materials(self, ctx):
+         if ctx.invoked_subcommand is None:
+             await ctx.send("Usage: `!materials [subject]/all/add/delete`\nNote: `add` and `delete` are CR-only commands.")
+
+     @materials.command()
+     async def all(self, ctx):
+         db = get_db()
+         try:
+             items = db.query(Material).all()
+         except Exception:
+             db.close()
+             await ctx.send("❌ Failed to read materials from the database.")
+             return
+
+         try:
+             if not items:
+                 await ctx.send("No materials uploaded yet.")
+                 return
+
+             header = "**All Google Drive Materials**\n"
+             # Build messages under Discord's 2000-char limit (leave some headroom)
+             max_len = 1900
+             current = header
+             for i in items:
+                 line = f"• **{i.subject}**: {i.drive_link}\n"
+                 if len(current) + len(line) > max_len:
+                     try:
+                         await ctx.send(current)
+                     except Exception:
+                         # Fallback: send as a file (use BytesIO)
+                         try:
+                             bio = io.BytesIO(current.encode('utf-8'))
+                             bio.seek(0)
+                             await ctx.send(file=discord.File(bio, filename='materials.txt'))
+                         except Exception:
+                             await ctx.send("❌ Failed to send materials list (message too large).")
+                             return
+                     current = header + line
+                 else:
+                     current += line
+
+             # Send the remainder
+             try:
+                 await ctx.send(current)
+             except Exception:
+                 try:
+                     bio = io.BytesIO(current.encode('utf-8'))
+                     bio.seek(0)
+                     await ctx.send(file=discord.File(bio, filename='materials.txt'))
+                 except Exception:
+                     await ctx.send("❌ Failed to send materials list.")
+         finally:
+             db.close()
+
+
+     @materials.command()
+     @is_cr()
+     async def add(self, ctx, *, args: str = None):
+         """Add a Google Drive material link. CR-only.
+
+         Expected format: Subject="SubjectName" Link="https://drive.link/..."
+         """
+         if not args:
+             await ctx.send('Usage: `!materials add Subject="Math" Link="https://..."`')
+             return
+         try:
+             matches = re.findall(r'(\w+)="([^"]*)"|(\w+)=\'([^\']*)\'', args)
+             parts = {}
+             for match in matches:
+                 key = match[0] or match[2]
+                 value = match[1] or match[3]
+                 parts[key] = value
+
+             subject = parts.get("Subject", "").strip()
+             link = parts.get("Link", "").strip()
+             if not subject or not link:
+                 raise ValueError("Missing Subject or Link")
+
+             db = get_db()
+             try:
+                 new_material = Material(subject=subject.title(), drive_link=link)
+                 db.add(new_material)
+                 db.commit()
+                 await ctx.send(f"✅ Material added for **{subject.title()}**: {link}")
+             finally:
+                 db.close()
+         except ValueError as ve:
+             await ctx.send(f"❌ {ve}. Usage: `!materials add Subject=\"Math\" Link=\"https://...\"`")
+         except Exception:
+             await ctx.send("❌ Failed to add material. Ensure the command format is correct.")
+
+
+     @materials.command()
+     @is_cr()
+     async def delete(self, ctx, *, args: str = None):
+         """Delete a material by subject+link. CR-only.
+
+         Expected: Subject="Math" Link="https://..."
+         """
+         # If the user didn't provide args, show usage instead of raising a framework error
+         if not args:
+             await ctx.send('Usage: `!materials delete Subject="Math" Link="https://..."`')
+             return
+         try:
+             matches = re.findall(r'(\w+)="([^"]*)"|(\w+)=\'([^\']*)\'', args)
+             parts = {}
+             for match in matches:
+                 key = match[0] or match[2]
+                 value = match[1] or match[3]
+                 parts[key] = value
+
+             subject = parts.get("Subject", "").strip()
+             link = parts.get("Link", "").strip()
+             if not subject or not link:
+                 raise ValueError("Missing Subject or Link")
+
+             db = get_db()
+             try:
+                 # Find and delete
+                 # Note: this deletes ALL matching entries.
+                 rows = db.query(Material).filter(
+                     Material.subject == subject.title(),
+                     Material.drive_link == link
+                 ).all()
+
+                 if rows:
+                     for row in rows:
+                         db.delete(row)
+                     db.commit()
+                     await ctx.send(f"✅ Material deleted for **{subject.title()}**")
+                 else:
+                     await ctx.send("⚠️ No matching material found to delete.")
+             finally:
+                 db.close()
+         except ValueError as ve:
+             await ctx.send(f"❌ {ve}. Usage: `!materials delete Subject=\"Math\" Link=\"https://...\"`")
+         except Exception:
+             await ctx.send("❌ Failed to delete material. Ensure the format is correct and the material exists.")
+
+ async def setup(bot):
+     await bot.add_cog(Assignments(bot))
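The `assignment add`, `materials add`, and `materials delete` commands above all share the same `re.findall` pattern for `Key="Value"` arguments. A standalone check of how that pattern behaves (the alternation fills either groups 1–2 for double-quoted pairs or 3–4 for single-quoted ones):

```python
import re

# Same pattern as the cog: Key="Value" or Key='Value'
PATTERN = r'(\w+)="([^"]*)"|(\w+)=\'([^\']*)\''

def parse_kv(args: str) -> dict:
    parts = {}
    for m in re.findall(PATTERN, args):
        key = m[0] or m[2]         # groups 1/3: the key, depending on quote style
        parts[key] = m[1] or m[3]  # groups 2/4: the value
    return parts

print(parse_kv('Subject="Control system" Topic=\'Chapter 1\' Due="2025-12-01"'))
# → {'Subject': 'Control system', 'Topic': 'Chapter 1', 'Due': '2025-12-01'}
```

One caveat: an explicitly empty value (`Subject=""`) falls through the `or` to `''`, which the cog then rejects via its "Missing fields" / "Missing Subject or Link" checks.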
src/cogs/general.py ADDED
@@ -0,0 +1,82 @@
+ import discord
+ from discord.ext import commands
+
+ class General(commands.Cog):
+     def __init__(self, bot):
+         self.bot = bot
+
+     #==============================================================================================================
+     #===============================================>HELP<=========================================================
+     #==============================================================================================================
+     @commands.command(name="bothelp", aliases=["commands"])
+     async def bothelp(self, ctx, *, topic: str = None):
+         """Show bot help. Without args, lists categories and top-level commands.
+         Provide a topic (cog name or command) to see detailed commands.
+         """
+         # Collect top-level commands (ignore hidden)
+         top_cmds = [c for c in self.bot.commands if not getattr(c, "parent", None) and not c.hidden]
+
+         # Group by cog name
+         cogs = {}
+         for cmd in top_cmds:
+             cog_name = cmd.cog_name or "General"
+             cogs.setdefault(cog_name, []).append(cmd)
+
+         if not topic:
+             # Build summary message
+             msg_lines = ["**Class Assistant Bot - Available command categories**\n"]
+             for cog_name, cmds in sorted(cogs.items()):
+                 # Show cog header and top-level command names
+                 cmd_list = ", ".join([f"`!{c.name}`" for c in cmds])
+                 msg_lines.append(f"**{cog_name}:** {cmd_list}")
+
+             msg_lines.append("\nType `!bothelp <category>` or `!bothelp <command>` for details. Example: `!bothelp schedule`")
+             await ctx.send("\n".join(msg_lines))
+             return
+
+         # Topic provided -> try to match a cog name first
+         topic_lower = topic.strip().lower()
+
+         # Match cog
+         for cog_name, cmds in cogs.items():
+             if cog_name.lower() == topic_lower:
+                 lines = [f"**{cog_name} commands:**\n"]
+                 for cmd in sorted(cmds, key=lambda x: x.name):
+                     if hasattr(cmd, "commands") and cmd.commands:
+                         # It's a Group; list its subcommands
+                         lines.append(f"`!{cmd.name}` - {cmd.help or 'Group of related commands.'}")
+                         for sc in cmd.commands:
+                             if sc.hidden:
+                                 continue
+                             usage = f"!{sc.qualified_name} {sc.signature}".strip()
+                             lines.append(f" • `{usage}` - {sc.help or ''}")
+                     else:
+                         usage = f"!{cmd.qualified_name} {cmd.signature}".strip()
+                         lines.append(f"`{usage}` - {cmd.help or ''}")
+
+                 await ctx.send("\n".join(lines))
+                 return
+
+         # Match a top-level command name
+         for cmd in top_cmds:
+             if cmd.name.lower() == topic_lower:
+                 # If it's a group, show subcommands
+                 if hasattr(cmd, "commands") and cmd.commands:
+                     lines = [f"**{cmd.name} (group) commands:**\n", f"{cmd.help or 'Group of related commands.'}"]
+                     for sc in cmd.commands:
+                         if sc.hidden:
+                             continue
+                         usage = f"!{sc.qualified_name} {sc.signature}".strip()
+                         lines.append(f"• `{usage}` - {sc.help or ''}")
+                     await ctx.send("\n".join(lines))
+                     return
+                 else:
+                     usage = f"!{cmd.qualified_name} {cmd.signature}".strip()
+                     await ctx.send(f"`{usage}` - {cmd.help or 'No description available.'}")
+                     return
+
+         # No match found
+         await ctx.send("I couldn't find that category or command. Try `!bothelp` to see categories.")
+
+ async def setup(bot):
+     await bot.add_cog(General(bot))
src/cogs/schedule.py ADDED
@@ -0,0 +1,379 @@
+ import discord
+ from discord.ext import commands
+ from datetime import datetime, timedelta
+ from database.database import get_db
+ from database.models import Schedule as ScheduleModel
+ from utils import (
+     is_cr, get_week_key, load_main_schedule_from_file, save_main_schedule_to_file,
+     apply_temp_replacement, apply_temp_cancellation, merge_schedule_for_week,
+     apply_temp_changes_to_db_rows, _parse_edit_cancel_args, _normalize_time,
+     _normalize_subject, MAIN_SCHEDULE, TEMP_CHANGES
+ )
+ import utils
+
+ class Schedule(commands.Cog):
+     def __init__(self, bot):
+         self.bot = bot
+
+     def _format_schedule_entry(self, e):
+         """Helper to format a single schedule entry."""
+         subject = e.get('subject', 'Unknown')
+         time = e.get('time', 'Unknown')
+         room = e.get('room', '')
+         instructor = e.get('instructor', '')
+         note = e.get('note', '')
+
+         # Parse type from subject
+         type_str = "Unknown"
+         if "(L)" in subject:
+             type_str = "Lecture"
+         elif "(T)" in subject:
+             type_str = "Tutorial"
+         elif "(P)" in subject or "Lab" in subject or "Practical" in subject:
+             type_str = "Lab"
+         elif "(L+T)" in subject:
+             type_str = "Lecture + Tutorial"
+
+         # Parse alternate-week flag from note
+         alternate = "False"
+         if "Alt. week" in note or "Alternate" in note:
+             alternate = "True"
+
+         entry_str = f"Subject: {subject}\n"
+         entry_str += f" - time : {time}\n"
+         if instructor:
+             entry_str += f" - instructor : {instructor}\n"
+         if room:
+             entry_str += f" - room : {room}\n"
+         entry_str += f" - type : {type_str}\n"
+         entry_str += f" - alternate : {alternate}\n"
+         entry_str += "-------------------------------------------------------"
+
+         return entry_str
+
+     @commands.group(invoke_without_command=True)
+     async def schedule(self, ctx):
+         await ctx.send("Usage: `!schedule add/delete/today/week`")
+
+     @schedule.command()
+     async def today(self, ctx):
+         weekday = datetime.now().strftime("%A")
+         day_l = weekday.lower()
+
+         db = get_db()
+         try:
+             rows = db.query(ScheduleModel).filter(ScheduleModel.day.ilike(day_l)).all()
+
+             week_key = get_week_key()
+             merged = apply_temp_changes_to_db_rows(rows, week_key)
+
+             if not merged:
+                 await ctx.send(f"No classes scheduled for **{weekday}**.")
+                 return
+
+             msg = f"**{weekday}:**\n"
+             msg += "--------------------------------------------------------\n"
+
+             grouped = {}
+             for e in merged:
+                 grp = e.get('group_name', 'General')
+                 if grp not in grouped:
+                     grouped[grp] = []
+                 grouped[grp].append(e)
+
+             for grp, entries in grouped.items():
+                 msg += f"**{grp}:**\n"
+                 msg += "--------------------------------------------------------\n"
+                 for e in sorted(entries, key=lambda x: x['time']):
+                     msg += self._format_schedule_entry(e) + "\n"
+
+             await ctx.send(msg)
+         finally:
+             db.close()
+
+     @schedule.command()
+     async def tomorrow(self, ctx):
+         tomorrow_date = datetime.now() + timedelta(days=1)
+         weekday = tomorrow_date.strftime("%A")
+         day_l = weekday.lower()
+
+         db = get_db()
+         try:
+             rows = db.query(ScheduleModel).filter(ScheduleModel.day.ilike(day_l)).all()
+
+             week_key = get_week_key()
+             merged = apply_temp_changes_to_db_rows(rows, week_key)
+
+             if not merged:
+                 await ctx.send(f"No classes scheduled for **{weekday}**.")
+                 return
+
+             msg = f"**{weekday}:**\n"
+             msg += "--------------------------------------------------------\n"
+
+             grouped = {}
+             for e in merged:
+                 grp = e.get('group_name', 'General')
+                 if grp not in grouped:
+                     grouped[grp] = []
+                 grouped[grp].append(e)
+
+             for grp, entries in grouped.items():
+                 msg += f"**{grp}:**\n"
+                 msg += "--------------------------------------------------------\n"
+                 for e in sorted(entries, key=lambda x: x['time']):
+                     msg += self._format_schedule_entry(e) + "\n"
+
+             await ctx.send(msg)
+         finally:
+             db.close()
+
+     @schedule.command()
+     async def day(self, ctx, day_name: str):
+         day_name = day_name.lower()
+         valid_days = ['monday', 'tuesday', 'wednesday', 'thursday', 'friday', 'saturday', 'sunday']
+         if day_name not in valid_days:
+             await ctx.send("Please provide a valid day of the week (e.g., monday, tuesday, etc.).")
+             return
+
+         db = get_db()
+         try:
+             rows = db.query(ScheduleModel).filter(ScheduleModel.day.ilike(day_name)).all()
+
+             week_key = get_week_key()
+             merged = apply_temp_changes_to_db_rows(rows, week_key)
+
+             if not merged:
+                 await ctx.send(f"No classes scheduled for **{day_name.capitalize()}**.")
+                 return
+
+             msg = f"**{day_name.capitalize()}:**\n"
+             msg += "--------------------------------------------------------\n"
+
+             grouped = {}
+             for e in merged:
+                 grp = e.get('group_name', 'General')
+                 if grp not in grouped:
+                     grouped[grp] = []
+                 grouped[grp].append(e)
+
+             for grp, entries in grouped.items():
+                 msg += f"**{grp}:**\n"
+                 msg += "--------------------------------------------------------\n"
+                 for e in sorted(entries, key=lambda x: x['time']):
+                     msg += self._format_schedule_entry(e) + "\n"
+
+             await ctx.send(msg)
+         finally:
+             db.close()
+
+     @schedule.command()
+     async def week(self, ctx):
+         days = ["sunday", "monday", "tuesday", "wednesday", "thursday", "friday", "saturday"]
+         week_key = get_week_key()
+
+         db = get_db()
+         try:
+             msg = ""
+             any_entry = False
+             for d in days:
+                 rows = db.query(ScheduleModel).filter(ScheduleModel.day.ilike(d)).all()
+                 merged = apply_temp_changes_to_db_rows(rows, week_key)
+                 if merged:
+                     any_entry = True
+                     day_msg = f"**{d.title()}:**\n"
+                     day_msg += "--------------------------------------------------------\n"
+
+                     grouped = {}
+                     for e in merged:
+                         grp = e.get('group_name', 'General')
+                         if grp not in grouped:
+                             grouped[grp] = []
+                         grouped[grp].append(e)
+
+                     for grp, entries in grouped.items():
+                         day_msg += f"**{grp}:**\n"
+                         day_msg += "--------------------------------------------------------\n"
+                         for e in sorted(entries, key=lambda x: x['time']):
+                             day_msg += self._format_schedule_entry(e) + "\n"
+
+                     if len(msg) + len(day_msg) > 1900:
+                         await ctx.send(msg)
+                         msg = day_msg
+                     else:
+                         msg += day_msg
+
+             if not any_entry:
+                 await ctx.send("No schedule set yet.")
+                 return
+
+             if msg:
+                 await ctx.send(msg)
+         finally:
+             db.close()
+
+     @schedule.command()
+     @is_cr()
+     async def delete(self, ctx, day: str, time: str, *, subject: str):
+         db = get_db()
+         try:
+             rows = db.query(ScheduleModel).filter(
+                 ScheduleModel.day == day.title(),
+                 ScheduleModel.time == time,
+                 ScheduleModel.subject == subject.title()
+             ).all()
+
+             if rows:
+                 for row in rows:
+                     db.delete(row)
+                 db.commit()
+                 await ctx.send(f"Deleted: {subject.title()} on {day.title()} at {time}")
+             else:
+                 await ctx.send("No matching class found.")
+         finally:
+             db.close()
+
+     @schedule.command(name='main')
+     @is_cr()
+     async def schedule_main(self, ctx, subcmd: str, *, filename: str):
+         if subcmd.lower() != 'routine':
+             await ctx.send("Usage: `!schedule main routine \"assets/main.json\"`")
+             return
+         filename = filename.strip('"').strip("'")
+         try:
+             schedule_data = load_main_schedule_from_file(filename)
+
+             db = get_db()
+             try:
+                 db.query(ScheduleModel).delete()
+
+                 for group, days in schedule_data.items():
+                     for day, entries in days.items():
+                         for e in entries:
+                             new_entry = ScheduleModel(
+                                 day=day.title(),
+                                 time=e.get('time'),
+                                 subject=e.get('subject'),
+                                 group_name=group,
+                                 room=e.get('room', ''),
+                                 instructor=e.get('instructor', ''),
+                                 note=e.get('note', '')
+                             )
+                             db.add(new_entry)
+                 db.commit()
+                 await ctx.send(f"✅ Main schedule loaded successfully from `{filename}` and written to database.")
+             finally:
+                 db.close()
+         except FileNotFoundError:
+             await ctx.send(f"❌ File not found: `{filename}`")
+         except Exception as e:
+             await ctx.send(f"❌ Failed to load schedule: {e}")
+
+     @schedule.command()
+     @is_cr()
+     async def edit(self, ctx, *, args: str = None):
+         """Edit a class time/subject. Usage:
+         !schedule edit GroupA monday 9:00 AM Math 10:00 AM Math permanent|temporary
+         """
+         if not args:
+             await ctx.send("❌ Invalid format. Usage: `!schedule edit GroupA monday 9:00 AM Math 10:00 AM Math permanent` (use 'temporary' or 'permanent')")
+             return
+         parsed = _parse_edit_cancel_args(args)
+         if not parsed or 'new_time' not in parsed:
+             await ctx.send("❌ Invalid format. Usage: `!schedule edit GroupA monday 9:00 AM Math 10:00 AM Math permanent` (use 'temporary' or 'permanent')")
+             return
+
+         group = parsed['group']
+         day = parsed['day']
+         orig_time = parsed['orig_time']
+         orig_subject = parsed['orig_subject']
+         new_time = parsed['new_time']
+         new_subject = parsed['new_subject']
+         permanent = parsed['permanent']
+
+         if not utils.MAIN_SCHEDULE:
+             await ctx.send("❌ No main schedule loaded. Use `!schedule main routine \"main.json\"` first.")
+             return
+
+         if permanent:
+             group_data = utils.MAIN_SCHEDULE.get(group)
+             if not group_data or day not in group_data:
+                 await ctx.send("⚠️ Group or day not found in main schedule.")
+                 return
+             entries = group_data[day]
+             for e in entries:
+                 if _normalize_time(e.get('time')) == _normalize_time(orig_time) and _normalize_subject(e.get('subject')) == _normalize_subject(orig_subject):
+                     e['time'] = new_time
+                     e['subject'] = new_subject
+                     try:
+                         save_main_schedule_to_file()
+                         await ctx.send(f"✅ Schedule for {group} on {day.title()} updated permanently: {orig_time} {orig_subject} -> {new_time} {new_subject}")
+                     except Exception as ex:
+                         await ctx.send(f"❌ Failed to save main schedule: {ex}")
+                     return
+             await ctx.send("⚠️ Matching class not found in main schedule.")
+         else:
+             wk = get_week_key()
+             apply_temp_replacement(wk, group, day, orig_time, orig_subject, {'time': new_time, 'subject': new_subject, 'room': ''})
+             await ctx.send(f"✅ Schedule for {group} on {day.title()} {orig_time} {orig_subject} has been temporarily changed to {new_time} {new_subject} for this week.")
+
+     @schedule.command()
+     @is_cr()
+     async def cancel(self, ctx, *, args: str = None):
+         """Cancel a class. Usage: !schedule cancel GroupA monday 9:00 AM Math permanent|temporary"""
+         if not args:
+             await ctx.send("❌ Invalid format. Usage: `!schedule cancel GroupA monday 9:00 AM Math permanent`")
+             return
+         parsed = _parse_edit_cancel_args(args)
+         if not parsed:
+             await ctx.send("❌ Invalid format. Usage: `!schedule cancel GroupA monday 9:00 AM Math permanent`")
+             return
+
+         group = parsed['group']
+         day = parsed['day']
+         orig_time = parsed['orig_time']
+         orig_subject = parsed['orig_subject']
+         permanent = parsed.get('permanent', False)
+
+         if permanent:
+             if not utils.MAIN_SCHEDULE:
+                 await ctx.send("❌ No main schedule loaded.")
+                 return
+             group_data = utils.MAIN_SCHEDULE.get(group)
+             if not group_data or day not in group_data:
+                 await ctx.send("⚠️ Group or day not found in main schedule.")
+                 return
+             entries = group_data[day]
+             for i, e in enumerate(entries):
+                 if _normalize_time(e.get('time')) == _normalize_time(orig_time) and _normalize_subject(e.get('subject')) == _normalize_subject(orig_subject):
+                     entries.pop(i)
+                     try:
+                         save_main_schedule_to_file()
+                         await ctx.send(f"✅ {orig_subject} on {day.title()} at {orig_time} permanently cancelled for {group}.")
+                     except Exception as ex:
+                         await ctx.send(f"❌ Failed to save main schedule: {ex}")
+                     return
+             await ctx.send("⚠️ Matching class not found in main schedule.")
+         else:
+             wk = get_week_key()
+             apply_temp_cancellation(wk, group, day, orig_time, orig_subject)
+             await ctx.send(f"✅ {orig_subject} on {day.title()} at {orig_time} temporarily cancelled for {group} this week.")
+
+     @schedule.command()
+     async def view(self, ctx, group: str, day: str):
+         if not utils.MAIN_SCHEDULE:
+             await ctx.send("❌ No main schedule loaded. Use `!schedule main routine \"main.json\"` first.")
+             return
+         day_l = day.lower()
+         week_key = get_week_key()
+         merged = merge_schedule_for_week(group, day_l, week_key)
+         if not merged:
+             await ctx.send(f"No schedule found for **{group}** on **{day.title()}**")
+             return
+         msg = f"**{group} schedule for {day.title()} (week {week_key[1]})**\n"
+         for e in merged:
+             msg += self._format_schedule_entry(e) + "\n"
+         await ctx.send(msg)
+
+ async def setup(bot):
+     await bot.add_cog(Schedule(bot))
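Both `materials all` (in assignments.py) and `schedule week` above accumulate output and flush whenever the next chunk would push a message past a 1900-character threshold, leaving headroom under Discord's 2000-character cap. The splitting logic, isolated as a sketch (the header repetition follows the `materials all` variant; a single oversized line can still exceed the cap, which is why the cog keeps a file-upload fallback):

```python
def chunk_messages(header: str, lines: list[str], max_len: int = 1900) -> list[str]:
    """Pack lines into messages under max_len, repeating the header per message."""
    messages = []
    current = header
    for line in lines:
        if len(current) + len(line) > max_len:
            messages.append(current)      # flush the full message
            current = header + line      # start a new one with the header
        else:
            current += line
    if current:
        messages.append(current)          # flush the remainder
    return messages
```

For example, three 1001-character lines under a short header split into three messages, each starting with the header and staying under the limit.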
src/cogs/tasks.py ADDED
@@ -0,0 +1,40 @@
+ import discord
+ from discord.ext import commands
+ from datetime import datetime
+ from apscheduler.schedulers.asyncio import AsyncIOScheduler
+ from database.database import get_db
+ from database.models import Assessment
+ from configuration.config import CHANNEL_ID
+
+ class Tasks(commands.Cog):
+     def __init__(self, bot):
+         self.bot = bot
+         self.scheduler = AsyncIOScheduler()
+         self.scheduler.start()
+         # Add the job if it is not already present (in a cog, this runs once on load)
+         if not self.scheduler.get_job("daily_reminder"):
+             self.scheduler.add_job(self.daily_assessment_reminder, 'cron', hour=18, minute=0, id="daily_reminder")
+
+     async def daily_assessment_reminder(self):
+         today = datetime.now().strftime("%Y-%m-%d")
+         db = get_db()
+         try:
+             assessments = db.query(Assessment).filter(Assessment.date == today).all()
+
+             if assessments and CHANNEL_ID:
+                 channel = self.bot.get_channel(CHANNEL_ID)
+                 if channel:
+                     msg = "**🔔 Today's Assessments Reminder**\n\n"
+                     for a in assessments:
+                         time = a.time or "Time not set"
+                         desc = a.description or "No description"
+                         msg += f"• **{a.subject}**: {desc} at {time}\n"
+                     await channel.send("@Class\n" + msg)
+         finally:
+             db.close()
+
+     def cog_unload(self):
+         self.scheduler.shutdown()
+
+ async def setup(bot):
+     await bot.add_cog(Tasks(bot))
src/main.py ADDED
@@ -0,0 +1,223 @@
+ #!/usr/bin/env python3
+ """Bot entrypoint.
+
+ Supports an optional --override flag to control whether the schedule import from main.json
+ overwrites the database.
+
+ Also configures logging and logs incoming messages ("message to bot") and messages
+ sent by the bot ("message by bot").
+ """
+ import argparse
+ import logging
+ import os
+ import sys
+ import asyncio
+ import discord
+ from discord.ext import commands
+
+ # Ensure project root is on sys.path so packages at the repo root (e.g., database, configuration)
+ # can be imported when running this file as a script: `python src/main.py`.
+ PROJECT_ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), '..'))
+ if PROJECT_ROOT not in sys.path:
+     sys.path.insert(0, PROJECT_ROOT)
+
+ from database.database import init_db, get_db
+ from database.models import Schedule as ScheduleModel
+ from configuration.config import TOKEN
+ from utils import load_main_schedule_from_file
+
+
+ # ----- Logging setup -----------------------------------------------------
+ logger = logging.getLogger('discord_bot')
+ logger.setLevel(logging.DEBUG)
+ fmt = logging.Formatter('%(asctime)s %(levelname)s %(message)s')
+ ch = logging.StreamHandler()
+ ch.setLevel(logging.INFO)
+ ch.setFormatter(fmt)
+ fh = logging.FileHandler('/tmp/bot.log', encoding='utf-8')
+ fh.setLevel(logging.DEBUG)
+ fh.setFormatter(fmt)
+ logger.addHandler(ch)
+ logger.addHandler(fh)
+
+
+ # ----- Bot setup ---------------------------------------------------------
+ intents = discord.Intents.default()
+ intents.message_content = True
+ bot = commands.Bot(command_prefix="!", intents=intents)
+
+
+ # Initialize database
+
+
+
+ def auto_import_schedule(override: bool = False):
+     """Import schedule from main.json.
+
+     If override is False, do not import when the Schedule table already has rows.
+     If override is True, clear existing schedule entries and import.
+     """
+     try:
+         # Try multiple possible locations for main.json
+         possible_paths = [
+             os.path.join(PROJECT_ROOT, 'main.json'),
+             os.path.join(os.path.dirname(__file__), '..', 'main.json'),
+             '/app/main.json',
+             'main.json',
+         ]
+
+         logger.debug("🔍 Looking for main.json...")
+         logger.debug(f"  PROJECT_ROOT: {PROJECT_ROOT}")
+         logger.debug(f"  Current dir: {os.getcwd()}")
+
+         main_json_path = None
+         for path in possible_paths:
+             abs_path = os.path.abspath(path)
+             logger.debug(f"  Checking: {abs_path}")
+             if os.path.exists(abs_path):
+                 main_json_path = abs_path
+                 logger.info(f"  ✅ Found at: {abs_path}")
+                 break
+
+         if not main_json_path:
+             logger.warning("❌ main.json not found in any of these locations:")
+             for path in possible_paths:
+                 logger.warning(f"  - {os.path.abspath(path)}")
+             return
+
+         logger.info(f"📅 Importing schedule from {main_json_path}...")
+         schedule_data = load_main_schedule_from_file(main_json_path)
+
+         db = get_db()
+         try:
+             # If not overriding and the DB already has schedule rows, skip the import
+             if not override:
+                 existing = db.query(ScheduleModel).count()
+                 if existing > 0:
+                     logger.info(f"Found {existing} existing schedule entries in DB; skipping import (use --override to replace).")
+                     return
+
+             # Clear existing schedule data (when override is True or the DB is empty)
+             deleted_count = db.query(ScheduleModel).delete()
+             logger.info(f"🗑️ Cleared {deleted_count} existing schedule entries")
+
+             # Insert new data from main.json
+             entry_count = 0
+             for group, days in schedule_data.items():
+                 # Skip any top-level keys that are not schedule groups (e.g., metadata)
+                 if not isinstance(days, dict):
+                     continue
+                 for day, entries in days.items():
+                     # entries should be a list of dicts; skip otherwise
+                     if not isinstance(entries, list):
+                         continue
+                     for e in entries:
+                         if not isinstance(e, dict):
+                             continue
+                         new_entry = ScheduleModel(
+                             day=day.title(),
+                             time=e.get('time'),
+                             subject=e.get('subject'),
+                             group_name=group,
+                             room=e.get('room', ''),
+                             instructor=e.get('instructor', ''),
+                             note=e.get('note', ''),
+                         )
+                         db.add(new_entry)
+                         entry_count += 1
+             db.commit()
+             logger.info(f"✅ Schedule imported successfully! ({entry_count} entries)")
+         finally:
+             db.close()
+     except Exception as e:
+         logger.exception(f"❌ Error auto-importing schedule: {e}")
+
+
+ @bot.event
+ async def on_ready():
+     logger.info(f"{bot.user} is online!")
+
+
+ @bot.event
+ async def on_command_error(ctx, error):
+     # Global handler for command errors to ensure they are logged and the user is notified
+     try:
+         # Handle missing required argument gracefully
+         if isinstance(error, commands.MissingRequiredArgument):
+             param = error.param.name if hasattr(error, 'param') else 'argument'
+             usage = ''
+             if ctx.command:
+                 usage = f" Usage: `!{ctx.command.qualified_name} {ctx.command.signature}`"
151
+ await ctx.send(f"❌ Missing required argument: `{param}`.{usage}")
152
+ logger.warning(f"Missing argument in command {ctx.command}: {param}")
153
+ return
154
+
155
+ logger.exception(f"Error in command '{getattr(ctx, 'command', None)}': {error}")
156
+ # Friendly message to channel
157
+ await ctx.send("❌ An error occurred while processing your command. The error has been logged.")
158
+ except Exception:
159
+ logger.exception("Failed in on_command_error handler")
160
+
161
+
162
+ @bot.event
163
+ async def on_message(message: discord.Message):
164
+ try:
165
+ # Ignore webhooks
166
+ if getattr(message, 'webhook_id', None) is not None:
167
+ return
168
+
169
+ if message.author == bot.user:
170
+ # message by bot
171
+ logger.info(f"message by bot: channel={getattr(message.channel, 'name', message.channel.id)} content={message.content}")
172
+ else:
173
+ # message to bot
174
+ logger.info(f"message to bot: author={message.author} channel={getattr(message.channel, 'name', message.channel.id)} content={message.content}")
175
+
176
+ # ensure commands still processed
177
+ await bot.process_commands(message)
178
+ except Exception:
179
+ logger.exception("Error in on_message handler")
180
+
181
+
182
+ async def load_extensions():
183
+ # Load all cogs from the cogs directory
184
+ cogs_dir = os.path.join(os.path.dirname(__file__), 'cogs')
185
+ for filename in os.listdir(cogs_dir):
186
+ if filename.endswith('.py') and filename != '__init__.py':
187
+ await bot.load_extension(f'cogs.{filename[:-3]}')
188
+
189
+
190
+ async def main(argv=None):
191
+ parser = argparse.ArgumentParser(description='Start the Discord bot')
192
+ parser.add_argument('--override', action='store_true', help='Override existing schedule data in DB with main.json')
193
+ args = parser.parse_args(argv)
194
+
195
+ # Initialize database
196
+ try:
197
+ init_db()
198
+ except Exception as e:
199
+ logger.error(f"❌ Database initialization failed: {e}")
200
+ logger.error(" Check your DATABASE_URL in .env or Supabase credentials.")
201
+ sys.exit(1)
202
+
203
+ # Import schedule according to flag
204
+ try:
205
+ auto_import_schedule(override=args.override)
206
+ except Exception:
207
+ logger.exception('Auto-import schedule failed')
208
+
209
+ # Start bot
210
+ async with bot:
211
+ await load_extensions()
212
+ await bot.start(TOKEN)
213
+
214
+
215
+ if __name__ == '__main__':
216
+ try:
217
+ asyncio.run(main())
218
+ except KeyboardInterrupt:
219
+ # Handle Ctrl+C gracefully
220
+ logger.info('Shutting down (KeyboardInterrupt)')
221
+ except Exception:
222
+ logger.exception('Fatal error in main')
223
+
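The `--override` flag in `main()` controls whether an existing schedule in the DB is replaced. A minimal sketch of just the argparse part (sample arguments are illustrative):

```python
import argparse

# Mirrors the bot's CLI: `python src/main.py --override` replaces any
# schedule rows already in the DB with the contents of main.json.
parser = argparse.ArgumentParser(description='Start the Discord bot')
parser.add_argument('--override', action='store_true',
                    help='Override existing schedule data in DB with main.json')

args = parser.parse_args(['--override'])
print(args.override)  # → True
```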
src/mistral_client.py ADDED
@@ -0,0 +1,35 @@
+ # mistral_client.py
+ import aiohttp
+ import os
+
+ async def ask_mistral(question, context=""):
+     api_key = os.getenv("MISTRAL_API_KEY")
+     if not api_key:
+         return None
+
+     url = "https://api.mistral.ai/v1/chat/completions"
+     headers = {
+         "Authorization": f"Bearer {api_key}",
+         "Content-Type": "application/json"
+     }
+     payload = {
+         "model": "mistral-small-latest",
+         "messages": [
+             {"role": "system", "content": f"You are a helpful class assistant. Use this context if relevant: {context}"},
+             {"role": "user", "content": question}
+         ],
+         "temperature": 0.5,
+         "max_tokens": 300
+     }
+
+     try:
+         async with aiohttp.ClientSession() as session:
+             async with session.post(url, headers=headers, json=payload,
+                                     timeout=aiohttp.ClientTimeout(total=10)) as response:
+                 if response.status == 200:
+                     data = await response.json()
+                     return data["choices"][0]["message"]["content"]
+     except Exception as e:
+         print(f"Mistral API Error: {e}")
+     return None
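The shape of the request body `ask_mistral` posts can be sketched standalone (the question and context here are illustrative):

```python
# Hypothetical inputs, mirroring the payload built in ask_mistral above.
question = "When is the Math class?"
context = "Group A, Monday 9:00 AM Math"
payload = {
    "model": "mistral-small-latest",
    "messages": [
        {"role": "system", "content": f"You are a helpful class assistant. Use this context if relevant: {context}"},
        {"role": "user", "content": question},
    ],
    "temperature": 0.5,
    "max_tokens": 300,
}
```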
src/utils.py ADDED
@@ -0,0 +1,307 @@
+ import os
+ import json
+ import re
+ from datetime import datetime
+ from discord.ext import commands
+ from configuration.config import CR_USER_ID, CR_ROLE_NAME
+
+ # Global state
+ MAIN_SCHEDULE = {}
+ MAIN_SCHEDULE_FILE = None
+ TEMP_CHANGES = {}
+
+ # Project root (one level up from src/); used to resolve files such as main.json
+ # when running as a script: `python src/main.py`.
+ PROJECT_ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), '..'))
+
+ def is_cr():
+     async def predicate(ctx):
+         if CR_USER_ID and ctx.author.id == CR_USER_ID:
+             return True
+         # In DMs ctx.author is a User without .roles, so fall back to an empty list
+         if any(role.name == CR_ROLE_NAME for role in getattr(ctx.author, 'roles', [])):
+             return True
+         await ctx.send("❌ You don't have permission to use this command.")
+         return False
+     return commands.check(predicate)
+
+ def get_week_key(dt=None):
+     d = dt or datetime.now()
+     iso = d.isocalendar()
+     return (iso.year, iso.week)
+
+ def load_main_schedule_from_file(path):
+     global MAIN_SCHEDULE, MAIN_SCHEDULE_FILE
+     # Resolve path: allow passing a filename relative to the project root (where main.json usually lives)
+     candidate_paths = []
+     # Expand user and make absolute
+     p = os.path.expanduser(path)
+     if not os.path.isabs(p):
+         p = os.path.abspath(p)
+     candidate_paths.append(p)
+     # Also try relative to the project root (one level up from src)
+     pr = PROJECT_ROOT
+     candidate_paths.append(os.path.join(pr, path))
+     candidate_paths.append(os.path.join(pr, 'assets', path))
+     candidate_paths.append(os.path.join(pr, 'assets', os.path.basename(path)))
+     candidate_paths.append(os.path.abspath(os.path.join(os.path.dirname(__file__), path)))
+
+     found = None
+     for cp in candidate_paths:
+         if os.path.isfile(cp):
+             found = cp
+             break
+     if not found:
+         raise FileNotFoundError(f"Schedule file not found: {path}. Tried: {', '.join(candidate_paths)}")
+     with open(found, 'r', encoding='utf-8') as f:
+         data = json.load(f)
+     # Normalize day keys to lowercase
+     normalized = {}
+     for group, days in data.items():
+         # Preserve non-group top-level keys (e.g., metadata) untouched
+         if not isinstance(days, dict):
+             normalized[group] = days
+             continue
+         normalized[group] = {}
+         for day, entries in days.items():
+             normalized[group][day.lower()] = entries
+     MAIN_SCHEDULE = normalized
+     MAIN_SCHEDULE_FILE = found
+     return MAIN_SCHEDULE
+
+ def save_main_schedule_to_file(path=None):
+     global MAIN_SCHEDULE, MAIN_SCHEDULE_FILE
+     if path is None:
+         path = MAIN_SCHEDULE_FILE
+     if not path:
+         raise ValueError("No main schedule file set")
+     # Days are written as they are stored in MAIN_SCHEDULE (lowercase day keys)
+     with open(path, 'w', encoding='utf-8') as f:
+         json.dump(MAIN_SCHEDULE, f, indent=2, ensure_ascii=False)
+
+ def _normalize_time(t):
+     if not t:
+         return ''
+     s = str(t).strip()
+     # Ensure there is a space before AM/PM if present (e.g., '9:00AM' -> '9:00 AM')
+     s = re.sub(r'\s*([AaPp][Mm])$', r' \1', s)
+     # Collapse whitespace and upper-case AM/PM
+     s = re.sub(r'\s+', ' ', s).strip()
+     s = re.sub(r'([AaPp][Mm])', lambda m: m.group(1).upper(), s)
+     return s
+
+ def _normalize_subject(sub):
+     if not sub:
+         return ''
+     return str(sub).strip().lower()
+
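The normalization above can be exercised standalone (a copy of `_normalize_time`, for illustration only):

```python
import re

def normalize_time(t):
    # Standalone copy of _normalize_time: '9:00am' -> '9:00 AM'
    if not t:
        return ''
    s = str(t).strip()
    s = re.sub(r'\s*([AaPp][Mm])$', r' \1', s)   # space before AM/PM
    s = re.sub(r'\s+', ' ', s).strip()           # collapse whitespace
    return re.sub(r'([AaPp][Mm])', lambda m: m.group(1).upper(), s)

print(normalize_time('9:00am'))  # → 9:00 AM
```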
+ def apply_temp_replacement(week_key, group, day, orig_time, orig_subject, new_entry):
+     # Store normalized keys for robust matching
+     ot = _normalize_time(orig_time)
+     osub = _normalize_subject(orig_subject)
+     # Normalize the new entry's time/subject for consistency
+     new_e = dict(new_entry)
+     if 'time' in new_e:
+         new_e['time'] = _normalize_time(new_e['time'])
+     if 'subject' in new_e:
+         new_e['subject'] = new_e['subject'].strip()
+     TEMP_CHANGES.setdefault(week_key, {}).setdefault(group, {}).setdefault(day, {}).setdefault('replacements', []).append(((ot, osub), new_e))
+
+ def apply_temp_cancellation(week_key, group, day, orig_time, orig_subject):
+     ot = _normalize_time(orig_time)
+     osub = _normalize_subject(orig_subject)
+     TEMP_CHANGES.setdefault(week_key, {}).setdefault(group, {}).setdefault(day, {}).setdefault('cancellations', []).append((ot, osub))
+
+ def merge_schedule_for_week(group, day, week_key=None):
+     """Return the schedule entries for the given group and day, applying any
+     temporary changes recorded for week_key."""
+     week_key = week_key or get_week_key()
+     base = []
+     if group in MAIN_SCHEDULE and day in MAIN_SCHEDULE[group]:
+         # Clone the base entries
+         base = [dict(e) for e in MAIN_SCHEDULE[group][day]]
+     # Apply temporary changes, if any
+     wk = TEMP_CHANGES.get(week_key, {})
+     grp = wk.get(group, {})
+     day_changes = grp.get(day, {})
+     cancels = set(day_changes.get('cancellations', []))
+     replacements = {k: v for (k, v) in day_changes.get('replacements', [])}
+
+     merged = []
+     handled_orig_keys = set()
+
+     for e in base:
+         key = (_normalize_time(e.get('time')), _normalize_subject(e.get('subject')))
+         # If the original time+subject was cancelled directly, skip it
+         if key in cancels:
+             handled_orig_keys.add(key)
+             continue
+
+         # If a replacement maps to this original, check whether the replacement itself was cancelled
+         if key in replacements:
+             new_e = replacements[key]
+             new_key = (_normalize_time(new_e.get('time')), _normalize_subject(new_e.get('subject')))
+             # If the replacement was cancelled explicitly, skip both original and replacement
+             if new_key in cancels:
+                 handled_orig_keys.add(key)
+                 continue
+             merged.append({
+                 'time': new_e.get('time'),
+                 'subject': new_e.get('subject'),
+                 'room': new_e.get('room', ''),
+                 'instructor': new_e.get('instructor', ''),
+                 'note': new_e.get('note', '')
+             })
+             handled_orig_keys.add(key)
+         else:
+             # Keep the original (normalize the time for consistent display)
+             merged.append({
+                 'time': _normalize_time(e.get('time')),
+                 'subject': e.get('subject'),
+                 'room': e.get('room', ''),
+                 'instructor': e.get('instructor', ''),
+                 'note': e.get('note', '')
+             })
+
+     # Add any replacement entries that did not map to an existing original (standalone adds)
+     for (orig_k, new_e) in day_changes.get('replacements', []):
+         if orig_k not in handled_orig_keys:
+             new_key = (_normalize_time(new_e.get('time')), _normalize_subject(new_e.get('subject')))
+             if new_key in cancels:
+                 # This standalone replacement was later cancelled
+                 continue
+             merged.append({
+                 'time': new_e.get('time'),
+                 'subject': new_e.get('subject'),
+                 'room': new_e.get('room', ''),
+                 'instructor': new_e.get('instructor', ''),
+                 'note': new_e.get('note', '')
+             })
+
+     return merged
+
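The cancellation/replacement semantics can be illustrated with a minimal standalone sketch (the entries, cancellation, and replacement below are hypothetical data, not the module's real structures):

```python
# Base week, plus one cancellation and one replacement, keyed by
# (time, lowercased subject) as in the merge logic above.
base = [
    {'time': '9:00 AM', 'subject': 'Math'},
    {'time': '11:00 AM', 'subject': 'Physics'},
]
cancellations = {('11:00 AM', 'physics')}
replacements = {('9:00 AM', 'math'): {'time': '10:00 AM', 'subject': 'Stats'}}

merged = []
for e in base:
    key = (e['time'], e['subject'].lower())
    if key in cancellations:
        continue  # class cancelled this week
    merged.append(replacements.get(key, e))

print(merged)  # → [{'time': '10:00 AM', 'subject': 'Stats'}]
```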
+ def apply_temp_changes_to_db_rows(rows, week_key):
+     """Given a list of Row-like dicts with keys day, time, subject, group_name, room,
+     apply temporary changes from TEMP_CHANGES for week_key and return the merged list.
+     """
+     result = []
+
+     def _rget(r, key, default=None):
+         try:
+             # SQLAlchemy object access
+             if hasattr(r, key):
+                 return getattr(r, key)
+             # Dictionary access
+             elif hasattr(r, 'get'):
+                 return r.get(key, default)
+             else:
+                 return r[key]
+         except Exception:
+             return default
+
+     # Group rows by (group, day) so replacements/cancellations apply per group/day
+     grouped = {}
+     for r in rows or []:
+         group = _rget(r, 'group_name') or _rget(r, 'group') or ''
+         day = (_rget(r, 'day') or '').lower()
+         grouped.setdefault((group, day), []).append(r)
+
+     wk = TEMP_CHANGES.get(week_key, {})
+
+     # Process each group/day present in the DB rows
+     for (group, day), rlist in grouped.items():
+         # Get changes for this group/day
+         grp = wk.get(group, {})
+         day_changes = grp.get(day, {})
+         cancels = set(day_changes.get('cancellations', []))
+         replacements = {k: v for (k, v) in day_changes.get('replacements', [])}
+
+         handled_orig_keys = set()
+
+         for r in rlist:
+             time = _rget(r, 'time')
+             subject = _rget(r, 'subject')
+             key = (_normalize_time(time), _normalize_subject(subject))
+
+             if key in cancels:
+                 # Skip a cancelled original
+                 handled_orig_keys.add(key)
+                 continue
+
+             if key in replacements:
+                 new_e = replacements[key]
+                 handled_orig_keys.add(key)
+                 result.append({
+                     'time': new_e.get('time'),
+                     'subject': new_e.get('subject'),
+                     'room': new_e.get('room', _rget(r, 'room', '')),
+                     'group_name': group
+                 })
+             else:
+                 # Keep the original (normalize the time for consistent display)
+                 result.append({'time': _normalize_time(time), 'subject': subject, 'room': _rget(r, 'room', ''), 'group_name': group})
+
+         # Add any replacement entries that did not map to an existing original (standalone adds)
+         for (orig_k, new_e) in day_changes.get('replacements', []):
+             if orig_k not in handled_orig_keys:
+                 result.append({'time': new_e.get('time'), 'subject': new_e.get('subject'), 'room': new_e.get('room', ''), 'group_name': group})
+
+     # Also process TEMP_CHANGES for groups/days not present in the DB rows (pure adds);
+     # this runs even when rows is empty, so standalone adds are never dropped
+     for grp_name, groups in wk.items():
+         for dname, dchanges in groups.items():
+             if (grp_name, dname) in grouped:
+                 continue
+             for (orig_k, new_e) in dchanges.get('replacements', []):
+                 result.append({'time': new_e.get('time'), 'subject': new_e.get('subject'), 'room': new_e.get('room', ''), 'group_name': grp_name})
+
+     return result
+
+ def _find_time_tokens(tokens, start=0):
+     """Find a time token at or after `start`. Returns (time_string, start_index, end_index).
+     Supports formats like '9:00', '9:00 AM', '09:00', '9:00PM'.
+     """
+     ampm = {'AM', 'PM', 'am', 'pm'}
+     time_re = re.compile(r'^\d{1,2}:\d{2}([AaPp][Mm])?$')
+     for i in range(start, len(tokens)):
+         t = tokens[i]
+         # Token like '9:00' followed by a separate 'AM'/'PM' token; checked first,
+         # otherwise the bare-time branch below would match '9:00' and strand the 'AM'
+         if i + 1 < len(tokens) and re.match(r'^\d{1,2}:\d{2}$', t) and tokens[i+1] in ampm:
+             return (f"{t} {tokens[i+1]}", i, i+1)
+         # Token like '9:00' or '9:00AM'
+         if time_re.match(t):
+             return (t, i, i)
+     return (None, -1, -1)
+
+ def _parse_edit_cancel_args(args):
+     tokens = args.split()
+     if len(tokens) < 4:
+         return None
+     group = tokens[0]
+     day = tokens[1].lower()
+     # Find the first time token
+     t1, t1s, t1e = _find_time_tokens(tokens, 2)
+     if not t1:
+         return None
+     # Find a second time token after t1 (present for edits; absent for cancels)
+     t2, t2s, t2e = _find_time_tokens(tokens, t1e + 1)
+     # An explicit mode is present only when the last token is 'permanent' or 'temporary'
+     has_mode = tokens[-1].lower() in ('permanent', 'temporary')
+     mode = tokens[-1].lower() if has_mode else 'temporary'
+
+     if t2:
+         # Edit: the original subject sits between the two times; the new subject
+         # follows the second time (and precedes the mode token, if any)
+         orig_subject = ' '.join(tokens[t1e+1:t2s]).strip()
+         new_time = t2
+         new_subject = ' '.join(tokens[t2e+1:-1] if has_mode else tokens[t2e+1:]).strip()
+         return {
+             'group': group, 'day': day, 'orig_time': t1, 'orig_subject': orig_subject,
+             'new_time': new_time, 'new_subject': new_subject, 'permanent': (mode == 'permanent')
+         }
+     else:
+         # Cancel: the subject is everything after the time (minus the mode token, if any)
+         orig_subject = ' '.join(tokens[t1e+1:-1] if has_mode else tokens[t1e+1:]).strip()
+         return {
+             'group': group, 'day': day, 'orig_time': t1, 'orig_subject': orig_subject, 'permanent': (mode == 'permanent')
+         }
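The time-token scan used by the parser can be sketched standalone (a self-contained copy for illustration; the sample command string is hypothetical):

```python
import re

def find_time(tokens, start=0):
    # Sketch of the time-token scan: the split "9:00 AM" form is checked
    # before the bare "9:00" form so the AM/PM token is consumed with it.
    ampm = {'AM', 'PM', 'am', 'pm'}
    time_re = re.compile(r'^\d{1,2}:\d{2}([AaPp][Mm])?$')
    for i in range(start, len(tokens)):
        t = tokens[i]
        if i + 1 < len(tokens) and re.match(r'^\d{1,2}:\d{2}$', t) and tokens[i + 1] in ampm:
            return (f"{t} {tokens[i + 1]}", i, i + 1)
        if time_re.match(t):
            return (t, i, i)
    return (None, -1, -1)

print(find_time("A monday 9:00 AM Math".split(), 2))  # → ('9:00 AM', 2, 3)
```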