jbbove committed

Commit f557cda · 0 Parent(s)

Initial commit with Docker image ready for Hugging Face Spaces deployment
.gitignore ADDED
@@ -0,0 +1,3 @@
+.streamlit/secrets.toml
+*.env
+.env
.streamlit/config.toml ADDED
@@ -0,0 +1,4 @@
+[server]
+headless = true
+enableCORS = false
+port = 7860
Dockerfile ADDED
@@ -0,0 +1,25 @@
+# Base image with Python
+FROM python:3.10-slim
+
+# Prevent Python from writing .pyc files
+ENV PYTHONDONTWRITEBYTECODE=1
+
+# Set working directory inside the container
+WORKDIR /project
+
+# Copy all project files into the container
+COPY . .
+
+# Install system dependencies (more may be needed later, e.g. git or libgl)
+RUN apt-get update && apt-get install -y \
+    build-essential \
+    && rm -rf /var/lib/apt/lists/*
+
+# Install Python dependencies
+RUN pip install --no-cache-dir -r requirements.txt
+
+# Expose the port the app will run on
+EXPOSE 7860
+
+# Hugging Face expects this command to run the app
+CMD ["streamlit", "run", "app/main.py", "--server.port", "7860", "--server.address", "0.0.0.0"]
README.md ADDED
@@ -0,0 +1,3 @@
+# PromptAId Operations
+
+This is a simple chat prototype that will become an agentic RAG system for emergency operations. It currently uses a basic LLM pipeline. Future versions will support RAG, tool use, and knowledge graph querying.
agent/__init__.py ADDED
File without changes
agent/agent.py ADDED
File without changes
agent/tools.py ADDED
File without changes
app/__init__.py ADDED
File without changes
app/main.py ADDED
@@ -0,0 +1,26 @@
+# app/main.py
+import streamlit as st  # For building the UI
+from langchain_openai import OpenAI  # OpenAI wrapper for the LLM
+from langchain.chains import ConversationChain  # Simple chat pipeline
+import os
+openai_key = os.getenv("OPENAI_API_KEY")  # Set the key in .streamlit/secrets.toml or as an env var
+
+# Set Streamlit page config
+st.set_page_config(page_title="PromptAId Chat", page_icon="⚠️")
+
+# Display title
+st.title("🧠 PromptAId Operations — Chat Demo")
+
+# Input field for the user prompt
+user_input = st.text_input("Ask something about risk or emergencies...")
+
+# Load the LLM, passing the key read above so it is actually used
+llm = OpenAI(temperature=0.4, openai_api_key=openai_key)
+
+# Initialize a simple conversation chain (no memory yet)
+conversation = ConversationChain(llm=llm)
+
+# If there's input, run the LLM and display the result
+if user_input:
+    response = conversation.run(user_input)
+    st.markdown(f"**Model Response:** {response}")
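The chain above is stateless and is re-created on every Streamlit rerun, so earlier turns are forgotten. One lightweight way to add memory later is a small history buffer kept in `st.session_state`. The sketch below shows only the buffer itself; the `ChatHistory` class and its method names are illustrative assumptions, not part of this commit:

```python
# Sketch: a minimal chat-history buffer the app could store in
# st.session_state to give the chain conversational context.
# All names here (ChatHistory, add, as_prompt) are illustrative.

class ChatHistory:
    """Accumulates (speaker, text) turns and renders them as a prompt prefix."""

    def __init__(self, max_turns: int = 10):
        self.max_turns = max_turns
        self.turns: list[tuple[str, str]] = []

    def add(self, speaker: str, text: str) -> None:
        self.turns.append((speaker, text))
        # Keep only the most recent turns to bound prompt length
        self.turns = self.turns[-self.max_turns:]

    def as_prompt(self, new_input: str) -> str:
        # Render prior turns plus the new question as a single prompt string
        lines = [f"{speaker}: {text}" for speaker, text in self.turns]
        lines.append(f"Human: {new_input}")
        lines.append("AI:")
        return "\n".join(lines)

history = ChatHistory(max_turns=4)
history.add("Human", "What is a flood watch?")
history.add("AI", "A flood watch means flooding is possible.")
print(history.as_prompt("And a flood warning?"))
```

In the Streamlit app, the buffer would be created once under a `st.session_state` key so it survives reruns, with each user input and model response appended after the chain is called.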
app/ui_components.py ADDED
File without changes
kg/__init__.py ADDED
File without changes
kg/ingest.py ADDED
File without changes
kg/query.py ADDED
File without changes
rag/__init__.py ADDED
File without changes
rag/llm.py ADDED
File without changes
rag/retrieval.py ADDED
File without changes
requirements.txt ADDED
@@ -0,0 +1,5 @@
+streamlit
+openai
+langchain
+langchain-community
+langchain-openai
scripts/build_index.py ADDED
File without changes