import streamlit as st
import os
from groq import Groq
# Get API Key
api_key = os.getenv("GROQ_API_KEY")
# Initialize Groq client
client = Groq(api_key=api_key) if api_key else None
# Streamlit UI
st.set_page_config(page_title="🧪 ChemEng AI Assistant", page_icon="🧪")
st.title("🧪 Chemical Engineering Equipment Sizing Assistant")
st.markdown("""
Ask any question about chemical process equipment sizing and get AI-powered answers based on domain knowledge.
**Try examples like:**
- `What size of distillation column is used for 5000 kg/hr ethanol separation?`
- `Size a shell-and-tube heat exchanger for 1 MW heat duty.`
- `Suggest the volume of a CSTR with 1-hour residence time at 2000 L/hr.`
""")
# User Input
query = st.text_area("💬 Enter your question:", height=150)
# Model selector (models hosted on Groq)
model = st.selectbox("Choose a model:", [
"llama-3.1-8b-instant",
"llama-3.3-70b-versatile",
"openai/gpt-oss-120b"
])
# Process the query: validate inputs, then send the question to the Groq chat API
if st.button("Get Response"):
    if not api_key:
        st.error("🚨 Groq API key not found. Set GROQ_API_KEY in your environment.")
    elif not query.strip():
        st.warning("⚠️ Please enter a question.")
    else:
        with st.spinner("Generating response..."):
            try:
                chat_completion = client.chat.completions.create(
                    model=model,
                    messages=[
                        {
                            "role": "system",
                            "content": (
                                "You are a senior chemical engineer. Answer user questions about equipment sizing, "
                                "design rules, and heuristics clearly and concisely. Provide real-world estimates based on "
                                "chemical engineering practices."
                            )
                        },
                        {
                            "role": "user",
                            "content": query
                        }
                    ]
                )
                response = chat_completion.choices[0].message.content
                st.success("✅ Answer:")
                st.markdown(response)
            except Exception as e:
                st.error(f"❌ Error: {e}")