---
title: 🤖 VCell BioModel Chatbot
emoji: 🧬
colorFrom: blue
colorTo: purple
sdk: streamlit
sdk_version: 1.44.1
app_file: app.py
pinned: true
license: mit
short_description: Demo Chatbot to query VCell modeling resources
---
VCell BioModel Chatbot Demo
This demo presents an intelligent chatbot interface for querying and interpreting biomodels stored in the VCell BioModel Database. It enables users to interact with biological modeling resources using natural language and provides structured outputs, model metadata, downloadable files, and visualizations in real time. This work was developed as a demo for the NRNB Organization for Google Summer of Code 2025.
Overview
This chatbot is an AI-powered assistant capable of:
- Interpreting natural language queries
- Extracting structured parameters
- Querying the VCell API
- Summarizing model data in a human-readable way using a large language model (LLM)
- Visualizing model diagrams
- Providing downloadable model files
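The querying step above can be sketched as a small helper around the VCell REST endpoint. The base URL and parameter names below are illustrative assumptions; the app's actual wrapper lives in vcelldb/vcell_api.py:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Assumed endpoint -- check the VCell API documentation for the exact URL.
VCELL_API = "https://vcellapi.cam.uchc.edu/biomodel"

def build_query(owner=None, category="all", max_rows=10):
    """Build the query-string parameters for a biomodel search.

    Parameter names (category, owner, maxRows) are illustrative.
    """
    params = {"category": category, "maxRows": max_rows}
    if owner:
        params["owner"] = owner
    return params

def query_biomodels(**filters):
    """Fetch biomodel metadata matching the given filters."""
    url = f"{VCELL_API}?{urlencode(build_query(**filters))}"
    with urlopen(url, timeout=30) as resp:
        return json.load(resp)  # list of model metadata dicts
```

Separating `build_query` from the network call keeps the filter logic easy to test without hitting the API.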
Features
- Natural Language Interface – query the database with simple English prompts
- LLM Parameter Extraction – uses LLaMA 3 to extract structured parameters
- VCell API Integration – supports dynamic querying with filters such as author, category, and biomodel ID
- Summarization – generates high-level descriptions of model contents in a human-readable way

  [GIF: parameter extraction, API response, and summarization]

- Visualization – displays system reaction diagrams from the API
- Download Options – direct links to SBML and VCML formats for downstream use

  [GIF: downloading the provided files and visualizations]

- Streamlit UI – minimalist, responsive, and easy to deploy
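The download links and diagram image can be derived from a model ID. The URL patterns below are hypothetical placeholders for illustration; the real ones are constructed in vcelldb/diagram.py:

```python
# Assumed base URL and path patterns -- not the verified VCell API routes.
BASE = "https://vcellapi.cam.uchc.edu/biomodel"

def vcml_url(model_id: str) -> str:
    """Link to the model in VCML format."""
    return f"{BASE}/{model_id}/biomodel.vcml"

def sbml_url(model_id: str) -> str:
    """Link to the model in SBML format."""
    return f"{BASE}/{model_id}/biomodel.sbml"

def diagram_url(model_id: str) -> str:
    """Link to the reaction-diagram image for the model."""
    return f"{BASE}/{model_id}/diagram"
```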
Technologies Used
- Python
- Streamlit
- Groq API for the LLM (LLaMA 3.3-70B)
- VCell Public API
- Pydantic (parameter schema)
- Dotenv (secret management)
Getting Started
Prerequisites
- Python 3.10 or higher
- A valid Groq API key
Installation
git clone https://github.com/KacemMathlouthi/VCell-Demo.git
cd VCell-Demo
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
Configuration
Create a .env file at the root of the project:
LLM_API_KEY=your_groq_api_key_here
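The app reads this key through python-dotenv at startup. For illustration, here is a stdlib-only sketch of what that loading step does (the project itself uses the dotenv package):

```python
import os

def load_env(path=".env"):
    """Tiny stdlib-only .env loader, mimicking python-dotenv's behavior:
    skips blanks and comments, and does not override variables that are
    already set in the environment."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())

# After loading, the key is available as:
# api_key = os.getenv("LLM_API_KEY")
```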
Run the Application
streamlit run app.py
Visit http://localhost:8501 to start using the chatbot.
Example Prompts
Here are some questions the chatbot can understand:
- List all public models by user ion
- Find the model with ID 201844485
- Show VCell models related to calcium
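Prompts like these are parsed by the LLM into a structured parameter object validated with Pydantic. A minimal sketch, with hypothetical field names (the real schema lives in vcelldb/params_model.py):

```python
from typing import Optional
from pydantic import BaseModel

class QueryParams(BaseModel):
    """Hypothetical shape of the parameters extracted from a prompt."""
    bm_id: Optional[str] = None    # biomodel ID, if the prompt names one
    owner: Optional[str] = None    # author/owner filter
    keyword: Optional[str] = None  # free-text search term
    category: str = "all"          # e.g. public vs. all models
    max_rows: int = 10             # result limit

# e.g. "Find the model with ID 201844485" might yield:
params = QueryParams(bm_id="201844485")
```

Validating the LLM output against a schema like this catches malformed extractions before the API is called.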
Project Structure
├── app.py                   # Main application entry point
├── requirements.txt         # Project dependencies
├── .env                     # Environment variable for API key
├── .streamlit/config.toml   # UI configuration
├── vcelldb/
│   ├── vcell_api.py         # Wrapper for VCell API calls
│   ├── diagram.py           # Utilities for diagrams and downloads
│   └── params_model.py      # Schema definitions
└── utils/
    ├── llm_helper.py        # LLM instance creation and response generation
    └── params_extraction.py # Prompt-to-parameter process
License
This project is licensed under the MIT License. You are free to use, modify, and distribute the software with proper attribution.
Special thanks to the Virtual Cell (VCell) team and the National Resource for Network Biology (NRNB) for their support.
For more information, visit vcell.org.