Hugging Face Space: ndwdgda/cpu (Sleeping, 0 likes)
Commit 94ef460 · cpu · 37.1 kB
1 contributor · History: 17 commits
Latest commit by Nhughes09 (5 months ago): Ollama-only chatbot - working locally with llama3.2:3b
__pycache__/                     Add Ollama client with modular architecture for local AI                                    5 months ago
README.md             419 Bytes  Fix: Pin exact versions gradio==4.19.2 huggingface_hub==0.22.2 for HfFolder compatibility    5 months ago
app.py                3.94 kB    Ollama-only chatbot - working locally with llama3.2:3b                                       5 months ago
cloudflare_client.py  8.31 kB    Modular architecture: logging_config.py, cloudflare_client.py, app.py for Cloudflare AI     5 months ago
logging_config.py     1.04 kB    Modular architecture: logging_config.py, cloudflare_client.py, app.py for Cloudflare AI     5 months ago
ollama_client.py      7.25 kB    Add Ollama client with modular architecture for local AI                                     5 months ago
requirements.txt      48 Bytes   Fix: Pin exact versions gradio==4.19.2 huggingface_hub==0.22.2 for HfFolder compatibility    5 months ago
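The page does not show the contents of ollama_client.py, so as a hedged sketch only: a client like the one described in the commit messages would typically talk to a locally running Ollama server over its REST API (default port 11434, `/api/chat` endpoint) using the llama3.2:3b model named in the latest commit. The function names below are illustrative assumptions, not the Space's actual code.

```python
import json
import urllib.request

# Default local Ollama endpoint (assumption; Ollama listens on 11434 by default).
OLLAMA_URL = "http://localhost:11434"


def build_payload(messages, model="llama3.2:3b"):
    """Build a non-streaming chat request body for Ollama's /api/chat endpoint.

    `messages` is a list of {"role": ..., "content": ...} dicts, as in the
    OpenAI-style chat format that Ollama accepts.
    """
    return {"model": model, "messages": messages, "stream": False}


def chat(messages, model="llama3.2:3b"):
    """Send a chat request to a locally running Ollama server and return the reply text."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/chat",
        data=json.dumps(build_payload(messages, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

Per the commit messages, the Space also pins gradio==4.19.2 and huggingface_hub==0.22.2 in requirements.txt, apparently because later huggingface_hub releases changed the HfFolder API that this Gradio version relied on.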