Spaces: TeamGenKI / LLMServer
LLMServer / main (commit 526ff2e) · 41.4 kB
1 contributor · History: 49 commits
Latest commit by AurelioAguirre: "Added double init, for embedding and chat models at the same time." (8083005, over 1 year ago)
Name             Size       Last commit message                                                  Last updated
utils/           -          Added Progressbar, REPLACED stream to logger                         over 1 year ago
__init__.py      0 Bytes    Fixed Dockerfile v12                                                 over 1 year ago
api.py           12.9 kB    Added double init, for embedding and chat models at the same time.  over 1 year ago
app.py           1.19 kB    Added double init, for embedding and chat models at the same time.  over 1 year ago
config.yaml      631 Bytes  Fixing download logging issue. v3                                    over 1 year ago
env_template     770 Bytes  Fixed Dockerfile v12                                                 over 1 year ago
routes.py        15 kB      Added double init, for embedding and chat models at the same time.  over 1 year ago
test_locally.py  1.8 kB     This should work then                                                over 1 year ago