Spaces: fhueni / on-device-vs-cloud-llm-inference
on-device-vs-cloud-llm-inference / src
34.5 kB · 5 contributors · History: 16 commits
Latest commit 94662d5 by fhueni, 26 days ago: "feat: add queues for on device and cloud processing to measure queue time"
| File | Size | Last commit | Updated |
| --- | --- | --- | --- |
| services/ | | feat: added additional models | 27 days ago |
| evaluator.js | 3.88 kB | feat: implement CSV download and converted logs to a table | 30 days ago |
| main.js | 10.5 kB | feat: add queues for on device and cloud processing to measure queue time | 26 days ago |
| requestManager.js | 8.86 kB | feat: add queues for on device and cloud processing to measure queue time | 26 days ago |
| scheduler.js | 4.23 kB | feat: track queueing time, inference time, and total latency separately. the timestamp for start inference should be taken somewhere else | 27 days ago |
| utils.js | 1.27 kB | feat: track queueing time, inference time, and total latency separately. the timestamp for start inference should be taken somewhere else | 27 days ago |