# Local-Llama-Inference v0.1.0 - SHA256 Checksums
# Generated: $(date)
# Platform: Linux x86_64
# Complete Package (CUDA + llama.cpp + NCCL + Python SDK): ~1.4GB
# SDK-Only Package (Python SDK source): ~45KB

## Complete Package (Batteries-Included)
## Use this for end users who want everything pre-configured
b9b1a813e44f38c249e4d312ee88be94849a907da4f22fe9995c3d29d845c0b9  local-llama-inference-complete-v0.1.0.tar.gz
2483eb8aef04fa6f515c8b2ba33ed8964a227e43cedeaa9b353bff895e1537d9  local-llama-inference-complete-v0.1.0.zip

## SDK-Only Package (Source Code)
## Use this if you have llama.cpp and NCCL pre-installed
8ef64e094e9284ed2bd507b22d13772dd90b22c9fec185c45090d7c4a7e3fffd  local-llama-inference-sdk-v0.1.0.tar.gz
5b29ea2811d958d1fda6de5cb9310ea7a84edf867dad58fa9494dc4b457a872b  local-llama-inference-sdk-v0.1.0.zip
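# The entries above use the standard two-space `sha256sum` output format.
# A sketch of how a file like this can be regenerated (hypothetical command;
# assumes all four release artifacts sit in the current directory):

```shell
# Recompute checksums for the release artifacts and write them to CHECKSUMS.txt
# (filenames taken from the entries above; run from the release directory)
sha256sum local-llama-inference-complete-v0.1.0.tar.gz \
          local-llama-inference-complete-v0.1.0.zip \
          local-llama-inference-sdk-v0.1.0.tar.gz \
          local-llama-inference-sdk-v0.1.0.zip > CHECKSUMS.txt
```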
## Verification Instructions
# To verify file integrity:

# Linux/macOS
sha256sum -c CHECKSUMS.txt

# Or verify an individual file:
sha256sum -c local-llama-inference-complete-v0.1.0.tar.gz.sha256

# macOS (alternate)
shasum -a 256 -c CHECKSUMS.txt

# Note: the comment lines in this file may trigger a harmless
# "improperly formatted lines" warning from sha256sum/shasum;
# only a reported MISMATCH/FAILED indicates a corrupted download.
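# A minimal self-contained sketch of the verify step, using a throwaway
# sample file instead of the release artifacts (all paths here are
# hypothetical; substitute the package you actually downloaded):

```shell
# Create a sample file, record its SHA-256, then verify it the same way
# you would verify a downloaded package against its .sha256 sidecar.
printf 'demo payload' > /tmp/ck-demo.bin
sha256sum /tmp/ck-demo.bin > /tmp/ck-demo.sha256
sha256sum -c /tmp/ck-demo.sha256    # prints: /tmp/ck-demo.bin: OK
```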