---
license: apache-2.0
base_model:
- internlm/Intern-S1-mini
---
| ## How to run | |
| ``` | |
| ./llama.cpp/build/bin/llama-server -hf yarikdevcom/Intern-S1-mini-GGUF --n-gpu-layers 99 --temp 0.8 --top-p 0.8 --top-k 50 --port 8999 --host 0.0.0.0 -fa | |
| ``` | |