OpenCode Deployer committed on
Commit · e6d126a
Parent(s): eb50311
Comprehensive fix for llama-server shared-library dependency issues

Fixes libgomp.so.1 and other potential shared-library dependencies:
- Add libomp-dev as a build-time dependency and libgomp1 as a runtime dependency
- Strengthen static linking: -DCMAKE_EXE_LINKER_FLAGS='-static-libgcc -static-libstdc++'
- Ensure OpenMP support works correctly in the container environment
- Cover both build time and run time, resolving all known shared-library errors

This should fully resolve the dependency issues when deploying to the HuggingFace Space.
- Dockerfile +4 -2
- README.md +3 -2
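The class of error this commit targets surfaces at container startup as `libgomp.so.1: cannot open shared object file`. A minimal diagnostic sketch with `ldd`, using `/bin/sh` as a stand-in target (inside the image you would point it at the actual `llama-server` binary; that path is an assumption, not shown in the diff):

```shell
# Count unresolved shared-library dependencies of a binary.
# /bin/sh is a stand-in; replace it with the llama-server path in the image.
missing=$(ldd /bin/sh | grep -c "not found" || true)
echo "unresolved dependencies: ${missing}"
```

A non-zero count means the image is missing a runtime library, which is exactly what adding `libgomp1` to the run stage prevents.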
Dockerfile
CHANGED

@@ -8,11 +8,12 @@ RUN apt-get update && apt-get install -y \
     git \
     cmake \
     curl \
+    libomp-dev \
     && rm -rf /var/lib/apt/lists/*
 
 RUN git clone https://github.com/ggerganov/llama.cpp.git /tmp/llamacpp && \
     cd /tmp/llamacpp && \
-    cmake -B build -DLLAMA_BUILD_SERVER=ON -DBUILD_SHARED_LIBS=OFF && \
+    cmake -B build -DLLAMA_BUILD_SERVER=ON -DBUILD_SHARED_LIBS=OFF -DCMAKE_EXE_LINKER_FLAGS="-static-libgcc -static-libstdc++" && \
     cmake --build build --config Release
 
 # Run stage

@@ -27,9 +28,10 @@ ENV THREADS="-1"
 ENV TEMPERATURE="0.7"
 ENV PREDICT_TOKENS="2048"
 
-# Install runtime dependencies only
+# Install runtime dependencies only (including the OpenMP runtime library)
 RUN apt-get update && apt-get install -y \
     curl \
+    libgomp1 \
     && rm -rf /var/lib/apt/lists/*
 
 WORKDIR /app
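The `-static-libgcc -static-libstdc++` flags added above fold the GCC support library and the C++ standard library into the executable itself, so the run-stage image does not need to ship `libstdc++.so`. A standalone sketch of the effect on a trivial program (the file paths here are arbitrary, not part of the Space):

```shell
# Build a trivial C++ program with the same static-runtime flags the commit adds.
cat > /tmp/demo.cpp <<'EOF'
#include <iostream>
int main() { std::cout << "ok" << std::endl; return 0; }
EOF

g++ /tmp/demo.cpp -o /tmp/demo_static -static-libgcc -static-libstdc++

# With the flags, libstdc++.so should no longer appear among the dynamic deps.
if ! ldd /tmp/demo_static | grep -q "libstdc++"; then
    echo "libstdc++ is statically linked"
fi
```

Note that this does not cover `libgomp`: GCC has no equivalent `-static-libgomp` flag, which is why the commit also installs `libgomp1` in the run stage.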
README.md
CHANGED

@@ -49,10 +49,11 @@ curl -X POST "http://localhost:7860/v1/chat/completions" \
 The Dockerfile uses a multi-stage build and the CMake build system to compile llama.cpp:
 
 **Build stage**:
-- Install build dependencies (build-essential, git, cmake)
-- Use the `-DLLAMA_BUILD_SERVER=ON -DBUILD_SHARED_LIBS=OFF` flags
+- Install build dependencies (build-essential, git, cmake, libomp-dev)
+- Use the `-DLLAMA_BUILD_SERVER=ON -DBUILD_SHARED_LIBS=OFF -DCMAKE_EXE_LINKER_FLAGS="-static-libgcc -static-libstdc++"` flags
 - Compile a Release build for best performance
 - Static linking resolves the `libmtmd.so.0` shared-library dependency issue
+- OpenMP support is covered twice: by the development library at build time and the runtime library at run time
 
 **Run stage**:
 - Install runtime dependencies only (curl)
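Assembled from the two hunks above, the post-commit Dockerfile would look roughly like the sketch below. The base images, stage name, and the built binary's path are assumptions — the diff does not show them:

```dockerfile
# Build stage: compile llama.cpp with static runtime linkage (base image assumed)
FROM debian:bookworm AS builder
RUN apt-get update && apt-get install -y \
    build-essential \
    git \
    cmake \
    curl \
    libomp-dev \
    && rm -rf /var/lib/apt/lists/*
RUN git clone https://github.com/ggerganov/llama.cpp.git /tmp/llamacpp && \
    cd /tmp/llamacpp && \
    cmake -B build -DLLAMA_BUILD_SERVER=ON -DBUILD_SHARED_LIBS=OFF \
          -DCMAKE_EXE_LINKER_FLAGS="-static-libgcc -static-libstdc++" && \
    cmake --build build --config Release

# Run stage: runtime dependencies only (including the OpenMP runtime)
FROM debian:bookworm-slim
ENV THREADS="-1"
ENV TEMPERATURE="0.7"
ENV PREDICT_TOKENS="2048"
RUN apt-get update && apt-get install -y \
    curl \
    libgomp1 \
    && rm -rf /var/lib/apt/lists/*
WORKDIR /app
# Binary location within the build tree is an assumption
COPY --from=builder /tmp/llamacpp/build/bin/llama-server /app/llama-server
```

The two-stage split keeps the final image small: the compiler toolchain and `libomp-dev` stay in the builder stage, while the run stage only needs `curl` and the `libgomp1` runtime.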