nsarrazin committed (unverified)
Commit 31dbc8a · Parent(s): 260c3b7

feat: add SKIP_LLAMA_CPP_BUILD env var to build process (#1829)


Allows skipping the llama.cpp build if you know you are not going to use it.

.github/workflows/deploy-prod.yml

```diff
@@ -49,6 +49,7 @@ jobs:
   APP_BASE=/chat
   PUBLIC_APP_COLOR=yellow
   PUBLIC_COMMIT_SHA=${{ env.GITHUB_SHA_SHORT }}
+  SKIP_LLAMA_CPP_BUILD=true
   deploy:
     name: Deploy on prod
     runs-on: ubuntu-latest
```
.github/workflows/lint-and-test.yml

```diff
@@ -49,4 +49,4 @@ jobs:
   steps:
     - uses: actions/checkout@v3
     - name: Build Docker image
-      run: docker build --secret id=DOTENV_LOCAL,src=.env.ci -t chat-ui:latest .
+      run: docker build --secret id=DOTENV_LOCAL,src=.env.ci --build-arg SKIP_LLAMA_CPP_BUILD=true -t chat-ui:latest .
```
Dockerfile

```diff
@@ -54,7 +54,9 @@ COPY --link --chown=1000 package-lock.json package.json ./
 
 ARG APP_BASE=
 ARG PUBLIC_APP_COLOR=blue
+ARG SKIP_LLAMA_CPP_BUILD
 ENV BODY_SIZE_LIMIT=15728640
+ENV SKIP_LLAMA_CPP_BUILD=$SKIP_LLAMA_CPP_BUILD
 
 RUN --mount=type=cache,target=/app/.npm \
     npm set cache /app/.npm && \
```
README.md

````diff
@@ -1098,7 +1098,7 @@ You can build the docker images locally using the following commands:
 ```bash
 docker build -t chat-ui-db:latest --build-arg INCLUDE_DB=true .
 docker build -t chat-ui:latest --build-arg INCLUDE_DB=false .
-docker build -t huggingchat:latest --build-arg INCLUDE_DB=false --build-arg APP_BASE=/chat --build-arg PUBLIC_APP_COLOR=yellow .
+docker build -t huggingchat:latest --build-arg INCLUDE_DB=false --build-arg APP_BASE=/chat --build-arg PUBLIC_APP_COLOR=yellow --build-arg SKIP_LLAMA_CPP_BUILD=true .
 ```
 
 If you want to run the images with your local .env.local you have two options
````
vite.config.ts

```diff
@@ -21,7 +21,9 @@ function loadTTFAsArrayBuffer() {
 	};
 }
 const isViteNode = process.argv.some((arg) => arg.includes("vite-node")) || !!process.env.VITE_NODE;
-const shouldCopyLlama = process.env.npm_lifecycle_event === "build" && !isViteNode; // Copy node-llama-cpp/llama files to build output
+const skipLlamaCppBuild = process.env.SKIP_LLAMA_CPP_BUILD === "true";
+const shouldCopyLlama =
+	process.env.npm_lifecycle_event === "build" && !isViteNode && !skipLlamaCppBuild; // Copy node-llama-cpp/llama files to build output
 
 function copyLlamaFiles() {
 	return {
```
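Note that the vite.config.ts change compares the variable against the literal string `"true"`, so any other value (unset, `"1"`, `"TRUE"`) leaves the llama.cpp copy step enabled. A minimal sketch of that parsing behavior (the `envFlagEnabled` helper is hypothetical, not part of the repo):

```typescript
// Environment variables are always strings, so the config gates the copy
// step on an exact match with the literal "true". This hypothetical helper
// mirrors that check.
function envFlagEnabled(name: string): boolean {
	return process.env[name] === "true";
}

process.env.SKIP_LLAMA_CPP_BUILD = "true";
console.log(envFlagEnabled("SKIP_LLAMA_CPP_BUILD")); // true

process.env.SKIP_LLAMA_CPP_BUILD = "1"; // not the literal "true"
console.log(envFlagEnabled("SKIP_LLAMA_CPP_BUILD")); // false
```

This is why the workflow and Docker examples all pass exactly `SKIP_LLAMA_CPP_BUILD=true`.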