## Model Details

This model is built upon [bartowski/Llama-3.2-3B-Instruct-uncensored-GGUF](https://huggingface.co/bartowski/Llama-3.2-3B-Instruct-uncensored-GGUF), leveraging its capabilities for text generation in the cybersecurity domain.

## Instructions

### Linux/Mac Instructions

To run the model locally:

1. Download `navi.llamafile`.
2. Open a terminal and navigate to the download directory.
3. Run the model using `./navi.llamafile`.
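
A common snag with step 3 on Linux/macOS is a "Permission denied" error, because downloaded files lack the execute bit. The sketch below demonstrates the fix with a stand-in script (`demo.llamafile` here is only a placeholder; the same `chmod` applies to the real `navi.llamafile`):

```shell
# Stand-in for a freshly downloaded llamafile (placeholder, not the real model)
printf '#!/bin/sh\necho ready\n' > demo.llamafile

chmod +x demo.llamafile   # grant the execute bit; same fix for navi.llamafile
./demo.llamafile          # prints: ready
```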

For a web interface:

1. Follow the steps above.
2. Run the model with `./navi.llamafile --server --v2`.
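
Once the server is running, you can check that the web interface is reachable before opening a browser. The `server_up` helper below is a hypothetical convenience, and port 8080 is taken from step 3 of the Windows Web Server section:

```shell
# Returns success if an HTTP server answers on the given port (default 8080)
server_up() {
  curl -fsS --max-time 2 "http://localhost:${1:-8080}/" >/dev/null 2>&1
}

if server_up 8080; then
  echo "web interface is reachable"
else
  echo "no server yet; run ./navi.llamafile --server --v2 first"
fi
```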

### Windows Instructions

1. Download `navi.llamafile`.
2. Go to your Downloads folder.
3. Find `navi.llamafile` and right-click it.
4. Rename the file to `navi.llamafile.exe`.
5. Double-click the file.
   - A terminal window should launch and load the model and CLI chat interface.

### Windows Web Server

**Follow steps 1-4 of the Windows Instructions above**, then:

1. Right-click any open space in the folder and select "Open in Terminal".
   - Alternatively, open a terminal anywhere and navigate to wherever `navi.llamafile.exe` is.
2. Run `.\navi.llamafile.exe --server --v2` to launch the included web server.
3. Open a web browser and navigate to `localhost:8080`.
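
As a quick end-to-end check of step 3, you can send a test request from any terminal. The endpoint path below is an assumption based on the OpenAI-compatible API that llamafile's server mode exposes, and `chat_once` is a hypothetical helper, not part of the model package:

```shell
# POST a minimal chat request to the local server (assumed endpoint path)
chat_once() {
  curl -fsS --max-time 5 -H 'Content-Type: application/json' \
    -d '{"messages":[{"role":"user","content":"hello"}]}' \
    "http://localhost:${1:-8080}/v1/chat/completions" 2>/dev/null
}

# Prints the JSON reply if the server is up, otherwise a reminder
chat_once 8080 || echo "server not reachable; finish steps 1-3 first"
```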

**NOTE: This setup has been tested on Windows 11 as well as Ubuntu 24.04 LTS.**