---
language:
- en
base_model:
- bartowski/Llama-3.2-3B-Instruct-uncensored-GGUF
tags:
- cybersecurity
- uncensored
- code
license: mit
pipeline_tag: text-generation
---

# Navi

A high-performance, uncensored language model fine-tuned for cybersecurity applications.

## Table of Contents

- [Model Details](#model-details)
- [Instructions](#instructions)
  - [Linux/Mac Instructions](#linuxmac-instructions)
  - [Web UI](#web-ui)
  - [Windows Instructions](#windows-instructions)
  - [Windows Web UI](#windows-web-ui)

## Model Details

This model is built upon [bartowski/Llama-3.2-3B-Instruct-uncensored-GGUF](https://huggingface.co/bartowski/Llama-3.2-3B-Instruct-uncensored-GGUF), leveraging its capabilities for text generation in the cybersecurity domain.

## Instructions

### Linux/Mac Instructions

To run the model locally:

1. Download `navi.llamafile`.
2. Open a terminal and navigate to the download directory.
3. Make the file executable with `chmod +x navi.llamafile` (required before the first run).
4. Run the model with `./navi.llamafile`.
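The steps above can be sketched as a short shell script. The `~/Downloads` location is an assumption about where the file was saved; adjust the path if you downloaded it elsewhere.

```shell
#!/bin/sh
# Sketch of the Linux/Mac steps above.
# Assumption: navi.llamafile was saved to ~/Downloads.
FILE="$HOME/Downloads/navi.llamafile"
if [ -f "$FILE" ]; then
  chmod +x "$FILE"   # llamafiles must be marked executable before first run
  "$FILE"            # launches the CLI chat interface
else
  echo "navi.llamafile not found in $HOME/Downloads"
fi
```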

### Web UI

For a web interface:

1. Follow the steps above.
2. Run the model with `./navi.llamafile --server --v2`.
3. Open a web browser and navigate to `http://localhost:8080`.
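A minimal launch sketch, assuming `navi.llamafile` is in the current directory and that the web UI is served at `localhost:8080` (the address used in the Windows section below):

```shell
#!/bin/sh
# Sketch: launch the bundled web server.
# Assumption: navi.llamafile is in the current directory.
URL="http://localhost:8080"
if [ -x ./navi.llamafile ]; then
  ./navi.llamafile --server --v2 &   # start the server in the background
  echo "Web UI should be available at $URL"
else
  echo "navi.llamafile not found here; run the steps above first"
fi
```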

### Windows Instructions

1. Download `navi.llamafile`.
2. Open your Downloads folder.
3. Locate `navi.llamafile` and right-click it.
4. Rename the file to `navi.llamafile.exe`.
5. Double-click the file.
   - A terminal window should launch and load the model's CLI chat interface.

### Windows Web UI

**Follow steps 1-4 above**, then:

1. Right-click any open space in the folder and select "Open in Terminal".
   - Alternatively, open a terminal anywhere and navigate to wherever `navi.llamafile.exe` is located.
2. Run `.\navi.llamafile.exe --server --v2` to launch the included web server.
3. Open a web browser and navigate to `http://localhost:8080`.
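On either platform, a quick way to confirm the server is responding is an HTTP check with `curl` (shipped with Windows 10+, macOS, and most Linux distributions). This assumes the server is on the default `localhost:8080` address from the steps above.

```shell
#!/bin/sh
# Quick reachability check for the web UI.
# Assumption: the server is at the default address, localhost:8080.
STATUS=$(curl -s -o /dev/null -w "%{http_code}" http://localhost:8080 || true)
if [ "$STATUS" = "200" ]; then
  echo "web UI is up"
else
  echo "web UI not reachable (HTTP status '$STATUS')"
fi
```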

**NOTE: These instructions have been tested on Windows 11 as well as Ubuntu 24.04 LTS.**