---
language:
- en
base_model:
- bartowski/Llama-3.2-3B-Instruct-uncensored-GGUF
tags:
- cybersecurity
- uncensored
- code
license: mit
pipeline_tag: text-generation
---
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/671dda880d1e5e69d2169ba7/pUZNo_WHK-6X93EfuU3Uk.jpeg)

# Navi
A high-performance, uncensored language model fine-tuned for cybersecurity applications.

## Table of Contents
- [Model Details](#model-details)
- [Instructions](#instructions)
  - [Linux/Mac Instructions](#linuxmac-instructions)
  - [Web UI](#web-ui)
  - [Windows Instructions](#windows-instructions)
  - [Windows Web UI](#windows-web-ui)

## Model Details
This model is built upon [bartowski/Llama-3.2-3B-Instruct-uncensored-GGUF](https://huggingface.co/bartowski/Llama-3.2-3B-Instruct-uncensored-GGUF), leveraging its capabilities for text generation in the cybersecurity domain.

## Instructions

### Linux/Mac Instructions
To run the model locally:
1. Download `navi.llamafile`.
2. Open a terminal and navigate to the download directory.
3. Make the file executable: `chmod +x navi.llamafile`.
4. Run the model with `./navi.llamafile`.
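
The steps above can be sketched as a short script. The `~/Downloads` path is an assumption (adjust it to wherever you saved the file), and note that llamafiles must have the executable bit set before their first run:

```shell
#!/bin/sh
# Sketch of the Linux/Mac steps above. The path below is an assumption;
# change it to wherever you saved the download.
FILE="$HOME/Downloads/navi.llamafile"

if [ -f "$FILE" ]; then
  chmod +x "$FILE"   # llamafiles need the executable bit before first run
  "$FILE"            # launches the CLI chat interface
else
  echo "navi.llamafile not found in ~/Downloads"
fi
```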

### Web UI
For a web interface:
1. Follow the steps above.
2. Run the model with `./navi.llamafile --server --v2`.
3. Open a web browser and navigate to `http://localhost:8080`.
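
Once the server is running, you can confirm it is reachable from a second terminal. A minimal check, assuming llamafile's default port of 8080:

```shell
#!/bin/sh
# Probe the local llamafile web server; prints a status line either way.
if curl -s -o /dev/null http://localhost:8080/; then
  echo "web UI reachable at http://localhost:8080"
else
  echo "server not reachable (is navi.llamafile --server running?)"
fi
```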

### Windows Instructions
1. Download `navi.llamafile`.
2. Head over to your Downloads folder.
3. Find `navi.llamafile` and right-click on it.
4. Rename the file to `navi.llamafile.exe`.
5. Double-click the file.
   - A terminal window should open and load the model and CLI chat interface.

### Windows Web UI
**Follow steps 1-4 above, then:**
1. Right-click any open space in the folder and select **Open in Terminal**.
    - Alternatively, open a terminal anywhere and navigate to wherever `navi.llamafile.exe` is.
2. Type `.\navi.llamafile.exe --server --v2` to launch the included web server.
3. Open a web browser and navigate to `http://localhost:8080`.

**NOTE: This system has been tested on Windows 11 as well as Ubuntu 24.04 LTS.**