Create README.md #1
by Roshan1162003 - opened

README.md ADDED
---
license: mit
language:
- en
base_model:
- google/gemma-2b-it
pipeline_tag: text-generation
tags:
- electron
- desktopapplication
- windows
---

# RemiAI / Bujji Open Source Framework

[License: MIT](https://opensource.org/licenses/MIT)
[Built with Electron](https://www.electronjs.org/)
[Hugging Face](https://huggingface.co/)

**A "No-Setup" Local AI Framework for Students**

This project is an open-source, offline AI chat application designed for students and colleges. It lets you run powerful LLMs (such as Llama 3 and Mistral) on your laptop without a GPU, internet access, Python, or complicated installations.

**Note**: No GPU is needed; the app uses your laptop's CPU for response generation (inference). If you modify the project code to use another model, make sure you use `.gguf`-formatted weights only; standard weights such as `.safetensors` are not supported by this application.

---

## 🚀 Quick Start (One-Line Command)

If you have Git and Node.js installed, open your terminal (Command Prompt or PowerShell) and run:

```bash
git clone https://huggingface.co/remiai3/RemiAI_Framework && cd RemiAI_Framework && npm install && npm start
```

---

## 💻 Manual Installation

### 1. Requirements

* **Node.js**: [Download Here](https://nodejs.org/) (install the LTS version).
* **Windows Laptop**: the code includes optimized `.exe` binaries for Windows.

### 2. Download & Setup

1. **Download** the project zip (or clone the repo).
2. **Extract** the folder.
3. **Open a terminal** inside the extracted folder.
4. Run the installer for the libraries:

```bash
npm install
```

### 3. Run the App

Simply type:

```bash
npm start
```

The application will launch, the AI engine will start in the background, and you can begin chatting immediately!

---

## 📦 Features

* **Zero Python Dependency**: We use compiled binaries (`.dll` and `.exe` included) so you don't need to install Python, PyTorch, or set up virtual environments.
* **Plug & Play Models**: Supports the `.gguf` format.
  * Want a different model? Download any `.gguf` file, rename it to `model.gguf`, and place it in the project root.
* **Auto-Optimization**: Automatically detects your CPU features (AVX vs AVX2) to give you the best speed possible.
* **Privacy First**: Runs 100% offline. No data leaves your device.
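The AVX/AVX2 selection mentioned above could look roughly like the sketch below. This is not the project's actual code: the binary names are hypothetical, and reading `/proc/cpuinfo` for CPU flags only works on Linux (the shipped Windows app would need a native probe instead).

```javascript
// Illustrative sketch of AVX-based binary selection (hypothetical names).
// The flags string lists the instruction-set extensions the CPU supports;
// on Linux it could come from require("fs").readFileSync("/proc/cpuinfo", "utf8").
function pickBinary(cpuFlags) {
  if (/\bavx2\b/.test(cpuFlags)) return "llama-avx2.exe"; // fastest variant
  if (/\bavx\b/.test(cpuFlags)) return "llama-avx.exe";
  return "llama-basic.exe"; // safest fallback: no AVX required
}
```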

---

## 🛠️ Credits & License

* **Created By**: RemiAI Team
* **License**: MIT License.
  * *You are free to rename, modify, and distribute this application as your own project!*

**Note on Models**: The application uses only `.gguf`-formatted weights; this keeps it CPU-friendly so it can run without any GPU.

---