NanoCoder is a Fill-in-the-Middle (FIM) language model designed for React frontend development and coding assistance. It provides intelligent code autocompletion and context-aware generation. The model was fine-tuned with Unsloth on the Qwen 3 0.6B base model, using a high-quality FIM dataset curated from GitHub repositories to improve coding capability and developer productivity.
## 🧠 Datasets

We trained **NanoCoder** on a **high-quality Fill-in-the-Middle (FIM)** dataset curated from GitHub repositories:
[**srisree/nextjs_typescript_fim_dataset**](https://huggingface.co/datasets/srisree/nextjs_typescript_fim_dataset) on Hugging Face.

The dataset focuses on **React/Next.js** and **TypeScript** projects, providing rich, real-world coding examples that help the model understand frontend architecture, component composition, and React ecosystem patterns.

By training on this dataset, **NanoCoder** learns to:

- Predict and fill in missing code using FIM objectives.
- Understand React component structures and TypeScript typing patterns.
- Generate clean, production-grade frontend code snippets.
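As a concrete illustration, a FIM training example rearranges a code snippet into prefix/suffix/middle segments so the model learns to generate the missing middle from its surrounding context. The exact special tokens depend on the tokenizer; the sketch below assumes Qwen-style FIM tokens (`<|fim_prefix|>`, `<|fim_suffix|>`, `<|fim_middle|>`), which may differ in the released model:

```
<|fim_prefix|>export function Button({ label }: { label: string }) {
  return (<|fim_suffix|>);
}<|fim_middle|><button type="button">{label}</button>
```

Here everything before the cursor is the prefix, everything after is the suffix, and the JSX the model is asked to produce is the middle.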
## ⚙️ FIM Training Colab Script

We’re preparing an interactive **Google Colab notebook** for reproducing the **Fill-in-the-Middle (FIM)** fine-tuning process used to train **NanoCoder** with **Unsloth** on the **Qwen 3 0.6B** base model.

The Colab script will include:

- ✅ Environment setup with Unsloth and Qwen 3 0.6B
- ✅ Loading and preprocessing the [Next.js TypeScript FIM Dataset](https://huggingface.co/datasets/srisree/nextjs_typescript_fim_dataset)
- ✅ Training configuration (LoRA, batch size, sequence length, etc.)
- ✅ Evaluation and inference examples

🚀 **Coming soon...** Stay tuned for the full release!
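Until the notebook is released, the preprocessing step can be sketched roughly as follows. This is a hedged illustration, not the actual training script: it assumes the dataset exposes `prefix`, `middle`, and `suffix` fields (the real column names may differ) and Qwen-style FIM special tokens.

```python
# Hypothetical FIM preprocessing sketch. Assumes Qwen-style FIM tokens
# and a dataset record with `prefix`, `middle`, and `suffix` fields;
# both are assumptions, not confirmed details of the released model.
FIM_PREFIX = "<|fim_prefix|>"
FIM_SUFFIX = "<|fim_suffix|>"
FIM_MIDDLE = "<|fim_middle|>"

def to_fim_example(record: dict) -> str:
    """Rearrange a (prefix, middle, suffix) record into the PSM layout:
    prefix and suffix serve as context, middle is the training target."""
    return (
        f"{FIM_PREFIX}{record['prefix']}"
        f"{FIM_SUFFIX}{record['suffix']}"
        f"{FIM_MIDDLE}{record['middle']}"
    )

example = {
    "prefix": "function add(a: number, b: number) {\n  return ",
    "middle": "a + b",
    "suffix": ";\n}",
}
print(to_fim_example(example))
```

During training, the loss is typically computed on the tokens after `<|fim_middle|>`, which is what teaches the model to complete code between two existing fragments.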
---

# ⚙️ Setup and Run NanoCoder Locally with Ollama in VS Code

> Step-by-step guide to install, configure, and use **NanoCoder** for intelligent React frontend code completion with the **Continue** VS Code extension.

---

## 🧠 Prerequisites

Before getting started, ensure you have the following installed:

- [VS Code](https://code.visualstudio.com/)
- [Ollama](https://ollama.ai) (latest version)
- [Continue extension](https://marketplace.visualstudio.com/items?itemName=Continue.continue)
- A system with at least **8GB RAM** (recommended for 0.6B models)

---
## 🧩 Step 1: Install Ollama

If you haven’t already, download and install **Ollama**:

- macOS / Linux / Windows: [https://ollama.ai/download](https://ollama.ai/download)

Once installed, open your terminal and verify the installation:

> ollama --version
## 💾 Step 2: Pull NanoCoder Model

> ollama pull srisree/nanocoder
## ⚡ Step 3: Run NanoCoder with Ollama

Once downloaded, you can test NanoCoder directly in the terminal (use the same name you pulled):

> ollama run srisree/nanocoder
Read more about configuring autocomplete in the [Continue Docs](https://docs.continue.dev/customize/deep-dives/autocomplete).
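To use NanoCoder for tab autocomplete, point Continue at the local Ollama model. A minimal sketch in Continue's legacy `config.json` format (the exact schema depends on your Continue version; newer releases use a `config.yaml` instead, so check the docs linked above):

```json
{
  "tabAutocompleteModel": {
    "title": "NanoCoder",
    "provider": "ollama",
    "model": "srisree/nanocoder"
  }
}
```

With this in place, Continue routes inline completion requests to the locally running Ollama instance serving NanoCoder.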