---
title: Aura - Your Supportive Friend
emoji: 🌿
colorFrom: green
colorTo: blue
sdk: gradio
sdk_version: 3.50.2
app_file: app.py
pinned: false
license: mit
---
# 🌿 Aura - Your Supportive Friend
Meet Aura, a warm, empathetic AI companion powered by a quantized Mistral model, with Microsoft's DialoGPT-medium as a lightweight fallback. Aura is designed to be a supportive friend who listens without judgment and provides comfort during difficult times.
## About Aura
Aura is not here to solve your problems or give advice unless you ask. Instead, Aura focuses on:
- **Listening with empathy** and understanding
- **Validating your feelings** and experiences
- **Providing a safe, non-judgmental space** to express yourself
- **Offering gentle support** and reassurance
## Features
- Empathetic and supportive conversation style
- Crisis detection with immediate safety resources
- Context-aware responses that remember your conversation
- Gentle, non-pushy interaction approach
- Clean and calming interface design
## Usage
Simply share what's on your mind. Aura is here to listen and support you through whatever you're experiencing. Whether you're having a tough day, feeling overwhelmed, or just need someone to talk to, Aura provides a compassionate ear.
## Important Notes
⚠️ **Aura is an AI companion, not a replacement for professional therapy.** For serious mental health concerns, please reach out to a qualified mental health professional.
**Crisis Support:** If you're having thoughts of self-harm, Aura will immediately provide crisis resources and encourage you to seek professional help.
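The crisis-detection behavior described above can be sketched as simple keyword matching. This is an illustrative assumption, not Aura's actual implementation: the keyword list, function name, and resource text below are hypothetical.

```python
# Illustrative keyword-based crisis detection; the keyword list and
# resource text are assumptions, not Aura's actual code.
from typing import Optional

CRISIS_KEYWORDS = ("kill myself", "suicide", "self-harm", "hurt myself", "end my life")

CRISIS_MESSAGE = (
    "I'm really concerned about what you've shared. Please reach out to a "
    "crisis line (for example, 988 in the US) or a mental health professional "
    "right away. You don't have to go through this alone."
)

def check_for_crisis(message: str) -> Optional[str]:
    """Return a crisis-resource message if the text matches a crisis keyword, else None."""
    lowered = message.lower()
    if any(keyword in lowered for keyword in CRISIS_KEYWORDS):
        return CRISIS_MESSAGE
    return None
```

In a real deployment this check would run before the model generates a reply, so the safety message always takes priority over the generated text.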
## Technical Details
- **Models**: Multi-tier system (AWQ Mistral → 8-bit Mistral → DialoGPT)
- **Quantization**: AWQ 4-bit / 8-bit quantization for memory efficiency
- **Framework**: PyTorch + Transformers + BitsAndBytes
- **Interface**: Gradio with supportive UI design
- **Hosting**: Hugging Face Spaces with GPU support
- **Safety**: Built-in crisis detection and intervention
- **Memory**: Optimized for 16GB+ systems with fallbacks for smaller systems
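The 3-tier fallback above can be sketched as a try-in-order chain. The model IDs and memory figures below are illustrative assumptions (the real identifiers live in `app.py`); passing the loader in as a callable keeps the fallback logic testable without downloading any weights.

```python
# Sketch of a 3-tier model-loading fallback. Model IDs and memory figures
# are assumptions for illustration; app.py defines the real ones.
MODEL_TIERS = [
    ("TheBloke/Mistral-7B-Instruct-v0.1-AWQ", "awq"),   # ~4 GB, 4-bit AWQ
    ("mistralai/Mistral-7B-Instruct-v0.1", "8bit"),     # ~7 GB, bitsandbytes 8-bit
    ("microsoft/DialoGPT-medium", "full"),              # ~1.5 GB, no quantization
]

def load_with_fallback(loader):
    """Try each tier in order; return (model_id, model) from the first successful load.

    `loader(model_id, mode)` is any callable that raises on failure, e.g. a
    wrapper around transformers' AutoModelForCausalLM.from_pretrained.
    """
    errors = []
    for model_id, mode in MODEL_TIERS:
        try:
            return model_id, loader(model_id, mode)
        except Exception as exc:  # ImportError, OOM, download failure, ...
            errors.append((model_id, exc))
    raise RuntimeError(f"All model tiers failed: {errors}")
```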
## π¨ Recent Updates (v2.0)
### Fixed Critical Issues:
- ✅ **Dependency Installation**: Resolved AWQ/autoawq build failures
- ✅ **Memory Management**: Added an 8-bit quantization fallback system
- ✅ **Token Calculation**: Fixed the "max_new_tokens must be greater than 0" error
- ✅ **Context Handling**: Limited context to 1024 tokens to prevent overflow
- ✅ **Model Loading**: Intelligent 3-tier fallback system
- ✅ **Attention Masks**: Proper handling to eliminate warnings
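The token-calculation and context-handling fixes can be sketched as two small helpers. The 1024-token context cap comes from the changelog; the 2048-token total budget and the function names are assumptions for illustration.

```python
# Sketch of the token-budget fixes described above. The 1024-token context
# cap is from the changelog; the 2048 total budget is an assumed value.
MAX_CONTEXT_TOKENS = 1024
TOTAL_TOKEN_BUDGET = 2048

def trim_history(token_ids):
    """Keep only the most recent MAX_CONTEXT_TOKENS tokens of conversation history."""
    return token_ids[-MAX_CONTEXT_TOKENS:]

def compute_max_new_tokens(prompt_len, desired=256):
    """Clamp max_new_tokens so it never drops to 0 (or below) for long prompts."""
    remaining = TOTAL_TOKEN_BUDGET - prompt_len
    return max(1, min(desired, remaining))
```

Without the `max(1, ...)` clamp, a prompt near the budget would yield `max_new_tokens = 0` and trigger exactly the error mentioned above.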
### Performance Improvements:
- **Model Selection**: AWQ (4GB) → 8-bit (7GB) → DialoGPT (1.5GB)
- **Memory Efficiency**: Up to 75% memory reduction with quantization
- **Reliability**: Guaranteed to work with progressive fallbacks
- **Compatibility**: Optimized for HuggingFace Spaces deployment
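The "up to 75%" figure follows directly from weight precision: 4-bit weights take a quarter of the space of 16-bit weights. A quick back-of-the-envelope check (weights only; activations and quantization overhead are ignored, so real savings are somewhat lower):

```python
def weight_bytes(num_params, bits):
    """Approximate weight-only memory footprint in bytes (ignores activations/overhead)."""
    return num_params * bits / 8

# A 7B-parameter model, fp16 vs 4-bit AWQ:
fp16_gb = weight_bytes(7e9, 16) / 1e9   # 14.0 GB
awq_gb = weight_bytes(7e9, 4) / 1e9     # 3.5 GB
reduction = 1 - awq_gb / fp16_gb        # 0.75, i.e. the "up to 75%" above
```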
## Installation Options
### Option 1: HuggingFace Spaces (Recommended)
```bash
# Current requirements.txt is optimized for HF Spaces
# System automatically selects best available model
```
### Option 2: Local Development (Full AWQ Support)
```bash
# Staged installation to avoid dependency conflicts
./install_local.sh # Linux/Mac
# or
install_local.bat # Windows
```
### Option 3: Manual Installation
```bash
# Core dependencies first
pip install "torch>=2.0.0,<2.2.0" "transformers>=4.35.0,<4.40.0" "accelerate>=0.20.0"
# Quantization support
pip install "bitsandbytes>=0.39.0"
# Interface
pip install "gradio>=3.50.0,<4.0.0"
# Optional: AWQ support (local only)
pip install "autoawq>=0.1.8"
```
## License
MIT License