yipfram committed on
Commit b00f1d1 · verified · 1 parent: 5cd7e58

Upload 14 files

Files changed (14)
  1. .env.example +8 -0
  2. .gitignore +73 -0
  3. DEPLOY_INSTRUCTIONS.md +80 -0
  4. README.md +235 -13
  5. README_HF.md +48 -0
  6. README_HUGGINGFACE.md +92 -0
  7. app.py +18 -64
  8. app_config.py +26 -0
  9. app_original.py +534 -0
  10. chat_interface.py +288 -0
  11. config.py +16 -0
  12. impacts_tracker.py +40 -0
  13. mistral_client.py +116 -0
  14. requirements.txt +4 -1
.env.example ADDED
@@ -0,0 +1,8 @@
+ # Example environment variables
+ # Copy this file to .env and fill in your actual values
+
+ # Mistral AI API Key
+ MISTRAL_API_KEY=your_mistral_api_key_here
+
+ # Optional: Default model
+ DEFAULT_MODEL=mistral-tiny
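These variables are consumed at startup roughly as sketched below; this mirrors the `load_dotenv()` call in `app_original.py` from this commit, but the `DEFAULT_MODEL` fallback value is an assumption taken from this `.env.example`, not a documented setting.

```python
# Sketch of consuming the .env values above; mirrors load_dotenv() in
# app_original.py. The DEFAULT_MODEL fallback is an assumption based on
# this .env.example.
import os

try:
    from dotenv import load_dotenv  # from the python-dotenv package
    load_dotenv()  # reads .env from the project root, if present
except ImportError:
    pass  # fall back to variables already exported in the shell

api_key = os.getenv("MISTRAL_API_KEY")  # required, no safe default
default_model = os.getenv("DEFAULT_MODEL", "mistral-tiny")  # optional override
```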
.gitignore ADDED
@@ -0,0 +1,73 @@
+ # Python
+ __pycache__/
+ *.py[cod]
+ *$py.class
+ *.so
+ .Python
+ build/
+ develop-eggs/
+ dist/
+ downloads/
+ eggs/
+ .eggs/
+ lib/
+ lib64/
+ parts/
+ sdist/
+ var/
+ wheels/
+ *.egg-info/
+ .installed.cfg
+ *.egg
+ MANIFEST
+
+ # Virtual environments
+ .env
+ .venv
+ env/
+ venv/
+ ENV/
+ env.bak/
+ venv.bak/
+
+ # Environment variables
+ .env
+ .env.local
+ .env.development.local
+ .env.test.local
+ .env.production.local
+
+ # IDE
+ .vscode/
+ .idea/
+ *.swp
+ *.swo
+ *~
+
+ # Gradio
+ gradio_cached_examples/
+ flagged/
+ gradio.db
+
+ # Logs
+ *.log
+ logs/
+
+ # OS generated files
+ .DS_Store
+ .DS_Store?
+ ._*
+ .Spotlight-V100
+ .Trashes
+ ehthumbs.db
+ Thumbs.db
+
+ # Jupyter Notebook
+ .ipynb_checkpoints
+
+ # pyenv
+ .python-version
+
+ # Temporary files
+ *.tmp
+ *.temp
DEPLOY_INSTRUCTIONS.md ADDED
@@ -0,0 +1,80 @@
+ # Instructions for deploying to Hugging Face Spaces
+
+ ## Prerequisites
+ 1. A Hugging Face account: [huggingface.co](https://huggingface.co)
+ 2. Git installed on your machine
+
+ ## Deployment steps
+
+ ### 1. Create a new Space
+ 1. Go to [huggingface.co/new-space](https://huggingface.co/new-space)
+ 2. Give your Space a name (e.g. `ecologits-chat`)
+ 3. Choose `Gradio` as the SDK
+ 4. Select `Public` or `Private` according to your preferences
+ 5. Click "Create Space"
+
+ ### 2. Clone the repository
+ ```bash
+ git clone https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME
+ cd YOUR_SPACE_NAME
+ ```
+
+ ### 3. Copy the files
+ Copy all of these files into your Space's folder:
+ - `app.py`
+ - `chat_interface.py`
+ - `mistral_client.py`
+ - `impacts_tracker.py`
+ - `config.py`
+ - `requirements.txt`
+ - `README.md` (or rename `README_HF.md` to `README.md`)
+
+ ### 4. Create the Hugging Face configuration file
+ Create a `README.md` file with the Hugging Face metadata:
+
+ ```yaml
+ ---
+ title: EcoLogits Chat
+ emoji: 🌱
+ colorFrom: green
+ colorTo: blue
+ sdk: gradio
+ sdk_version: 4.44.0
+ app_file: app.py
+ pinned: false
+ license: mit
+ ---
+
+ # 🌱 EcoLogits Chat
+
+ [Your README content...]
+ ```
+
+ ### 5. Push to Hugging Face
+ ```bash
+ git add .
+ git commit -m "Initial commit - EcoLogits Chat"
+ git push
+ ```
+
+ ### 6. Configure the application (optional)
+ If you want to use your own API key by default:
+ 1. Go to your Space's Settings
+ 2. Add an environment variable: `MISTRAL_API_KEY`
+ 3. Set your API key as its value
+ 4. Restart the Space
+
+ ## ✅ Result
+ Your application will be available at:
+ `https://huggingface.co/spaces/YOUR_USERNAME/YOUR_SPACE_NAME`
+
+ Users will be able to:
+ - Use the app directly without forking
+ - Provide their own Mistral API key
+ - See the environmental impact of their conversations
+ - Choose among all available Mistral models
+
+ ## 🔒 Security
+ - Users' API keys are never stored
+ - They exist only for the duration of the active session
+ - No data is shared with third parties
README.md CHANGED
@@ -1,13 +1,235 @@
- ---
- title: Ecologits Chat
- emoji: 💬
- colorFrom: yellow
- colorTo: purple
- sdk: gradio
- sdk_version: 5.0.1
- app_file: app.py
- pinned: false
- license: apache-2.0
- ---
-
- An example chatbot using [Gradio](https://gradio.app), [`huggingface_hub`](https://huggingface.co/docs/huggingface_hub/v0.22.2/en/index), and the [Hugging Face Inference API](https://huggingface.co/docs/api-inference/index).
+ # EcoLogits - AI Environmental Impact Tracker
+
+ A clean, modular chat application that tracks the environmental impact of AI model usage with the **EcoLogits** library and **Mistral AI** models.
+
+ ## 🌱 Overview
+
+ This application provides a user-friendly interface for chatting with Mistral AI models while tracking their environmental impact in real time. It displays the energy consumption and carbon footprint of each conversation, promoting awareness of AI's environmental costs.
+
+ ## ✨ Features
+
+ - **Clean Chat Interface**: Simple Gradio-based web interface with tabbed impact tracking
+ - **Environmental Tracking**: Real-time tracking of energy usage and carbon emissions using EcoLogits
+ - **Smart Model Filtering**: Automatically filters to show only text-to-text capable models (excludes OCR, embedding, and vision models)
+ - **Tabbed Impact Visualization**:
+   - 🎯 **Last Message**: Impact of the last request, with warnings/errors
+   - 📈 **Total Session**: Cumulative environmental metrics across the entire session
+ - **Warning System**: Shows EcoLogits warnings and errors conditionally in the UI
+ - **Modular Architecture**: Clean separation of concerns across multiple files
+ - **Session Management**: Clear the conversation and reset impacts with the "Effacer tout" button
+
+ ## 🏗️ Architecture
+
+ The application is organized into focused, reusable modules:
+
+ ```
+ ├── app.py               # Main application entry point
+ ├── config.py            # Configuration and environment variables
+ ├── mistral_client.py    # Mistral API wrapper with EcoLogits integration
+ ├── impacts_tracker.py   # Environmental impact tracking and calculations
+ ├── chat_interface.py    # Gradio UI components and user interaction
+ └── app_original.py      # Backup of the original monolithic version
+ ```
+
+ ### Module Responsibilities
+
+ - **`config.py`**: Centralized configuration management, environment variables, fallback models
+ - **`mistral_client.py`**: Handles Mistral API communication, model filtering, EcoLogits integration
+ - **`impacts_tracker.py`**: Tracks and calculates cumulative environmental impacts
+ - **`chat_interface.py`**: Manages the Gradio web interface with the tabbed impact display
+ - **`app.py`**: Simple entry point that ties everything together
+
+ ## 🚀 Setup
+
+ ### Prerequisites
+
+ - Python 3.8+
+ - A Mistral AI API key
+
+ ### Installation
+
+ 1. **Clone or download the project files**
+
+ 2. **Create a virtual environment:**
+    ```bash
+    python -m venv .venv
+    ```
+
+ 3. **Activate the virtual environment:**
+    ```bash
+    # Windows (PowerShell)
+    .venv\Scripts\Activate.ps1
+
+    # Linux/Mac
+    source .venv/bin/activate
+    ```
+
+ 4. **Install dependencies:**
+    ```bash
+    pip install gradio mistralai ecologits python-dotenv
+    ```
+
+ 5. **Set up environment variables:**
+
+    Create a `.env` file in the project root:
+    ```env
+    MISTRAL_API_KEY=your_mistral_api_key_here
+    ```
+
+    Or set the environment variable directly:
+    ```bash
+    # Windows (PowerShell)
+    $env:MISTRAL_API_KEY="your_mistral_api_key_here"
+
+    # Linux/Mac
+    export MISTRAL_API_KEY="your_mistral_api_key_here"
+    ```
+
+ ### Getting a Mistral API Key
+
+ 1. Visit the [Mistral AI Console](https://console.mistral.ai/)
+ 2. Create an account or sign in
+ 3. Navigate to the API Keys section
+ 4. Create a new API key
+ 5. Copy the key and add it to your `.env` file
+
+ ## 🎯 Usage
+
+ ### Running the Application
+
+ 1. **Activate your virtual environment:**
+    ```bash
+    .venv\Scripts\Activate.ps1
+    ```
+
+ 2. **Run the application:**
+    ```bash
+    python app.py
+    ```
+
+ 3. **Open your browser** at the displayed URL (typically `http://127.0.0.1:7860`)
+
+ ### Using the Interface
+
+ 1. **Select a Model**: Choose from the available text-to-text Mistral models (automatically filtered)
+ 2. **Start Chatting**: Enter your message and click Send
+ 3. **Monitor Impact**: Use the tabs to view different impact metrics:
+    - **🎯 Dernier message**: Impact of your last request, plus any EcoLogits warnings/errors
+    - **📈 Session totale**: Cumulative impact across the entire conversation
+ 4. **Track Progress**: See real-time updates of energy and carbon consumption
+ 5. **Clear Session**: Use "🗑️ Effacer tout" to start fresh and reset all impacts
+
+ ## 📊 Environmental Metrics
+
+ The application tracks and displays:
+
+ - **Energy Consumption**: Measured in kilowatt-hours (kWh)
+ - **Carbon Emissions**: CO2-equivalent emissions (kgCO2eq)
+ - **Cumulative Totals**: Running totals across the entire session
+ - **Per-Message Impact**: Individual impact of each AI response
+
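The per-message vs. cumulative split above can be sketched as follows. This is a deliberately simplified stand-in for `impacts_tracker.py` (whose contents are not shown in this commit beyond its diff stats), assuming each response reports single energy and carbon values rather than the min/max ranges EcoLogits actually produces:

```python
# Simplified stand-in for the session accounting described above, assuming
# each response reports single energy (kWh) and carbon (kgCO2eq) values;
# the real impacts_tracker.py works on EcoLogits min/max ranges.
class SessionImpacts:
    def __init__(self):
        self.energy_kwh = 0.0     # cumulative energy for the session
        self.gwp_kg = 0.0         # cumulative CO2-equivalent for the session
        self.generation_count = 0

    def add(self, energy_kwh, gwp_kg):
        """Record one AI response; returns the per-message impact."""
        self.energy_kwh += energy_kwh
        self.gwp_kg += gwp_kg
        self.generation_count += 1
        return energy_kwh, gwp_kg

    def reset(self):
        """Backs an 'Effacer tout'-style session reset."""
        self.__init__()
```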
+ ## 🔧 Configuration
+
+ ### Environment Variables
+
+ | Variable | Description | Default |
+ |----------|-------------|---------|
+ | `MISTRAL_API_KEY` | Your Mistral AI API key | *Required* |
+
+ ### Model Selection
+
+ The application automatically:
+ - Fetches available models from the Mistral API
+ - Filters to show only text-to-text capable models (using the `completion_chat` capability)
+ - Falls back to predefined models if the API is unavailable
+
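That filter-plus-fallback behavior can be sketched like this; the dict-shaped model records are an illustration only, not the exact objects returned by the `mistralai` SDK:

```python
# Hypothetical sketch of the completion_chat filter with graceful fallback.
# The dict-based model records are illustrative; the real code inspects the
# mistralai SDK's model listing.
FALLBACK_MODELS = ["mistral-small-latest", "mistral-large-latest", "open-mistral-nemo"]

def filter_chat_models(models):
    """Keep only models that advertise text-to-text chat capability."""
    chat_models = [
        m["id"] for m in models
        if m.get("capabilities", {}).get("completion_chat", False)
    ]
    # Graceful degradation when the API returns nothing usable
    return chat_models or list(FALLBACK_MODELS)
```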
+ ### Fallback Models
+
+ If the Mistral API is unavailable, the application uses these models:
+ - `mistral-small-latest`
+ - `mistral-large-latest`
+ - `open-mistral-nemo`
+
+ ## 🛠️ Development
+
+ ### Code Structure
+
+ The codebase follows clean architecture principles:
+
+ ```python
+ # Clean separation of concerns
+ config.py           # Configuration management
+ mistral_client.py   # External API integration
+ impacts_tracker.py  # Business logic for tracking
+ chat_interface.py   # UI presentation layer
+ app.py              # Application composition
+ ```
+
+ ### Key Design Decisions
+
+ 1. **Model Filtering**: Only shows models with the `completion_chat: true` capability
+ 2. **Error Handling**: Graceful degradation when APIs are unavailable
+ 3. **Environmental Focus**: EcoLogits warnings are displayed prominently in the UI
+ 4. **Modular Design**: Each file has a single, clear responsibility
+
+ ### Adding New Features
+
+ The modular structure makes it easy to extend:
+
+ - **New Providers**: Add to `mistral_client.py` or create similar modules
+ - **UI Changes**: Modify `chat_interface.py`
+ - **Tracking Logic**: Extend `impacts_tracker.py`
+ - **Configuration**: Add to `config.py`
+
+ ## 🌍 About EcoLogits
+
+ [EcoLogits](https://ecologits.ai/) is a Python library that tracks the environmental impact of AI model usage. It provides:
+
+ - Real-time energy consumption tracking
+ - Carbon footprint calculations
+ - Support for multiple AI providers
+ - Detailed impact reporting
+
+ This application demonstrates responsible AI usage by making environmental costs visible and measurable.
+
+ ## 🐛 Troubleshooting
+
+ ### Common Issues
+
+ 1. **"Invalid model" errors**:
+    - Fixed! The app now filters out incompatible models automatically
+    - Check the console output to see which models are included/excluded
+
+ 2. **Clear button not working**:
+    - Fixed! The "Effacer tout" button now properly resets the conversation and impacts
+
+ 3. **Missing tabs**:
+    - Fixed! The tabbed interface is restored, with "Dernier message" and "Session totale"
+
+ 4. **API connection issues**:
+    - Check your `MISTRAL_API_KEY` in the `.env` file
+    - The app will fall back to predefined models if the API is unavailable
+
+ ### Recent Improvements
+
+ - ✅ **Fixed Model Filtering**: No more "Invalid model" errors from OCR/specialized models
+ - ✅ **Restored Tabbed Interface**: Full impact tracking with organized tabs
+ - ✅ **Fixed Clear Button**: "Effacer tout" now properly resets the conversation and impacts
+ - ✅ **Enhanced Debugging**: Visible model-filtering process with inclusion/exclusion lists
+
+ ## 📄 License
+
+ This project is provided as-is for educational and development purposes.
+
+ ## 🤝 Contributing
+
+ Feel free to fork, modify, and improve this application. The modular structure makes it easy to:
+
+ - Add new AI providers
+ - Enhance the UI
+ - Improve environmental tracking
+ - Add new features
+
+ ---
+
+ **Made with 🌱 for sustainable AI development**
README_HF.md ADDED
@@ -0,0 +1,48 @@
+ # 🌱 EcoLogits Chat - Eco-Friendly Chat with Mistral
+
+ A chat application that uses the Mistral AI API while monitoring the environmental impact of each generation with [EcoLogits](https://ecologits.ai/).
+
+ ## 🚀 Usage on Hugging Face
+
+ ### 1. Get your Mistral API key
+ 1. Create an account at [console.mistral.ai](https://console.mistral.ai)
+ 2. Generate a new API key in your dashboard
+ 3. Copy your API key
+
+ ### 2. Use the application
+ 1. Paste your API key into the dedicated field
+ 2. Choose your preferred Mistral model
+ 3. Start chatting while monitoring your environmental impact!
+
+ ## 📊 Features
+
+ - **Smart chat**: Talk to Mistral AI models
+ - **Environmental tracking**: Measure the energy consumption and carbon footprint of each message
+ - **Intuitive interface**: A simple, user-friendly Gradio interface
+ - **Secure**: Your API key is never stored; it exists only during your session
+
+ ## 🔒 Security
+
+ - Your API keys are never saved
+ - API keys are only used during your active session
+ - No data is shared with third parties
+
+ ## 💚 Environmental Impact
+
+ This application uses [EcoLogits](https://ecologits.ai/) to track:
+ - **Energy consumption** (kWh), per message and cumulative
+ - **Carbon footprint** (kgCO2eq), per message and cumulative
+ - **Warnings** about environmental impact
+
+ ## 🛠️ Local Installation
+
+ To run the application locally:
+
+ ```bash
+ pip install -r requirements.txt
+ python app.py
+ ```
+
+ ## 📝 License
+
+ MIT License - See the LICENSE file for details.
README_HUGGINGFACE.md ADDED
@@ -0,0 +1,92 @@
+ ---
+ title: EcoLogits Chat
+ emoji: 🌱
+ colorFrom: green
+ colorTo: blue
+ sdk: gradio
+ sdk_version: 4.44.0
+ app_file: app.py
+ pinned: false
+ license: mit
+ tags:
+   - mistral
+   - ecologits
+   - environment
+   - sustainability
+   - ai-chat
+ ---
+
+ # 🌱 EcoLogits Chat - Eco-Friendly Chat with Mistral
+
+ A chat application that uses the Mistral AI API while monitoring the environmental impact of each generation with [EcoLogits](https://ecologits.ai/).
+
+ ## 🚀 Usage
+
+ ### 🔑 Get your Mistral API key
+ 1. Create an account at [console.mistral.ai](https://console.mistral.ai)
+ 2. Generate a new API key in your dashboard
+ 3. Copy your API key
+ 4. Paste it into the dedicated field in the application
+
+ ### 💬 Start chatting
+ 1. Choose your preferred Mistral model
+ 2. Type your message
+ 3. Monitor your environmental impact in real time!
+
+ ## 📊 Features
+
+ - **Smart chat**: Talk to Mistral AI models (50+ models available)
+ - **Environmental tracking**: Measure energy consumption and carbon footprint
+ - **Intuitive interface**: A simple, user-friendly Gradio interface
+ - **Secure**: Your API key is never stored
+ - **Real time**: Impact displayed per message and cumulatively
+
+ ## 🔒 Security & Privacy
+
+ - ✅ **No storage**: Your API keys are never saved
+ - ✅ **Session only**: API keys exist only during your session
+ - ✅ **No sharing**: No data is shared with third parties
+ - ✅ **Open code**: The source code is fully visible and auditable
+
+ ## 💚 Tracked Environmental Impact
+
+ This application uses [EcoLogits](https://ecologits.ai/) to measure:
+
+ ### Per message:
+ - **Energy consumption** (kWh)
+ - **Carbon footprint** (kgCO2eq)
+
+ ### Session totals:
+ - **Total energy** consumed
+ - **Total carbon** emitted
+ - **Number of generations**
+
+ ### Alerts:
+ - **Warnings** about impact
+ - Measurement **errors** (if applicable)
+
+ ## 🤖 Supported Models
+
+ The application automatically detects all available Mistral models, including:
+ - mistral-tiny, mistral-small, mistral-medium
+ - mistral-large (latest versions)
+ - codestral (for code)
+ - pixtral (multimodal)
+ - And many more!
+
+ ## 🛠️ Local Installation
+
+ ```bash
+ git clone <this-repo>
+ cd ecologitets
+ pip install -r requirements.txt
+ python app.py
+ ```
+
+ ## 📄 License
+
+ MIT License - Free to use and modify.
+
+ ---
+
+ 💡 **Tip**: Use smaller models (such as mistral-tiny) to reduce your environmental impact!
app.py CHANGED
@@ -1,64 +1,18 @@
- import gradio as gr
- from huggingface_hub import InferenceClient
-
- """
- For more information on `huggingface_hub` Inference API support, please check the docs: https://huggingface.co/docs/huggingface_hub/v0.22.2/en/guides/inference
- """
- client = InferenceClient("HuggingFaceH4/zephyr-7b-beta")
-
-
- def respond(
-     message,
-     history: list[tuple[str, str]],
-     system_message,
-     max_tokens,
-     temperature,
-     top_p,
- ):
-     messages = [{"role": "system", "content": system_message}]
-
-     for val in history:
-         if val[0]:
-             messages.append({"role": "user", "content": val[0]})
-         if val[1]:
-             messages.append({"role": "assistant", "content": val[1]})
-
-     messages.append({"role": "user", "content": message})
-
-     response = ""
-
-     for message in client.chat_completion(
-         messages,
-         max_tokens=max_tokens,
-         stream=True,
-         temperature=temperature,
-         top_p=top_p,
-     ):
-         token = message.choices[0].delta.content
-
-         response += token
-         yield response
-
-
- """
- For information on how to customize the ChatInterface, peruse the gradio docs: https://www.gradio.app/docs/chatinterface
- """
- demo = gr.ChatInterface(
-     respond,
-     additional_inputs=[
-         gr.Textbox(value="You are a friendly Chatbot.", label="System message"),
-         gr.Slider(minimum=1, maximum=2048, value=512, step=1, label="Max new tokens"),
-         gr.Slider(minimum=0.1, maximum=4.0, value=0.7, step=0.1, label="Temperature"),
-         gr.Slider(
-             minimum=0.1,
-             maximum=1.0,
-             value=0.95,
-             step=0.05,
-             label="Top-p (nucleus sampling)",
-         ),
-     ],
- )
-
-
- if __name__ == "__main__":
-     demo.launch()
 
+ from chat_interface import ChatInterface
+
+ def main():
+     """Launch the chat application"""
+     # Create and launch the interface
+     chat_app = ChatInterface()
+     interface = chat_app.create_interface()
+
+     print("🌱 Launching EcoLogits Chat...")
+     interface.launch(
+         share=False,
+         server_name="0.0.0.0",  # Allow external connections for Hugging Face
+         server_port=7860,       # Standard Hugging Face port
+         inbrowser=False         # Don't auto-open a browser on the server
+     )
+
+ if __name__ == "__main__":
+     main()
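Both the removed `respond()` above and the new modular code flatten the chat history into API messages before each call. The sketch below follows the tuple-based history shape of the old `respond()` signature; it is illustrative, not the exact `chat_interface.py` implementation:

```python
# Sketch of the history flattening both app versions perform before a chat
# call; the tuple-based history follows the removed respond() signature.
def build_messages(message, history, system_message="You are a friendly Chatbot."):
    messages = [{"role": "system", "content": system_message}]
    for user_turn, assistant_turn in history:
        if user_turn:
            messages.append({"role": "user", "content": user_turn})
        if assistant_turn:
            messages.append({"role": "assistant", "content": assistant_turn})
    messages.append({"role": "user", "content": message})
    return messages
```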
app_config.py ADDED
@@ -0,0 +1,26 @@
+ import os
+
+ # Hugging Face Spaces configuration (user-facing strings are in French)
+ TITLE = "🌱 EcoLogits Chat - Chat Écologique avec Mistral"
+ DESCRIPTION = """
+ # 🌱 Chat Écologique avec Mistral
+
+ 💬 Discutez avec l'IA tout en surveillant l'impact environnemental de vos conversations.
+
+ ⚠️ **Important** : Vous devez fournir votre propre clé API Mistral pour utiliser cette application.
+
+ ## 🔑 Comment obtenir votre clé API :
+ 1. Créez un compte sur [console.mistral.ai](https://console.mistral.ai)
+ 2. Générez une clé API dans votre tableau de bord
+ 3. Collez votre clé dans le champ ci-dessous
+
+ ## 📊 Fonctionnalités :
+ - ✅ Chat avec les modèles Mistral AI
+ - ✅ Suivi de la consommation d'énergie (kWh)
+ - ✅ Mesure de l'empreinte carbone (kgCO2eq)
+ - ✅ Interface sécurisée (clé API non stockée)
+ """
+
+ # For Hugging Face Spaces metadata
+ TAGS = ["mistral", "ecologits", "environment", "chat", "ai"]
+ THUMBNAIL = None  # Add a thumbnail URL if you have one
app_original.py ADDED
@@ -0,0 +1,534 @@
+ import os
+ from dotenv import load_dotenv
+ from ecologits import EcoLogits
+ from mistralai import Mistral
+ import gradio as gr
+
+ # Load environment variables from .env file
+ load_dotenv()
+
+ # Initialize EcoLogits
+ EcoLogits.init(providers=["mistralai"])
+
+ client = Mistral(api_key=os.getenv("MISTRAL_API_KEY"))
+
+ # Initialize cumulative impacts storage
+ class CumulativeImpacts:
+     def __init__(self):
+         self.reset()
+
+     def reset(self):
+         self.energy_min = 0.0
+         self.energy_max = 0.0
+         self.energy_unit = "kWh"
+         self.gwp_min = 0.0
+         self.gwp_max = 0.0
+         self.gwp_unit = "kgCO2eq"
+         self.adpe_min = 0.0
+         self.adpe_max = 0.0
+         self.adpe_unit = "kgSbeq"
+         self.pe_min = 0.0
+         self.pe_max = 0.0
+         self.pe_unit = "kWh"
+         self.usage_energy_min = 0.0
+         self.usage_energy_max = 0.0
+         self.usage_gwp_min = 0.0
+         self.usage_gwp_max = 0.0
+         self.embodied_gwp = 0.0
+         self.embodied_gwp_unit = "kgCO2eq"
+         self.embodied_adpe = 0.0
+         self.embodied_adpe_unit = "kgSbeq"
+         self.embodied_pe = 0.0
+         self.embodied_pe_unit = "kWh"
+         self.generation_count = 0
+
+         # Store previous values for delta calculation
+         self.prev_energy_min = 0.0
+         self.prev_energy_max = 0.0
+         self.prev_gwp_min = 0.0
+         self.prev_gwp_max = 0.0
+         self.prev_adpe_min = 0.0
+         self.prev_adpe_max = 0.0
+         self.prev_pe_min = 0.0
+         self.prev_pe_max = 0.0
+         self.prev_usage_energy_min = 0.0
+         self.prev_usage_energy_max = 0.0
+         self.prev_usage_gwp_min = 0.0
+         self.prev_usage_gwp_max = 0.0
+         self.prev_embodied_gwp = 0.0
+         self.prev_embodied_adpe = 0.0
+         self.prev_embodied_pe = 0.0
+
+     def get_generation_delta(self):
+         """Get the delta between current and previous totals (impact of last generation)"""
+         return {
+             'energy_min': self.energy_min - self.prev_energy_min,
+             'energy_max': self.energy_max - self.prev_energy_max,
+             'energy_unit': self.energy_unit,
+             'gwp_min': self.gwp_min - self.prev_gwp_min,
+             'gwp_max': self.gwp_max - self.prev_gwp_max,
+             'gwp_unit': self.gwp_unit,
+             'adpe_min': self.adpe_min - self.prev_adpe_min,
+             'adpe_max': self.adpe_max - self.prev_adpe_max,
+             'adpe_unit': self.adpe_unit,
+             'pe_min': self.pe_min - self.prev_pe_min,
+             'pe_max': self.pe_max - self.prev_pe_max,
+             'pe_unit': self.pe_unit,
+             'usage_energy_min': self.usage_energy_min - self.prev_usage_energy_min,
+             'usage_energy_max': self.usage_energy_max - self.prev_usage_energy_max,
+             'usage_gwp_min': self.usage_gwp_min - self.prev_usage_gwp_min,
+             'usage_gwp_max': self.usage_gwp_max - self.prev_usage_gwp_max,
+             'embodied_gwp': float(self.embodied_gwp) - float(self.prev_embodied_gwp),
+             'embodied_gwp_unit': self.embodied_gwp_unit,
+             'embodied_adpe': float(self.embodied_adpe) - float(self.prev_embodied_adpe),
+             'embodied_adpe_unit': self.embodied_adpe_unit,
+             'embodied_pe': float(self.embodied_pe) - float(self.prev_embodied_pe),
+             'embodied_pe_unit': self.embodied_pe_unit
+         }
+
+     def add_impacts(self, impacts):
+         # Store previous values before updating
+         self.prev_energy_min = self.energy_min
+         self.prev_energy_max = self.energy_max
+         self.prev_gwp_min = self.gwp_min
+         self.prev_gwp_max = self.gwp_max
+         self.prev_adpe_min = self.adpe_min
+         self.prev_adpe_max = self.adpe_max
+         self.prev_pe_min = self.pe_min
+         self.prev_pe_max = self.pe_max
+         self.prev_usage_energy_min = self.usage_energy_min
+         self.prev_usage_energy_max = self.usage_energy_max
+         self.prev_usage_gwp_min = self.usage_gwp_min
+         self.prev_usage_gwp_max = self.usage_gwp_max
+         # Convert values to float to avoid type issues
+         self.prev_embodied_gwp = float(self.embodied_gwp) if self.embodied_gwp is not None else 0.0
+         self.prev_embodied_adpe = float(self.embodied_adpe) if self.embodied_adpe is not None else 0.0
+         self.prev_embodied_pe = float(self.embodied_pe) if self.embodied_pe is not None else 0.0
+
+         # Update current values
+         self.energy_min += impacts.energy.value.min
+         self.energy_max += impacts.energy.value.max
+         self.energy_unit = impacts.energy.unit
+
+         self.gwp_min += impacts.gwp.value.min
+         self.gwp_max += impacts.gwp.value.max
+         self.gwp_unit = impacts.gwp.unit
+
+         self.adpe_min += impacts.adpe.value.min
+         self.adpe_max += impacts.adpe.value.max
+         self.adpe_unit = impacts.adpe.unit
+
+         self.pe_min += impacts.pe.value.min
+         self.pe_max += impacts.pe.value.max
+         self.pe_unit = impacts.pe.unit
+
+         self.usage_energy_min += impacts.usage.energy.value.min
+         self.usage_energy_max += impacts.usage.energy.value.max
+         self.usage_gwp_min += impacts.usage.gwp.value.min
+         self.usage_gwp_max += impacts.usage.gwp.value.max
+
+         # Handle embodied impacts - they might be RangeValue objects or simple floats
+         if hasattr(impacts.embodied.gwp.value, 'min'):
+             self.embodied_gwp += impacts.embodied.gwp.value.min  # Use min if it's a RangeValue
+         else:
+             self.embodied_gwp += impacts.embodied.gwp.value  # Use direct value if it's a float
+         self.embodied_gwp_unit = impacts.embodied.gwp.unit
+
+         if hasattr(impacts.embodied.adpe.value, 'min'):
+             self.embodied_adpe += impacts.embodied.adpe.value.min
+         else:
+             self.embodied_adpe += impacts.embodied.adpe.value
+         self.embodied_adpe_unit = impacts.embodied.adpe.unit
+
+         if hasattr(impacts.embodied.pe.value, 'min'):
+             self.embodied_pe += impacts.embodied.pe.value.min
+         else:
+             self.embodied_pe += impacts.embodied.pe.value
+         self.embodied_pe_unit = impacts.embodied.pe.unit
+
+         self.generation_count += 1
+
+ # Global cumulative impacts instance
+ cumulative_impacts = CumulativeImpacts()
+ def get_response(message, history, model):
+     message_aggregate = []
+     if history:
+         message_aggregate.extend(history)
+     message_aggregate.append({"role": "user", "content": message})
+     print(message_aggregate)
+
+     # Use the selected model or default to mistral-tiny
+     if not model:
+         model = "mistral-tiny"
+
+     print(f"Using model: {model}")
+     response = client.chat.complete(
+         messages=message_aggregate,
+         model=model
+     )
+     if response.impacts.has_warnings:  # type: ignore
+         for w in response.impacts.warnings:  # type: ignore
+             print(w)
+
+     if response.impacts.has_errors:  # type: ignore
+         for e in response.impacts.errors:  # type: ignore
+             print(e)
+
+     # Get the impacts and format them
+     impacts = response.impacts  # type: ignore
+
+     # Add impacts to cumulative total
+     cumulative_impacts.add_impacts(impacts)
+
+     # Get the delta for this generation
+     generation_delta = cumulative_impacts.get_generation_delta()
+
+     message_aggregate.append({
+         "role": "assistant",
+         "content": response.choices[0].message.content
+     })
+
+     # Return current generation impacts, generation delta, and cumulative totals
+     return (
+         message_aggregate,
+         # Current generation impacts (raw from API)
+         impacts.energy.value.min,
+         impacts.energy.value.max,
+         impacts.energy.unit,
+         impacts.gwp.value.min,
+         impacts.gwp.value.max,
+         impacts.gwp.unit,
+         f"{impacts.adpe.value.min:.2e}",
+         f"{impacts.adpe.value.max:.2e}",
+         impacts.adpe.unit,
+         impacts.pe.value.min,
+         impacts.pe.value.max,
+         impacts.pe.unit,
+         impacts.usage.energy.value.min,
+         impacts.usage.energy.value.max,
+         impacts.usage.gwp.value.min,
+         impacts.usage.gwp.value.max,
+         f"{impacts.embodied.gwp.value:.2e}",
+         impacts.embodied.gwp.unit,
+         f"{impacts.embodied.adpe.value:.2e}",
+         impacts.embodied.adpe.unit,
+         f"{impacts.embodied.pe.value:.2e}",
+         impacts.embodied.pe.unit,
+         # Generation delta (calculated difference)
+         generation_delta['energy_min'],
+         generation_delta['energy_max'],
+         generation_delta['energy_unit'],
+         generation_delta['gwp_min'],
+         generation_delta['gwp_max'],
+         generation_delta['gwp_unit'],
+         f"{generation_delta['adpe_min']:.2e}",
+         f"{generation_delta['adpe_max']:.2e}",
+         generation_delta['adpe_unit'],
+         generation_delta['pe_min'],
+         generation_delta['pe_max'],
+         generation_delta['pe_unit'],
+         generation_delta['usage_energy_min'],
+         generation_delta['usage_energy_max'],
+         generation_delta['usage_gwp_min'],
+         generation_delta['usage_gwp_max'],
+         f"{generation_delta['embodied_gwp']:.2e}",
+         generation_delta['embodied_gwp_unit'],
+         f"{generation_delta['embodied_adpe']:.2e}",
+         generation_delta['embodied_adpe_unit'],
+         f"{generation_delta['embodied_pe']:.2e}",
+         generation_delta['embodied_pe_unit'],
+         # Cumulative impacts
+         cumulative_impacts.energy_min,
+         cumulative_impacts.energy_max,
+         cumulative_impacts.energy_unit,
+         cumulative_impacts.gwp_min,
+         cumulative_impacts.gwp_max,
+         cumulative_impacts.gwp_unit,
+         f"{cumulative_impacts.adpe_min:.2e}",
+         f"{cumulative_impacts.adpe_max:.2e}",
+         cumulative_impacts.adpe_unit,
+         cumulative_impacts.pe_min,
+         cumulative_impacts.pe_max,
+         cumulative_impacts.pe_unit,
+         cumulative_impacts.usage_energy_min,
+         cumulative_impacts.usage_energy_max,
+         cumulative_impacts.usage_gwp_min,
+         cumulative_impacts.usage_gwp_max,
+         f"{cumulative_impacts.embodied_gwp:.2e}",
+         cumulative_impacts.embodied_gwp_unit,
+         f"{cumulative_impacts.embodied_adpe:.2e}",
+         cumulative_impacts.embodied_adpe_unit,
+         f"{cumulative_impacts.embodied_pe:.2e}",
+         cumulative_impacts.embodied_pe_unit,
+         cumulative_impacts.generation_count
264
+ )
265
+
266
+ with gr.Blocks() as demo:
267
+ gr.Markdown("# 🌱 EcoLogits Chat - Chat écologique avec Mistral")
268
+ gr.Markdown("Discutez avec l'IA tout en surveillant l'impact environnemental de chaque requête")
269
+
270
+ with gr.Row():
271
+ with gr.Column(scale=3):
272
+ chatbot = gr.Chatbot(
273
+ type="messages",
274
+ height=600,
275
+ show_label=False
276
+ )
277
+ with gr.Row():
278
+ prompt = gr.Textbox(
279
+ placeholder="Tapez votre message ici...",
280
+ scale=4,
281
+ show_label=False,
282
+ max_lines=3
283
+ )
284
+ send_btn = gr.Button("Envoyer", variant="primary", scale=1)
285
+ with gr.Row():
286
+ # Get available models from Mistral API
287
+ try:
288
+ available_models = client.models.list()
289
+ model_choices = [model.id for model in available_models.data] if available_models.data else []
290
+ except Exception as e:
291
+ print(f"Warning: Could not fetch models from API: {e}")
292
+ model_choices = ["mistral-tiny", "mistral-small", "mistral-medium"]
293
+
294
+ if not model_choices:
295
+ model_choices = ["mistral-tiny", "mistral-small", "mistral-medium"]
296
+
297
+ model = gr.Dropdown(
298
+ choices=model_choices,
299
+ value=model_choices[0] if model_choices else "mistral-tiny",
300
+ label="Modèle Mistral",
301
+ scale=1
302
+ )
303
+
304
+ with gr.Column(scale=2):
305
+ gr.Markdown("## 📊 Impact Environnemental")
306
+
307
+ # Generation counter
308
+ generation_counter = gr.Number(label="🔢 Nombre de générations", precision=0, interactive=False)
309
+
310
+ # Tabs for current vs cumulative impacts
311
+ with gr.Tabs():
312
+ with gr.TabItem("🎯 Dernier message"):
313
+ gr.Markdown("### � **Coût calculé de cette requête spécifique**")
314
+ gr.Markdown("*Différence entre le total avant et après cette génération*")
315
+
316
+ with gr.Accordion("⚡ Consommation d'énergie", open=True):
317
+ with gr.Row():
318
+ delta_energy_min = gr.Number(label="Min", precision=8, interactive=False)
319
+ delta_energy_max = gr.Number(label="Max", precision=8, interactive=False)
320
+ delta_energy_unit = gr.Textbox(label="Unité", interactive=False)
321
+
322
+ with gr.Accordion("🌍 Empreinte carbone (GWP)", open=True):
323
+ with gr.Row():
324
+ delta_gwp_min = gr.Number(label="Min", precision=8, interactive=False)
325
+ delta_gwp_max = gr.Number(label="Max", precision=8, interactive=False)
326
+ delta_gwp_unit = gr.Textbox(label="Unité", interactive=False)
327
+
328
+ with gr.Accordion("⛏️ Épuisement des ressources (ADPe)", open=False):
329
+ with gr.Row():
330
+ delta_adpe_min = gr.Textbox(label="Min", interactive=False)
331
+ delta_adpe_max = gr.Textbox(label="Max", interactive=False)
332
+ delta_adpe_unit = gr.Textbox(label="Unité", interactive=False)
333
+
334
+ with gr.Accordion("🔋 Énergie primaire (PE)", open=False):
335
+ with gr.Row():
336
+ delta_pe_min = gr.Number(label="Min", precision=8, interactive=False)
337
+ delta_pe_max = gr.Number(label="Max", precision=8, interactive=False)
338
+ delta_pe_unit = gr.Textbox(label="Unité", interactive=False)
339
+
340
+ with gr.Accordion("💻 Impact d'usage", open=False):
341
+ gr.Markdown("**Énergie**")
342
+ with gr.Row():
343
+ delta_usage_energy_min = gr.Number(label="Min", precision=8, interactive=False)
344
+ delta_usage_energy_max = gr.Number(label="Max", precision=8, interactive=False)
345
+ gr.Markdown("**GWP**")
346
+ with gr.Row():
347
+ delta_usage_gwp_min = gr.Number(label="Min", precision=8, interactive=False)
348
+ delta_usage_gwp_max = gr.Number(label="Max", precision=8, interactive=False)
349
+
350
+ with gr.Accordion("🏭 Impact incorporé", open=False):
351
+ delta_embodied_gwp = gr.Textbox(label="GWP", interactive=False)
352
+ delta_embodied_gwp_unit = gr.Textbox(label="Unité GWP", interactive=False)
353
+ delta_embodied_adpe = gr.Textbox(label="ADPe", interactive=False)
354
+ delta_embodied_adpe_unit = gr.Textbox(label="Unité ADPe", interactive=False)
355
+ delta_embodied_pe = gr.Textbox(label="PE", interactive=False)
356
+ delta_embodied_pe_unit = gr.Textbox(label="Unité PE", interactive=False)
357
+
358
+ with gr.TabItem("📈 Chat complet"):
359
+ gr.Markdown("### 🔧 **Données directes de l'API EcoLogits**")
360
+
361
+ with gr.Accordion("⚡ Consommation d'énergie", open=True):
362
+ with gr.Row():
363
+ energy_min = gr.Number(label="Min", precision=6, interactive=False)
364
+ energy_max = gr.Number(label="Max", precision=6, interactive=False)
365
+ energy_unit = gr.Textbox(label="Unité", interactive=False)
366
+
367
+ with gr.Accordion("🌍 Empreinte carbone (GWP)", open=True):
368
+ with gr.Row():
369
+ gwp_min = gr.Number(label="Min", precision=6, interactive=False)
370
+ gwp_max = gr.Number(label="Max", precision=6, interactive=False)
371
+ gwp_unit = gr.Textbox(label="Unité", interactive=False)
372
+
373
+ with gr.Accordion("⛏️ Épuisement des ressources (ADPe)", open=False):
374
+ with gr.Row():
375
+ adpe_min = gr.Textbox(label="Min", interactive=False)
376
+ adpe_max = gr.Textbox(label="Max", interactive=False)
377
+ adpe_unit = gr.Textbox(label="Unité", interactive=False)
378
+
379
+ with gr.Accordion("🔋 Énergie primaire (PE)", open=False):
380
+ with gr.Row():
381
+ pe_min = gr.Number(label="Min", precision=6, interactive=False)
382
+ pe_max = gr.Number(label="Max", precision=6, interactive=False)
383
+ pe_unit = gr.Textbox(label="Unité", interactive=False)
384
+
385
+ with gr.Accordion("💻 Impact d'usage", open=False):
386
+ gr.Markdown("**Énergie**")
387
+ with gr.Row():
388
+ usage_energy_min = gr.Number(label="Min", precision=6, interactive=False)
389
+ usage_energy_max = gr.Number(label="Max", precision=6, interactive=False)
390
+ gr.Markdown("**GWP**")
391
+ with gr.Row():
392
+ usage_gwp_min = gr.Number(label="Min", precision=6, interactive=False)
393
+ usage_gwp_max = gr.Number(label="Max", precision=6, interactive=False)
394
+
395
+ with gr.Accordion("🏭 Impact incorporé", open=False):
396
+ embodied_gwp = gr.Textbox(label="GWP", interactive=False)
397
+ embodied_gwp_unit = gr.Textbox(label="Unité GWP", interactive=False)
398
+ embodied_adpe = gr.Textbox(label="ADPe", interactive=False)
399
+ embodied_adpe_unit = gr.Textbox(label="Unité ADPe", interactive=False)
400
+ embodied_pe = gr.Textbox(label="PE", interactive=False)
401
+ embodied_pe_unit = gr.Textbox(label="Unité PE", interactive=False)
402
+
403
+ with gr.TabItem("📊 Total session"):
404
+ with gr.Accordion("⚡ Consommation d'énergie totale", open=True):
405
+ with gr.Row():
406
+ total_energy_min = gr.Number(label="Min", precision=6, interactive=False)
407
+ total_energy_max = gr.Number(label="Max", precision=6, interactive=False)
408
+ total_energy_unit = gr.Textbox(label="Unité", interactive=False)
409
+
410
+ with gr.Accordion("🌍 Empreinte carbone totale (GWP)", open=True):
411
+ with gr.Row():
412
+ total_gwp_min = gr.Number(label="Min", precision=6, interactive=False)
413
+ total_gwp_max = gr.Number(label="Max", precision=6, interactive=False)
414
+ total_gwp_unit = gr.Textbox(label="Unité", interactive=False)
415
+
416
+ with gr.Accordion("⛏️ Épuisement des ressources totale (ADPe)", open=False):
417
+ with gr.Row():
418
+ total_adpe_min = gr.Textbox(label="Min", interactive=False)
419
+ total_adpe_max = gr.Textbox(label="Max", interactive=False)
420
+ total_adpe_unit = gr.Textbox(label="Unité", interactive=False)
421
+
422
+ with gr.Accordion("🔋 Énergie primaire totale (PE)", open=False):
423
+ with gr.Row():
424
+ total_pe_min = gr.Number(label="Min", precision=6, interactive=False)
425
+ total_pe_max = gr.Number(label="Max", precision=6, interactive=False)
426
+ total_pe_unit = gr.Textbox(label="Unité", interactive=False)
427
+
428
+ with gr.Accordion("💻 Impact d'usage total", open=False):
429
+ gr.Markdown("**Énergie**")
430
+ with gr.Row():
431
+ total_usage_energy_min = gr.Number(label="Min", precision=6, interactive=False)
432
+ total_usage_energy_max = gr.Number(label="Max", precision=6, interactive=False)
433
+ gr.Markdown("**GWP**")
434
+ with gr.Row():
435
+ total_usage_gwp_min = gr.Number(label="Min", precision=6, interactive=False)
436
+ total_usage_gwp_max = gr.Number(label="Max", precision=6, interactive=False)
437
+
438
+ with gr.Accordion("🏭 Impact incorporé total", open=False):
439
+ total_embodied_gwp = gr.Textbox(label="GWP", interactive=False)
440
+ total_embodied_gwp_unit = gr.Textbox(label="Unité GWP", interactive=False)
441
+ total_embodied_adpe = gr.Textbox(label="ADPe", interactive=False)
442
+ total_embodied_adpe_unit = gr.Textbox(label="Unité ADPe", interactive=False)
443
+ total_embodied_pe = gr.Textbox(label="PE", interactive=False)
444
+ total_embodied_pe_unit = gr.Textbox(label="Unité PE", interactive=False)
445
+
446
+ def handle_message(message, history, model):
447
+ if not message.strip():
448
+ return history, "", *([None]*68) # Updated for new output count with delta values
449
+
450
+ result = get_response(message, history, model)
451
+ return result[0], "", *result[1:]
452
+
453
+ def clear_chat():
454
+ """Clear the chat and reset cumulative impacts"""
455
+ cumulative_impacts.reset()
456
+ return [], *([None]*68) # Clear chatbot and all impact displays
457
+
458
+ # Connect both prompt submit and button click
459
+ for trigger in [prompt.submit, send_btn.click]:
460
+ trigger(
461
+ handle_message,
462
+ inputs=[prompt, chatbot, model],
463
+ outputs=[chatbot, prompt,
464
+ # Current generation impacts (raw API data)
465
+ energy_min, energy_max, energy_unit,
466
+ gwp_min, gwp_max, gwp_unit,
467
+ adpe_min, adpe_max, adpe_unit,
468
+ pe_min, pe_max, pe_unit,
469
+ usage_energy_min, usage_energy_max,
470
+ usage_gwp_min, usage_gwp_max,
471
+ embodied_gwp, embodied_gwp_unit,
472
+ embodied_adpe, embodied_adpe_unit,
473
+ embodied_pe, embodied_pe_unit,
474
+ # Generation delta (calculated difference)
475
+ delta_energy_min, delta_energy_max, delta_energy_unit,
476
+ delta_gwp_min, delta_gwp_max, delta_gwp_unit,
477
+ delta_adpe_min, delta_adpe_max, delta_adpe_unit,
478
+ delta_pe_min, delta_pe_max, delta_pe_unit,
479
+ delta_usage_energy_min, delta_usage_energy_max,
480
+ delta_usage_gwp_min, delta_usage_gwp_max,
481
+ delta_embodied_gwp, delta_embodied_gwp_unit,
482
+ delta_embodied_adpe, delta_embodied_adpe_unit,
483
+ delta_embodied_pe, delta_embodied_pe_unit,
484
+ # Cumulative impacts
485
+ total_energy_min, total_energy_max, total_energy_unit,
486
+ total_gwp_min, total_gwp_max, total_gwp_unit,
487
+ total_adpe_min, total_adpe_max, total_adpe_unit,
488
+ total_pe_min, total_pe_max, total_pe_unit,
489
+ total_usage_energy_min, total_usage_energy_max,
490
+ total_usage_gwp_min, total_usage_gwp_max,
491
+ total_embodied_gwp, total_embodied_gwp_unit,
492
+ total_embodied_adpe, total_embodied_adpe_unit,
493
+ total_embodied_pe, total_embodied_pe_unit,
494
+ generation_counter]
495
+ )
496
+
497
+ # Connect clear button
498
+ chatbot.clear(
499
+ clear_chat,
500
+ outputs=[chatbot,
501
+ # Current generation impacts (raw API data)
502
+ energy_min, energy_max, energy_unit,
503
+ gwp_min, gwp_max, gwp_unit,
504
+ adpe_min, adpe_max, adpe_unit,
505
+ pe_min, pe_max, pe_unit,
506
+ usage_energy_min, usage_energy_max,
507
+ usage_gwp_min, usage_gwp_max,
508
+ embodied_gwp, embodied_gwp_unit,
509
+ embodied_adpe, embodied_adpe_unit,
510
+ embodied_pe, embodied_pe_unit,
511
+ # Generation delta (calculated difference)
512
+ delta_energy_min, delta_energy_max, delta_energy_unit,
513
+ delta_gwp_min, delta_gwp_max, delta_gwp_unit,
514
+ delta_adpe_min, delta_adpe_max, delta_adpe_unit,
515
+ delta_pe_min, delta_pe_max, delta_pe_unit,
516
+ delta_usage_energy_min, delta_usage_energy_max,
517
+ delta_usage_gwp_min, delta_usage_gwp_max,
518
+ delta_embodied_gwp, delta_embodied_gwp_unit,
519
+ delta_embodied_adpe, delta_embodied_adpe_unit,
520
+ delta_embodied_pe, delta_embodied_pe_unit,
521
+ # Cumulative impacts
522
+ total_energy_min, total_energy_max, total_energy_unit,
523
+ total_gwp_min, total_gwp_max, total_gwp_unit,
524
+ total_adpe_min, total_adpe_max, total_adpe_unit,
525
+ total_pe_min, total_pe_max, total_pe_unit,
526
+ total_usage_energy_min, total_usage_energy_max,
527
+ total_usage_gwp_min, total_usage_gwp_max,
528
+ total_embodied_gwp, total_embodied_gwp_unit,
529
+ total_embodied_adpe, total_embodied_adpe_unit,
530
+ total_embodied_pe, total_embodied_pe_unit,
531
+ generation_counter]
532
+ )
533
+
534
+ demo.launch()
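The `get_generation_delta` pattern above amounts to snapshotting the cumulative total before each generation and reporting the difference afterwards. A minimal, self-contained sketch of that idea (the class and field names here are illustrative, not the actual `CumulativeImpacts` implementation):

```python
class CumulativeEnergy:
    """Illustrative tracker: running total plus per-generation delta."""

    def __init__(self):
        self.total = 0.0
        self.last_delta = 0.0

    def add(self, energy_kwh):
        # Snapshot the total before adding, then report the difference
        before = self.total
        self.total += energy_kwh
        self.last_delta = self.total - before

tracker = CumulativeEnergy()
tracker.add(0.002)
tracker.add(0.003)
# tracker.last_delta now holds only the cost of the most recent generation
```

The same snapshot/diff trick extends to every metric tracked above (GWP, ADPe, PE, usage, embodied).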
chat_interface.py ADDED
@@ -0,0 +1,288 @@
+import gradio as gr
+from mistral_client import MistralChatClient
+from impacts_tracker import ImpactsTracker
+from config import DEFAULT_MODEL, MISTRAL_API_KEY, ALLOW_USER_API_KEY
+
+class ChatInterface:
+    """Simple chat interface with environmental impact tracking"""
+
+    def __init__(self):
+        # Try to initialize with default API key
+        self.client = None
+        self.tracker = ImpactsTracker()
+        self.user_api_key = None
+
+        # Initialize client if we have a default API key
+        if MISTRAL_API_KEY:
+            try:
+                self.client = MistralChatClient()
+            except Exception as e:
+                print(f"Warning: Could not initialize with default API key: {e}")
+
+    def update_api_key(self, api_key):
+        """Update or set the API key"""
+        if not api_key or not api_key.strip():
+            return "❌ Veuillez fournir une clé API valide"
+
+        try:
+            if self.client:
+                self.client.update_api_key(api_key.strip())
+            else:
+                self.client = MistralChatClient(api_key=api_key.strip())
+
+            self.user_api_key = api_key.strip()
+            return "✅ Clé API mise à jour avec succès"
+        except Exception as e:
+            return f"❌ Erreur lors de la mise à jour de la clé API: {str(e)}"
+
+    def process_message(self, message, history, model, api_key_input=None):
+        """Process a chat message and return updated conversation with impacts"""
+        # Check if we need to update the API key
+        if api_key_input and api_key_input.strip():
+            if not self.client or self.user_api_key != api_key_input.strip():
+                update_result = self.update_api_key(api_key_input.strip())
+                if "❌" in update_result:
+                    return history, "", *self._get_empty_impacts(), update_result
+
+        # Check if we have a client initialized
+        if not self.client:
+            error_msg = "❌ Veuillez fournir votre clé API Mistral ci-dessus pour commencer."
+            return history, "", *self._get_empty_impacts(), error_msg
+
+        if not message.strip():
+            current_status = "✅ Clé API valide" if self.client else "❌ Clé API requise"
+            return history, "", *self._get_empty_impacts(), current_status
+
+        # Build message history
+        messages = []
+        if history:
+            messages.extend(history)
+        messages.append({"role": "user", "content": message})
+
+        # Get response from Mistral
+        result = self.client.chat(messages, model or DEFAULT_MODEL)
+
+        # Update conversation history
+        messages.append({
+            "role": "assistant",
+            "content": result['content']
+        })
+
+        # Track environmental impacts
+        if result['impacts']:
+            self.tracker.add_impact(result['impacts'])
+
+        # Get impact summary
+        impacts = self.tracker.get_summary()
+
+        # Format warnings and errors for display
+        warnings_text = ""
+        errors_text = ""
+
+        if result['warnings']:
+            warnings_text = "\n".join([f"⚠️ {w}" for w in result['warnings']])
+
+        if result['errors']:
+            errors_text = "\n".join([f"❌ {e}" for e in result['errors']])
+
+        return (
+            messages,  # Updated conversation
+            "",  # Clear input
+            f"{impacts['last_energy']:.6f}",  # Last message energy
+            f"{impacts['last_carbon']:.6f}",  # Last message carbon
+            f"{impacts['total_energy']:.6f}",  # Total energy
+            f"{impacts['total_carbon']:.6f}",  # Total carbon
+            impacts['generation_count'],  # Message count
+            warnings_text,  # Warnings
+            errors_text,  # Errors
+            "✅ Clé API valide"  # API key status
+        )
+
+    def clear_conversation(self):
+        """Clear conversation and reset impacts"""
+        self.tracker.reset()
+        return (
+            [],  # Empty conversation
+            "",  # Clear input
+            "0.000000",  # Last energy
+            "0.000000",  # Last carbon
+            "0.000000",  # Total energy
+            "0.000000",  # Total carbon
+            0,  # Message count
+            "",  # Clear warnings
+            ""  # Clear errors
+        )
+
+    def _get_empty_impacts(self):
+        """Return empty values for impact displays"""
+        return [
+            "0.000000",  # Last energy
+            "0.000000",  # Last carbon
+            "0.000000",  # Total energy
+            "0.000000",  # Total carbon
+            0,  # Message count
+            "",  # Clear warnings
+            ""  # Clear errors
+        ]
+
+    def create_interface(self):
+        """Create and configure the Gradio interface"""
+        with gr.Blocks(title="🌱 EcoLogits Chat") as interface:
+            gr.Markdown("# 🌱 Chat Écologique avec Mistral")
+            gr.Markdown("💬 Discutez avec l'IA en surveillant l'impact environnemental")
+
+            # API Key input section (always visible for Hugging Face users)
+            with gr.Group(visible=True):
+                gr.Markdown("## 🔑 Configuration de la clé API")
+                gr.Markdown("Pour utiliser cette application, vous devez fournir votre propre clé API Mistral:")
+                gr.Markdown("1. 📝 Créez un compte sur [console.mistral.ai](https://console.mistral.ai)")
+                gr.Markdown("2. 🔑 Générez une clé API dans votre tableau de bord")
+                gr.Markdown("3. 📋 Collez votre clé ci-dessous")
+
+                with gr.Row():
+                    api_key_input = gr.Textbox(
+                        label="🔐 Clé API Mistral",
+                        type="password",
+                        placeholder="Collez votre clé API Mistral ici...",
+                        scale=4
+                    )
+                    api_key_status = gr.Textbox(
+                        label="Status",
+                        value="❌ Clé API requise",
+                        interactive=False,
+                        scale=2
+                    )
+
+            with gr.Row():
+                # Chat section
+                with gr.Column(scale=3):
+                    chatbot = gr.Chatbot(
+                        type="messages",
+                        height=600,
+                        show_label=False,
+                        placeholder="💬 Votre conversation apparaîtra ici..."
+                    )
+
+                    with gr.Row():
+                        message_input = gr.Textbox(
+                            placeholder="✍️ Tapez votre message...",
+                            scale=4,
+                            show_label=False,
+                            max_lines=3
+                        )
+                        send_button = gr.Button("📤 Envoyer", variant="primary", scale=1)
+
+                    # Get available models or use fallback
+                    available_models = []
+                    if self.client and hasattr(self.client, 'available_models'):
+                        available_models = self.client.available_models
+                    else:
+                        from config import FALLBACK_MODELS
+                        available_models = FALLBACK_MODELS
+
+                    model_selector = gr.Dropdown(
+                        choices=available_models,
+                        value=available_models[0] if available_models else DEFAULT_MODEL,
+                        label="🤖 Modèle Mistral",
+                        info="Choisissez le modèle à utiliser"
+                    )
+
+                # Impact tracking section with tabs
+                with gr.Column(scale=2):
+                    gr.Markdown("## 📊 Impact Environnemental")
+
+                    # Generation counter
+                    message_count = gr.Number(
+                        label="🔢 Nombre de générations",
+                        value=0,
+                        interactive=False,
+                        precision=0
+                    )
+
+                    # Tabs for different impact views
+                    with gr.Tabs():
+                        with gr.TabItem("🎯 Dernier message"):
+                            gr.Markdown("### ⚡ **Impact de cette requête spécifique**")
+
+                            # Warnings and Errors section
+                            with gr.Group():
+                                warnings_display = gr.Textbox(
+                                    label="⚠️ Avertissements EcoLogits",
+                                    value="",
+                                    interactive=False,
+                                    max_lines=3,
+                                    placeholder="Aucun avertissement"
+                                )
+                                errors_display = gr.Textbox(
+                                    label="❌ Erreurs EcoLogits",
+                                    value="",
+                                    interactive=False,
+                                    max_lines=3,
+                                    placeholder="Aucune erreur"
+                                )
+
+                            with gr.Accordion("⚡ Consommation d'énergie", open=True):
+                                last_energy = gr.Textbox(
+                                    label="Énergie (kWh)",
+                                    value="0.000000",
+                                    interactive=False
+                                )
+
+                            with gr.Accordion("🌍 Empreinte carbone", open=True):
+                                last_carbon = gr.Textbox(
+                                    label="Carbone (kgCO2eq)",
+                                    value="0.000000",
+                                    interactive=False
+                                )
+
+                        with gr.TabItem("📈 Session totale"):
+                            gr.Markdown("### **Cumul de toute la session**")
+
+                            with gr.Accordion("⚡ Consommation d'énergie totale", open=True):
+                                total_energy = gr.Textbox(
+                                    label="Énergie totale (kWh)",
+                                    value="0.000000",
+                                    interactive=False
+                                )
+
+                            with gr.Accordion("🌍 Empreinte carbone totale", open=True):
+                                total_carbon = gr.Textbox(
+                                    label="Carbone total (kgCO2eq)",
+                                    value="0.000000",
+                                    interactive=False
+                                )
+
+                    clear_button = gr.Button("🗑️ Effacer tout", variant="secondary")
+
+            # Add some visual feedback
+            gr.Markdown("💡 *Utilisez le bouton 'Effacer tout' pour recommencer une nouvelle session*")
+
+            # Event handlers - Make sure output order matches exactly
+            outputs = [
+                chatbot,  # 0: conversation
+                message_input,  # 1: input field
+                last_energy,  # 2: last energy
+                last_carbon,  # 3: last carbon
+                total_energy,  # 4: total energy
+                total_carbon,  # 5: total carbon
+                message_count,  # 6: message count
+                warnings_display,  # 7: warnings
+                errors_display,  # 8: errors
+                api_key_status  # 9: API key status
+            ]
+
+            # Send message events
+            for trigger in [message_input.submit, send_button.click]:
+                trigger(
+                    self.process_message,
+                    inputs=[message_input, chatbot, model_selector, api_key_input],
+                    outputs=outputs
+                )
+
+            # Clear conversation event
+            clear_button.click(
+                self.clear_conversation,
+                outputs=outputs[:-1]  # Exclude API key status from clear
+            )
+
+        return interface
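Gradio matches the `outputs` list and the tuples returned by `process_message` / `clear_conversation` purely by position, so the comment-numbered `outputs` list above is what keeps them in sync. The pattern can be seen without Gradio at all (the helper names below are illustrative, not part of the app):

```python
def get_metrics():
    # Hypothetical stand-in for an impact computation
    return (0.000123, 0.000456)

def handle_message(message, history):
    history = history + [{"role": "user", "content": message}]
    metrics = get_metrics()
    # Position 0 feeds the chatbot, position 1 clears the textbox,
    # the remaining positions map onto the metric displays in order.
    return (history, "", *metrics)

out = handle_message("hi", [])
print(len(out))  # 4
```

If a display component is added to `outputs`, every return tuple (including the empty/clear variants) must grow by the same position, which is why `_get_empty_impacts` mirrors the same ordering.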
config.py ADDED
@@ -0,0 +1,16 @@
+import os
+from dotenv import load_dotenv
+
+# Load environment variables
+load_dotenv()
+
+# Configuration
+MISTRAL_API_KEY = os.getenv("MISTRAL_API_KEY")
+DEFAULT_MODEL = "mistral-tiny"
+
+# Available models fallback
+FALLBACK_MODELS = ["mistral-tiny", "mistral-small", "mistral-medium"]
+
+# App configuration - Always allow user API keys for Hugging Face deployment
+REQUIRE_API_KEY = True  # Always require API key for public deployment
+ALLOW_USER_API_KEY = True  # Always allow users to provide their own key
impacts_tracker.py ADDED
@@ -0,0 +1,40 @@
+class ImpactsTracker:
+    """Simple tracker for environmental impacts"""
+
+    def __init__(self):
+        self.reset()
+
+    def reset(self):
+        """Reset all counters to zero"""
+        self.total_energy = 0.0
+        self.total_carbon = 0.0  # GWP in CO2 equivalent
+        self.generation_count = 0
+        self.last_energy = 0.0
+        self.last_carbon = 0.0
+
+    def add_impact(self, impacts):
+        """Add new impact data from API response"""
+        # Calculate energy impact (use average of min/max)
+        energy_impact = (impacts.energy.value.min + impacts.energy.value.max) / 2
+
+        # Calculate carbon impact (use average of min/max)
+        carbon_impact = (impacts.gwp.value.min + impacts.gwp.value.max) / 2
+
+        # Store last impact for display
+        self.last_energy = energy_impact
+        self.last_carbon = carbon_impact
+
+        # Add to totals
+        self.total_energy += energy_impact
+        self.total_carbon += carbon_impact
+        self.generation_count += 1
+
+    def get_summary(self):
+        """Get a simple summary of impacts"""
+        return {
+            'last_energy': self.last_energy,
+            'last_carbon': self.last_carbon,
+            'total_energy': self.total_energy,
+            'total_carbon': self.total_carbon,
+            'generation_count': self.generation_count
+        }
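`ImpactsTracker.add_impact` collapses each EcoLogits min/max range to its midpoint. The shape of the `impacts` object it expects can be mimicked with `SimpleNamespace` for a quick check (this is a test stand-in, not the real EcoLogits type):

```python
from types import SimpleNamespace

def fake_impacts(e_min, e_max, g_min, g_max):
    """Build an object with the energy.value.min/max and gwp.value.min/max shape."""
    rng = lambda lo, hi: SimpleNamespace(value=SimpleNamespace(min=lo, max=hi))
    return SimpleNamespace(energy=rng(e_min, e_max), gwp=rng(g_min, g_max))

impacts = fake_impacts(0.0002, 0.0006, 0.0001, 0.0003)
energy_mid = (impacts.energy.value.min + impacts.energy.value.max) / 2
carbon_mid = (impacts.gwp.value.min + impacts.gwp.value.max) / 2
```

Taking the midpoint is a simplification: it discards the uncertainty range that `app_original.py` displays as separate min/max fields.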
mistral_client.py ADDED
@@ -0,0 +1,116 @@
+from mistralai import Mistral
+from ecologits import EcoLogits
+from config import MISTRAL_API_KEY, FALLBACK_MODELS
+
+class MistralChatClient:
+    """Simple wrapper for Mistral API with EcoLogits tracking"""
+
+    def __init__(self, api_key=None):
+        # Initialize EcoLogits tracking
+        EcoLogits.init(providers=["mistralai"])
+
+        # Use provided API key or fall back to config
+        self.api_key = api_key or MISTRAL_API_KEY
+
+        if not self.api_key:
+            raise ValueError("Mistral API key is required. Please provide it via parameter or environment variable.")
+
+        # Initialize Mistral client
+        self.client = Mistral(api_key=self.api_key)
+
+        # Get available models
+        self.available_models = self._get_available_models()
+
+    def update_api_key(self, new_api_key):
+        """Update the API key and reinitialize the client"""
+        if not new_api_key:
+            raise ValueError("API key cannot be empty")
+
+        self.api_key = new_api_key
+        self.client = Mistral(api_key=self.api_key)
+        self.available_models = self._get_available_models()
+        return True
+
+    def _get_available_models(self):
+        """Get list of available text-to-text models from API"""
+        try:
+            models_response = self.client.models.list()
+            if models_response.data:
+                models = []
+                excluded_models = []
+
+                for model in models_response.data:
+                    model_id = model.id
+
+                    # Check if model has capabilities and supports completion_chat
+                    if (hasattr(model, 'capabilities') and
+                            hasattr(model.capabilities, 'completion_chat') and
+                            model.capabilities.completion_chat):
+
+                        # Exclude specialized models that aren't suitable for general chat
+                        model_id_lower = model_id.lower()
+                        excluded_terms = ['ocr', 'embed', 'vision', 'classifier', 'moderation']
+
+                        if any(term in model_id_lower for term in excluded_terms):
+                            excluded_models.append(f"{model_id} (specialized model)")
+                        else:
+                            models.append(model_id)
+                    else:
+                        excluded_models.append(f"{model_id} (no chat completion capability)")
+
+                # Debug output
+                if models:
+                    print(f"✅ Found {len(models)} suitable chat models: {models}")
+                if excluded_models:
+                    print(f"⚠️ Excluded {len(excluded_models)} models: {excluded_models[:5]}")  # Show first 5
+
+                return models if models else FALLBACK_MODELS
+
+        except Exception as e:
+            print(f"Warning: Could not fetch models: {e}")
+
+        # Return fallback models if the API call fails or returns no data
+        print(f"📋 Using fallback models: {FALLBACK_MODELS}")
+        return FALLBACK_MODELS
+
+    def chat(self, messages, model):
+        """Send chat request and return response with impacts"""
+        try:
+            response = self.client.chat.complete(
+                messages=messages,
+                model=model
+            )
+
+            # Collect warnings and errors
+            warnings = []
+            errors = []
+
+            # The `impacts` attribute is attached by EcoLogits at runtime, so static
+            # type checkers don't know about it; hence the `# type: ignore` hints.
+            if response.impacts.has_warnings:  # type: ignore
+                for warning in response.impacts.warnings:  # type: ignore
+                    warnings.append(str(warning))
+                    print(f"Warning: {warning}")
+
+            if response.impacts.has_errors:  # type: ignore
+                for error in response.impacts.errors:  # type: ignore
+                    errors.append(str(error))
+                    print(f"Error: {error}")
+
+            return {
+                'content': response.choices[0].message.content,
+                'impacts': response.impacts,  # type: ignore
+                'warnings': warnings,
+                'errors': errors
+            }
+
+        except Exception as e:
+            print(f"Error during chat completion: {e}")
+            return {
+                'content': f"Erreur: {str(e)}",
+                'impacts': None,
+                'warnings': [],
+                'errors': [str(e)]
+            }
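The exclusion step in `_get_available_models` is plain substring matching on the model id. Isolated below with a hypothetical list of ids; note that id-based filtering alone would miss vision models whose id contains no excluded term, which is why the real code checks `capabilities.completion_chat` first:

```python
EXCLUDED_TERMS = ('ocr', 'embed', 'vision', 'classifier', 'moderation')

def filter_chat_models(model_ids):
    """Keep only ids that look like general-purpose chat models."""
    return [m for m in model_ids if not any(t in m.lower() for t in EXCLUDED_TERMS)]

ids = ["mistral-tiny", "mistral-ocr-latest", "mistral-embed", "mistral-moderation-latest"]
print(filter_chat_models(ids))  # ['mistral-tiny']
```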
requirements.txt CHANGED
@@ -1 +1,4 @@
-huggingface_hub==0.25.2
+gradio>=4.0.0
+mistralai
+ecologits
+python-dotenv