---
license: mit
datasets:
- Cynaptics/persona-chat
- rizalHidayat/bot-dialog
language:
- en
metrics:
- accuracy
base_model:
- distilbert/distilgpt2
pipeline_tag: text-generation
library_name: transformers
tags:
- llm
- text-generation-inference
---

# ✨DarkNeuron-AI/darkneuron-chat-v1.1

**DarkNeuron-Chat v1.1** is a chatbot designed for basic, friendly conversations. It provides clear and concise responses and is suitable for general use.

---

## 👍Model Overview

- **Model type:** GPT-based causal language model
- **Purpose:** Basic conversational chatbot
- **Training data:** Fine-tuned on the [Persona-Chat](https://huggingface.co/datasets/Cynaptics/persona-chat) and [Bot-Dialog](https://huggingface.co/datasets/rizalHidayat/bot-dialog) datasets
- **Intended audience:** General users, students, hobbyists, and researchers interested in chatbot interactions

---

## 🌟Installation

Install the latest versions of Transformers and PyTorch:

```bash
pip install --upgrade transformers torch
```

---

## 👽Example Usage

```python
import gc

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# Load tokenizer and model
model_name = "DarkNeuron-AI/darkneuron-chat-v1.1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Use GPU if available (device 0), otherwise CPU (-1)
device = 0 if torch.cuda.is_available() else -1

# Create a chatbot pipeline that returns only the newly generated text
chatbot = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    device=device,
    return_full_text=False,
)

# Optional: free unused memory before chatting
gc.collect()
if torch.cuda.is_available():
    torch.cuda.empty_cache()

# Interactive chat loop
print("Chatbot ready! Type 'exit' or 'quit' to stop.\n")

while True:
    user_input = input("User: ")
    if user_input.lower() in ["exit", "quit"]:
        print("Chat ended.")
        break

    prompt = f"User: {user_input}\nBot:"
    response = chatbot(
        prompt,
        max_new_tokens=100,  # caps the reply length, not prompt + reply
        do_sample=True,
        temperature=0.7,
        top_p=0.9,
        num_return_sequences=1,
    )
    print(response[0]["generated_text"])
```
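The `User: ...\nBot:` prompt format used in the loop above can be factored into a small helper that also carries earlier turns as context. This is a sketch only: the card does not specify an official multi-turn format, and `build_prompt` is a hypothetical name.

```python
def build_prompt(history, user_input):
    """Format chat history plus the new user message in the
    'User: ...' / 'Bot: ...' style used by the example above.

    history: list of (user_message, bot_reply) pairs from earlier turns.
    """
    lines = []
    for user_msg, bot_reply in history:
        lines.append(f"User: {user_msg}")
        lines.append(f"Bot: {bot_reply}")
    lines.append(f"User: {user_input}")
    lines.append("Bot:")
    return "\n".join(lines)

# With no history, this matches the prompt built in the loop above
print(build_prompt([], "Hello"))
```

The returned string can be passed to `chatbot(...)` exactly like `prompt` in the loop above; as the history grows, you may need to drop the oldest turns to stay within the model's context window.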

# Developed With ❤️ By DarkNeuronAI