---
language: en
license: mit
tags:
- conversational
- gpt
- chatbot
- finetune
model_name: my-dialoGPT-model
finetuned_from: microsoft/DialoGPT-small
datasets:
- custom
---
# My DialoGPT Model

This is a fine-tuned version of `microsoft/DialoGPT-small`, trained on custom data about Dominica.
## Model Details

- **Model Name**: my-dialoGPT-model
- **Base Model**: `microsoft/DialoGPT-small`
- **Training Data**: Custom dataset about Dominica
- **Evaluation**: Achieved an `eval_loss` of 12.85
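For context, a causal language model's `eval_loss` is the mean per-token cross-entropy, so the corresponding perplexity is simply `exp(eval_loss)`. A quick sketch of the conversion:

```python
import math

# eval_loss reported above (mean cross-entropy per token)
eval_loss = 12.85

# Perplexity is the exponential of the cross-entropy loss
perplexity = math.exp(eval_loss)
print(f"perplexity ≈ {perplexity:,.0f}")
```

A loss of 12.85 corresponds to a perplexity of roughly 3.8 × 10⁵, so results on out-of-domain prompts may vary.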
## Usage

To use this model, load it as follows:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and tokenizer
model_name = "unknownCode/IslandBoyRepo"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Encode the prompt, appending the end-of-sequence token as DialoGPT expects
input_text = "What is the capital of Dominica?"
inputs = tokenizer(input_text + tokenizer.eos_token, return_tensors="pt")

# Generate a response; pad_token_id is set explicitly because GPT-2-based
# tokenizers have no dedicated padding token
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    pad_token_id=tokenizer.eos_token_id,
)

# Decode only the newly generated tokens, skipping the echoed prompt
response = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
)
print(response)
```
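Since DialoGPT is a conversational model, multi-turn use follows the standard pattern of concatenating each new turn's token IDs onto the running chat history. A minimal sketch, assuming the same `unknownCode/IslandBoyRepo` checkpoint:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "unknownCode/IslandBoyRepo"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

chat_history_ids = None
for turn in ["Tell me about Dominica.", "What is its capital?"]:
    # Encode the new user input, terminated with the end-of-sequence token
    new_input_ids = tokenizer.encode(turn + tokenizer.eos_token, return_tensors="pt")

    # Append the new turn to the accumulated chat history
    bot_input_ids = (
        torch.cat([chat_history_ids, new_input_ids], dim=-1)
        if chat_history_ids is not None
        else new_input_ids
    )

    # Generate, keeping the full history for the next turn
    chat_history_ids = model.generate(
        bot_input_ids,
        max_new_tokens=50,
        pad_token_id=tokenizer.eos_token_id,
    )

    # Decode only the bot's newest reply
    reply = tokenizer.decode(
        chat_history_ids[0, bot_input_ids.shape[-1]:], skip_special_tokens=True
    )
    print(f"Bot: {reply}")
```

Because the history grows with every turn, long conversations should be truncated to stay within the model's context window.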