```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="marswallet/supportAI")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("marswallet/supportAI")
model = AutoModelForCausalLM.from_pretrained("marswallet/supportAI")
```
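A complete generation call looks like the sketch below. Since `marswallet/supportAI` is gated, this example substitutes the public `gpt2` base model as a stand-in; swap the model id back once you have gated access. The prompt and generation parameters are illustrative assumptions, not values from the model card.

```python
from transformers import pipeline

# Public "gpt2" base used as a stand-in for the gated fine-tune.
pipe = pipeline("text-generation", model="gpt2")

# Hypothetical support-style prompt; outputs will vary with sampling.
result = pipe(
    "How do I reset my password?",
    max_new_tokens=40,
    do_sample=True,
    temperature=0.7,
)
print(result[0]["generated_text"])
```

The pipeline returns a list with one dict per generated sequence; by default `generated_text` includes the original prompt followed by the continuation.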
```yaml
pipeline_tag: text-generation
library_name: transformers
tags:
  - gpt2
  - text-generation
```
# SupportAI 🤗
SupportAI is a fine-tuned GPT-2 model built to assist users with customer-support and intent-understanding tasks. It can handle short conversations, detect user intent, and provide friendly, helpful replies.
## Example

Input:
- Author: marswallet
- License: MIT
- Base Model: GPT-2
- Framework: Hugging Face Transformers
```shell
# Gated model: log in with an HF token that has gated-access permission
hf auth login
```