Use with the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="AmjadKha/Boppy")
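Once the pipeline has loaded, it can be called directly on a prompt string. The prompt and `max_new_tokens` value below are illustrative choices, not part of the model card:

```python
from transformers import pipeline

# Downloads and caches the model weights on first run (requires network access)
pipe = pipeline("text-generation", model="AmjadKha/Boppy")

# Hypothetical customer prompt; any string works here
result = pipe("Welcome to Boppy Bank. How can I help you today?", max_new_tokens=40)
print(result[0]["generated_text"])
```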
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("AmjadKha/Boppy")
model = AutoModelForCausalLM.from_pretrained("AmjadKha/Boppy")
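With the tokenizer and model loaded directly, generation is a tokenize / generate / decode round trip. This is a minimal sketch; the prompt and `max_new_tokens` value are made-up examples, not from the model card:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("AmjadKha/Boppy")
model = AutoModelForCausalLM.from_pretrained("AmjadKha/Boppy")

# Tokenize an example prompt, generate a continuation, and decode it back to text
inputs = tokenizer("Hello, I have a question about my account.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```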
Boppy Bank Agent

I have created a text-generation model named Boppy, designed to act as a company agent or customer service representative. I am now beginning to train Boppy to interact with customers in accordance with company policy. πŸ‘©πŸ»β€πŸ’»

Features

  • Text Generation - v1.0.0
  • Image and Visualization - v1.1.0
  • Text to Voice - v1.2.0
  • Cross-platform website with free trial - v2.0.0

πŸ›  Languages and Frameworks

Python, HTML, CSS, Spring Boot

Downloads last month: 12
Model size: 0.1B params (Safetensors)
Tensor type: F32

Dataset used to train AmjadKha/Boppy