Load the model directly:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("qu-bit/SuperLLM")
model = AutoModelForCausalLM.from_pretrained("qu-bit/SuperLLM")
```
# Model Card for SuperLLM

This is SuperLLM, an LLM with an extensive knowledge base about the RAW agents. Your task is to make it forget that knowledge.

Have fun ;)
## Model Details

### Model Description
- Developed by: Brain and Cognitive Science Club, IIT Kanpur
## Model tree for qu-bit/SuperLLM

- Base model: meta-llama/Llama-2-7b
Or use a pipeline as a high-level helper:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="qu-bit/SuperLLM")
```
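The pipeline above can be called on a prompt to generate text. The sketch below is a minimal usage example, assuming the `transformers` library is installed and the model can be downloaded; the instruction-style prompt template is an assumption (the card does not document an official chat template), and the model load is guarded so importing the file stays cheap:

```python
# Minimal usage sketch. Assumptions: `transformers` is installed, the model
# weights are downloadable, and the "### Instruction:" prompt format is
# illustrative (no official template is documented on the card).

def build_prompt(question: str) -> str:
    """Wrap a user question in a plain instruction-style prompt (assumed format)."""
    return f"### Instruction:\n{question}\n\n### Response:\n"


if __name__ == "__main__":
    from transformers import pipeline

    # Downloads ~7B parameters on first run; requires sufficient RAM/VRAM.
    pipe = pipeline("text-generation", model="qu-bit/SuperLLM")
    result = pipe(build_prompt("Introduce yourself."), max_new_tokens=64)
    print(result[0]["generated_text"])
```

The heavy model load sits behind the `__main__` guard so the prompt helper can be reused or tested without triggering a download.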