```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Fischerboot/thisisamodeltoo")
model = AutoModelForCausalLM.from_pretrained("Fischerboot/thisisamodeltoo")
```
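Once loaded, text is produced with `model.generate`. The sketch below shows the call pattern on a tiny, randomly initialized stand-in model so it runs without downloading weights; the config sizes and token ids are arbitrary placeholders, and for real use you would substitute the `AutoModelForCausalLM` object loaded above plus tokenized input:

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# Tiny randomly initialized stand-in (no download needed); swap in the
# merged model loaded via AutoModelForCausalLM for actual generation.
config = GPT2Config(n_layer=2, n_head=2, n_embd=32, vocab_size=128)
model = GPT2LMHeadModel(config)

input_ids = torch.tensor([[1, 2, 3]])  # placeholder token ids
output = model.generate(
    input_ids,
    max_new_tokens=5,
    do_sample=False,   # greedy decoding
    pad_token_id=0,
)
print(output.shape)  # one sequence: prompt tokens plus generated tokens
```

With a real checkpoint, the returned ids would be decoded with `tokenizer.decode(output[0], skip_special_tokens=True)`.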
# output-model-directory
This is a merge of pre-trained language models created using mergekit.
## Merge Details
### Merge Method
This model was merged using the passthrough merge method.
### Models Merged
The following models were included in the merge:
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: nidek+Fischerboot/goofyahhmodelqloraadapterandshit
merge_method: passthrough
dtype: bfloat16
```
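The configuration can also be loaded programmatically to inspect its fields before running a merge. A minimal sketch, assuming PyYAML is installed (the variable names here are illustrative):

```python
import yaml

# The mergekit configuration from this card, verbatim.
CONFIG = """\
models:
  - model: nidek+Fischerboot/goofyahhmodelqloraadapterandshit
merge_method: passthrough
dtype: bfloat16
"""

cfg = yaml.safe_load(CONFIG)
print(cfg["merge_method"])   # the merge algorithm mergekit will apply
print(cfg["dtype"])          # output tensor dtype
print(len(cfg["models"]))    # number of model entries in the merge
```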
## Model tree for Fischerboot/thisisamodeltoo

Base model: concedo/KobbleTinyV2-1.1B
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Fischerboot/thisisamodeltoo")
```