```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("ChaoticNeutrals/Infinitely-Laydiculous-9B")
model = AutoModelForCausalLM.from_pretrained("ChaoticNeutrals/Infinitely-Laydiculous-9B")
```
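For a quick smoke test after loading, here is a minimal generation sketch; the prompt and sampling settings are illustrative placeholders, not recommendations from the model authors:

```python
# Illustrative generation check; prompt and sampling values are placeholders.
inputs = tokenizer("Write a short scene introduction.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```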
Credits to @Lewdiculus for the quants and merge request: https://huggingface.co/Lewdiculous/Infinitely-Laydiculus-9b-GGUF-IQ-Imatrix
This model was merged using the passthrough merge method, which stacks layer slices from the source models end to end rather than averaging their weights.
## Models Merged

The following models were included in the merge:

- Endevor/InfinityRP-v1-7B
- l3utterfly/mistral-7b-v0.1-layla-v4
## Configuration

The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: Endevor/InfinityRP-v1-7B
        layer_range: [0, 20]
  - sources:
      - model: l3utterfly/mistral-7b-v0.1-layla-v4
        layer_range: [12, 32]
merge_method: passthrough
dtype: float16
```
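With mergekit's half-open `layer_range` convention, each slice contributes 20 layers (layers 0–19 of InfinityRP-v1-7B and layers 12–31 of layla-v4), giving a 40-layer stack, consistent with the ~9B parameter count in the model name. Assuming mergekit is installed, a configuration like this is typically applied with its `mergekit-yaml` entry point (file and output paths below are placeholders):

```sh
# Illustrative invocation; paths are placeholders.
mergekit-yaml config.yaml ./Infinitely-Laydiculous-9B --cuda
```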

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="ChaoticNeutrals/Infinitely-Laydiculous-9B")
```
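A usage sketch for the pipeline (the prompt and generation settings are placeholders):

```python
# Illustrative call; tune generation parameters to taste.
result = pipe("Write a short roleplay opener.", max_new_tokens=64, do_sample=True)
print(result[0]["generated_text"])
```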