How to use hyen/CEPED-LLaMA-2-Chat-7B with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, LlamaForCausalContextLM

tokenizer = AutoTokenizer.from_pretrained("hyen/CEPED-LLaMA-2-Chat-7B")
model = LlamaForCausalContextLM.from_pretrained("hyen/CEPED-LLaMA-2-Chat-7B")
```

Note: `LlamaForCausalContextLM` is a custom model class from the CEPED/CEPE codebase rather than a class shipped in the stock `transformers` release, so make sure that code is available in your environment before importing it this way.