Creating gguf for UlizaLlama3

#1
by MeHereDude - opened
No description provided.
MeHereDude changed pull request status to open

Add UlizaLlama3 model in GGUF format

  • Model: UlizaLlama3
  • Format: GGUF (GPT-Generated Unified Format, the successor to GGML)
  • Includes tokenizer and configuration files
  • Optimized for memory-efficient inference
  • Tested compatibility with popular GGUF frameworks
  • Updated documentation for GGUF usage
  • Added performance benchmarks and model specs
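As a quick way to verify the uploaded files, the checklist above can be paired with a header sanity check. The sketch below assumes the standard GGUF binary layout (magic bytes `GGUF`, then a little-endian `uint32` version, `uint64` tensor count, and `uint64` metadata key/value count); it is an illustration, not part of this PR's tooling, and the counts in the synthetic header are made up for the example.

```python
import struct

def read_gguf_header(data: bytes) -> dict:
    # A GGUF file begins with the 4-byte magic b"GGUF", followed by
    # version (uint32), tensor count (uint64), and metadata kv count
    # (uint64), all little-endian.
    magic, version, n_tensors, n_kv = struct.unpack_from("<4sIQQ", data, 0)
    if magic != b"GGUF":
        raise ValueError("not a GGUF file")
    return {"version": version, "tensors": n_tensors, "kv_pairs": n_kv}

# Synthetic header for illustration only (values are hypothetical):
header = struct.pack("<4sIQQ", b"GGUF", 3, 291, 24)
print(read_gguf_header(header))
```

In practice you would read the first 24 bytes of the `.gguf` file instead of a synthetic buffer.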

Closes #pr/1:refs/pr/1

MeHereDude changed pull request status to closed
