---
base_model:
- jdopensource/JoyAI-LLM-Flash
pipeline_tag: text-generation
license: other
license_name: modified-mit
license_link: https://huggingface.co/jdopensource/JoyAI-LLM-Flash/blob/main/LICENSE
---

# JoyAI-LLM-Flash-GGUF

Simple quantizations of [jdopensource/JoyAI-LLM-Flash](/jdopensource/JoyAI-LLM-Flash) using the default parameters in `llama-quantize`. Nothing fancy.
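For reference, the usual llama.cpp workflow for producing quants like these looks roughly as follows. This is a sketch, not the exact commands used for this repo: the file names, output type, and `Q4_K_M` quant level are illustrative assumptions.

```shell
# Illustrative llama.cpp quantization workflow (paths and quant type are assumptions).

# 1. Convert the Hugging Face checkpoint to a full-precision GGUF.
#    convert_hf_to_gguf.py ships with the llama.cpp repository.
python convert_hf_to_gguf.py ./JoyAI-LLM-Flash \
  --outfile JoyAI-LLM-Flash-F16.gguf \
  --outtype f16

# 2. Quantize with llama-quantize's default parameters
#    (no importance matrix, no tensor-type overrides).
llama-quantize JoyAI-LLM-Flash-F16.gguf JoyAI-LLM-Flash-Q4_K_M.gguf Q4_K_M
```

Repeating step 2 with other type names (e.g. `Q8_0`, `Q5_K_M`) produces the other quant levels.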