archaeus06/SecurityLLM-Q2_K-GGUF

Transformers
GGUF
English
security
cybersecwithai
threat
vulnerability
infosec
zysec.ai
cyber security
ai4security
llmsecurity
cyber
malware analysis
exploitdev
ai4good
aisecurity
cybersec
cybersecurity
llama-cpp
gguf-my-repo

Instructions for using archaeus06/SecurityLLM-Q2_K-GGUF with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.

  • Libraries
  • Transformers

    How to use archaeus06/SecurityLLM-Q2_K-GGUF with Transformers:

    # Load the GGUF checkpoint directly (requires the `gguf` package: pip install gguf);
    # transformers dequantizes the weights on load
    from transformers import AutoModelForCausalLM
    model = AutoModelForCausalLM.from_pretrained(
        "archaeus06/SecurityLLM-Q2_K-GGUF", gguf_file="securityllm-q2_k.gguf", dtype="auto"
    )
  • Notebooks
  • Google Colab
  • Kaggle
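Given the repo's llama-cpp and gguf-my-repo tags, the quantized file can also be run locally with llama.cpp. A minimal sketch, assuming a llama.cpp build with Hub download support; the prompt is illustrative:

```shell
# Pull securityllm-q2_k.gguf from the Hub and chat with it via llama.cpp's CLI
llama-cli --hf-repo archaeus06/SecurityLLM-Q2_K-GGUF \
          --hf-file securityllm-q2_k.gguf \
          -p "Explain what a buffer overflow is."
```

The `--hf-repo`/`--hf-file` flags cache the file under llama.cpp's local model directory, so subsequent runs skip the download.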
SecurityLLM-Q2_K-GGUF
2.72 GB
  • 1 contributor
History: 4 commits
archaeus06
Update README.md
c1dc001 verified 10 months ago
  • .gitattributes
    1.58 kB
    Upload securityllm-q2_k.gguf with huggingface_hub 10 months ago
  • README.md
    1.93 kB
    Update README.md 10 months ago
  • securityllm-q2_k.gguf
    2.72 GB
    Upload securityllm-q2_k.gguf with huggingface_hub 10 months ago