Introducing HFL — HuggingFace Local

by ggalancs


Run any of the 500,000+ HuggingFace models locally with a single CLI. Think of it as an alternative local AI server for the entire HuggingFace Hub.

What it does:

  • pip install hfl β†’ hfl pull β†’ hfl run
  • OpenAI & Ollama compatible API server (hfl serve)
  • 3 backends: llama.cpp, transformers, vLLM
  • TTS support (Bark, Coqui XTTS-v2)
  • Built-in license verification & EU AI Act compliance
  • 1900 tests, 90%+ coverage

Why? Ollama is great, but it is limited to ~500 curated models. HFL gives you access to everything
on the Hub, both GGUF and safetensors, with automatic format conversion.
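
Because hfl serve speaks the OpenAI API, existing OpenAI-style clients should work against it unchanged. A minimal sketch using only the standard library; the host, port, endpoint path, and model name are assumptions for illustration, not details confirmed by the post:

```python
import json
import urllib.request

# Assumed base URL for a local hfl serve instance (host/port are guesses).
BASE_URL = "http://localhost:8000/v1"


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions request for a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Model name is a hypothetical example from the Hub.
req = build_chat_request("TinyLlama/TinyLlama-1.1B-Chat-v1.0", "Hello!")
print(req.full_url)  # http://localhost:8000/v1/chat/completions
# With the server running, urllib.request.urlopen(req) would send it.
```

The same request shape works with the official openai Python client by setting its base_url to the local server.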

🔗 GitHub: https://github.com/ggalancs/hfl
📦 PyPI: pip install hfl
🤗 Space: https://huggingface.co/spaces/galancs/hfl

Feedback welcome! 🙏
