# requirements.txt for Darwin-4B-david
# c8a5e69: Fix gemma4 runtime error: switch to Transformers backend + Darwin-4B-David
gradio>=5.0
huggingface_hub
httpx
Pillow
uvicorn
fastapi
requests
PyMuPDF
torch>=2.4.0
# Gemma4 (model_type="gemma4") is only available in the Transformers dev branch.
# PyPI releases of transformers do NOT recognize this architecture, which is
# what caused the "The checkpoint you are trying to load has model type
# `gemma4` but Transformers does not recognize this architecture" runtime
# error. Do NOT pin a PyPI version here.
transformers @ git+https://github.com/huggingface/transformers.git
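# After installing from the dev branch, you can sanity-check that the build
# recognizes the architecture before launching the Space (illustrative check,
# not part of the install; the mapping lookup assumes the dev branch has
# registered "gemma4"):
#   python -c "from transformers.models.auto.configuration_auto import CONFIG_MAPPING; print('gemma4' in CONFIG_MAPPING)"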
accelerate>=1.0.0
sentencepiece
protobuf
openai