# Eagle 2.5 Inference Endpoint Requirements
# Note: transformers and torch are pre-installed in HF Inference containers

# For Eagle 2.5 support (needs recent transformers)
transformers>=4.45.0
torch>=2.0.0

# CRITICAL: Eagle 2.5 uses Qwen2-VL architecture
qwen-vl-utils>=0.0.8

# Video processing
opencv-python-headless>=4.8.0
av>=10.0.0

# Image processing
Pillow>=9.0.0
requests>=2.28.0

# Standard deps - pin numpy below 2.0 to avoid ABI conflicts
numpy>=1.24.0,<2.0.0
einops>=0.7.0

# For sharded model loading and device placement (device_map="auto")
accelerate>=0.25.0