openenv[core]>=0.2.0
fastapi>=0.115.0
uvicorn>=0.24.0