---
library_name: stable-baselines3
tags:
- tempcontrol
- openenv
- gymnasium
- reinforcement-learning
---
# TempControl PPO — task1
Trained with Stable-Baselines3 PPO on TempControl-OpenEnv.
## Load & Run
```python
from stable_baselines3 import PPO
# `task1_*` is a placeholder; substitute the actual task module name from this repo
from tasks.task1_* import make_env

model = PPO.load("ppo_task1")
env = make_env()

# Roll out one episode (Gymnasium API: step returns terminated and truncated flags)
obs, _ = env.reset()
terminated = truncated = False
while not (terminated or truncated):
    action, _ = model.predict(obs, deterministic=True)
    obs, reward, terminated, truncated, info = env.step(action)
```
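The same rollout pattern extends to scoring the policy over several episodes. A minimal sketch of that evaluation loop, using a stand-in environment and a fixed policy so it runs without the model weights (`DummyTempEnv` and `evaluate` are illustrative names, not part of this repo):

```python
class DummyTempEnv:
    """Stand-in with the Gymnasium reset/step signature; not the real TempControl env."""
    def __init__(self, horizon=20):
        self.horizon = horizon
        self.t = 0

    def reset(self):
        self.t = 0
        return 20.0, {}  # (observation, info)

    def step(self, action):
        self.t += 1
        reward = -abs(action)            # placeholder reward for illustration
        terminated = False               # task never solves itself here
        truncated = self.t >= self.horizon
        return 20.0, reward, terminated, truncated, {}

def evaluate(env, policy, n_episodes=5):
    """Average episodic return under `policy(obs) -> action`."""
    returns = []
    for _ in range(n_episodes):
        obs, _ = env.reset()
        total, terminated, truncated = 0.0, False, False
        while not (terminated or truncated):
            obs, reward, terminated, truncated, _ = env.step(policy(obs))
            total += reward
        returns.append(total)
    return sum(returns) / n_episodes
```

With the real model loaded, `policy=lambda obs: model.predict(obs, deterministic=True)[0]` drops in directly.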