nhero committed
Commit 4ebf6a0 · 1 Parent(s): 54f0fa0

Added CartPole-v1 model trained with PPO

Files changed (2):
  1. README.md +3 -0
  2. ppo-CartPole-v1.zip +3 -0
README.md ADDED
@@ -0,0 +1,3 @@
+---
+{}
+---
ppo-CartPole-v1.zip ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:4b3972bec8bac4c581e1587bd462c949759d9d05ce16600b33a8654dcac33cd4
+size 137447
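The zip file is not stored in the repository directly; the diff above adds a Git LFS pointer file (three `key value` lines: `version`, `oid`, `size`). A minimal sketch of parsing such a pointer with only the standard library — the helper name `parse_lfs_pointer` is ours, not part of any Git LFS tooling:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split each non-empty line of an LFS pointer on its first space
    into key/value pairs (e.g. 'size 137447' -> {'size': '137447'})."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The exact pointer contents added by this commit:
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:4b3972bec8bac4c581e1587bd462c949759d9d05ce16600b33a8654dcac33cd4
size 137447
"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # → 137447
```

When the repository is cloned with LFS enabled, the pointer is resolved to the actual 137,447-byte `ppo-CartPole-v1.zip` identified by the SHA-256 OID.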