ppo-LunarLander-v2 / my_lander_v1 /_stable_baselines3_version
Upload first version of the Lunar-Lander trained using PPO (commit 2c3ce73)
1.7.0