ppo-LunarLander-v2 / results.json
First PPO model.
{"mean_reward": 249.17106900000005, "std_reward": 16.551571054921247, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2024-03-28T09:54:52.563755"}