ppo-PandaPickAndPlace-v3 / results.json
{"mean_reward": -50.0, "std_reward": 0.0, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2024-05-14T10:45:20.315354"}