a2c-PandaReachDense-v2 / results.json
Commit 3cb0e63: Second iteration with 5 envs
{"mean_reward": -1.0665987974731252, "std_reward": 0.301957025177439, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-03-20T10:56:22.725681"}