llama-3.2-3B-lora-dummy-32 / all_results.json
{
"epoch": 0.992,
"eval_loss": 0.029802048578858376,
"eval_runtime": 1.4041,
"eval_samples_per_second": 5.698,
"eval_steps_per_second": 1.424,
"total_flos": 5.244736447827149e+16,
"train_loss": 0.033818339808813984,
"train_runtime": 413.5784,
"train_samples_per_second": 2.418,
"train_steps_per_second": 0.075
}
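
For reference, these Trainer metrics can be read back programmatically. A minimal sketch in Python, assuming the file has been downloaded locally as all_results.json (for example with huggingface_hub.hf_hub_download); the key names match the JSON above, but the local path is an assumption:

# Minimal sketch: load the metrics shown above and print a few of them.
# Assumes "all_results.json" is present in the working directory;
# it could also be fetched with huggingface_hub.hf_hub_download first.
import json

with open("all_results.json") as f:
    results = json.load(f)

print(f"epochs completed:  {results['epoch']}")
print(f"final train_loss:  {results['train_loss']:.4f}")
print(f"final eval_loss:   {results['eval_loss']:.4f}")
print(f"train runtime (s): {results['train_runtime']:.1f}")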