Adapter model shows up as failed in requests

#490
by sumo43 - opened

Hello,

I've tried submitting an adapter model twice, and it shows up in the requests as "failed". Is there a way to find out the reason?

Thank you.

Request JSON file:

/sumo43/SOLAR-10.7B-Instruct-DPO-v1.0_eval_request_False_float16_Adapter.json
```json
{
  "model": "sumo43/SOLAR-10.7B-Instruct-DPO-v1.0",
  "base_model": "upstage/SOLAR-10.7B-Instruct-v1.0",
  "revision": "main",
  "private": false,
  "precision": "float16",
  "weight_type": "Adapter",
  "status": "FAILED",
  "submitted_time": "2023-12-20T02:38:06Z",
  "model_type": "\ud83d\udd36 : fine-tuned",
  "likes": 0,
  "params": 0.7,
  "license": "cc-by-nc-4.0",
  "job_id": "185011",
  "job_start_time": "2023-12-20T13:21:57.709221"
}
```
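For anyone hitting the same thing: one way to check a submission's current status is to pull the request file straight from the leaderboard's requests dataset. The sketch below is a rough illustration only; the repo id `open-llm-leaderboard/requests` and the exact file path are assumptions based on the filename quoted above, not something confirmed in this thread.

```python
# Rough sketch only: repo id and file path are assumptions, not confirmed here.
import json

from huggingface_hub import hf_hub_download

REQUESTS_REPO = "open-llm-leaderboard/requests"  # assumed repo id
REQUEST_FILE = (
    "sumo43/SOLAR-10.7B-Instruct-DPO-v1.0_eval_request_False_float16_Adapter.json"
)

# Download the request file and print the fields most relevant to a failure.
path = hf_hub_download(repo_id=REQUESTS_REPO, filename=REQUEST_FILE, repo_type="dataset")
with open(path) as f:
    request = json.load(f)

print(request["status"])   # e.g. "FAILED"
print(request["job_id"])   # job id the maintainers can use to look up logs
```

If the status only reads FAILED with no further detail, the actual error lives in the maintainers' logs, so opening a discussion (as done here) is the way to get the reason.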

Open LLM Leaderboard org

Hi!
Looking at the logs, we had a connection problem. If you link to the request file directly, I'll be able to edit it and move your model back to pending.
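For context, "moving the model back to pending" presumably amounts to flipping the `status` field of the request file and committing it back to the requests dataset. Below is a minimal sketch of what that could look like, assuming the same (unconfirmed) repo id as above; only an account with write access to that repo could actually perform it.

```python
# Rough sketch only: the repo id is an assumption, and this requires write access.
import json

from huggingface_hub import hf_hub_download, upload_file

REQUESTS_REPO = "open-llm-leaderboard/requests"  # assumed repo id
REQUEST_FILE = (
    "sumo43/SOLAR-10.7B-Instruct-DPO-v1.0_eval_request_False_float16_Adapter.json"
)

# Fetch the failed request file, flip its status, and commit it back so the
# evaluation queue picks the model up again.
local_path = hf_hub_download(repo_id=REQUESTS_REPO, filename=REQUEST_FILE, repo_type="dataset")
with open(local_path) as f:
    request = json.load(f)

request["status"] = "PENDING"

with open("updated_request.json", "w") as f:
    json.dump(request, f, indent=2)

upload_file(
    path_or_fileobj="updated_request.json",
    path_in_repo=REQUEST_FILE,
    repo_id=REQUESTS_REPO,
    repo_type="dataset",
    commit_message="Reset sumo43/SOLAR-10.7B-Instruct-DPO-v1.0 to PENDING",
)
```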

Open LLM Leaderboard org

Done, thank you! It should go back to evaluation soon.

We're having some trouble with our cluster at the moment, so feel free to reopen this if it fails again :)

clefourrier changed discussion status to closed
