Failed to load model using open_clip.create_model_and_transforms

deleted

Hello. I used the code provided on Hugging Face to load the model and got a 404 Client Error. I found the model's bin file in the cache folder, but the error says there is no open_clip_config.json file.

Code:
import open_clip

model, preprocess_train, preprocess_val = open_clip.create_model_and_transforms('hf-hub:laion/CLIP-ViT-H-14-frozen-xlm-roberta-large-laion5B-s13B-b90k')
tokenizer = open_clip.get_tokenizer('hf-hub:laion/CLIP-ViT-H-14-frozen-xlm-roberta-large-laion5B-s13B-b90k')

Error:
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/laion/CLIP-ViT-H-14-frozen-xlm-roberta-large-laion5B-s13B-b90k/resolve/main/open_clip_config.json

LAION eV org
edited Aug 2

@jimmy-para Hmm, it looks like this model was never uploaded with the push_to_hub step that creates the correct configs, so the hf-hub loading mechanism (which sources both the config and the weights from the Hub) will not work. You can use the library's internal configs instead:

import open_clip

model, preprocess_train, preprocess_val = open_clip.create_model_and_transforms('xlm-roberta-large-ViT-H-14', pretrained='frozen_laion5b_s13b_b90k')
tokenizer = open_clip.get_tokenizer('xlm-roberta-large-ViT-H-14')

This still uses the weights from this repo, but it sources the config from within the library, whereas the hf-hub: prefix looks for the config in the Hub repository as well.
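The distinction between the two loading paths can be sketched roughly as follows (a simplified, hypothetical illustration of the resolution logic, not open_clip's actual implementation; the helper name `resolve_config_source` is made up):

```python
HF_HUB_PREFIX = "hf-hub:"

def resolve_config_source(model_name: str) -> str:
    """Sketch of where the model config is looked up for a given name."""
    if model_name.startswith(HF_HUB_PREFIX):
        repo_id = model_name[len(HF_HUB_PREFIX):]
        # With the hf-hub: prefix, BOTH config and weights come from the
        # Hub repo; this 404s if open_clip_config.json was never pushed.
        return f"https://huggingface.co/{repo_id}/resolve/main/open_clip_config.json"
    # Without the prefix, the config ships inside the library itself and
    # only the checkpoint named by `pretrained=` is downloaded.
    return f"builtin:{model_name}"

# The failing call looks up the config on the Hub:
print(resolve_config_source("hf-hub:laion/CLIP-ViT-H-14-frozen-xlm-roberta-large-laion5B-s13B-b90k"))
# The working call uses the library's built-in config:
print(resolve_config_source("xlm-roberta-large-ViT-H-14"))
```

So the workaround avoids the missing open_clip_config.json entirely by never touching the Hub for the config.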

deleted

That works! Thank you so much!
