laion/CLIP-ViT-g-14-laion2B-s12B-b42K
Organization: LAION eV
Tags: OpenCLIP, PyTorch, Safetensors, clip
Paper: arxiv:1910.04867
License: mit
Files and versions (4 contributors, 3 commits)
Latest commit e0a996a: Ross Wightman, "Update README add tokenizer/vocab/preprocessor cfg", over 2 years ago
File                          Size       Last commit message                                   Updated
.gitattributes                1.38 kB    initial commit                                        over 2 years ago
README.md                     7.24 kB    Update README add tokenizer/vocab/preprocessor cfg   over 2 years ago
config.json                   4.61 kB    Add bin files                                         over 2 years ago
open_clip_pytorch_model.bin   5.47 GB    Add bin files                                         over 2 years ago
preprocessor_config.json      316 Bytes  Update README add tokenizer/vocab/preprocessor cfg   over 2 years ago
pytorch_model.bin             5.47 GB    Add bin files                                         over 2 years ago
special_tokens_map.json       389 Bytes  Update README add tokenizer/vocab/preprocessor cfg   over 2 years ago
tokenizer.json                2.22 MB    Update README add tokenizer/vocab/preprocessor cfg   over 2 years ago
tokenizer_config.json         568 Bytes  Update README add tokenizer/vocab/preprocessor cfg   over 2 years ago
vocab.json                    862 kB     Update README add tokenizer/vocab/preprocessor cfg   over 2 years ago

Both .bin checkpoints are tracked with Git LFS and are pickled PyTorch files. The pickle scan detects three imports in open_clip_pytorch_model.bin (torch.FloatStorage, torch._utils._rebuild_tensor_v2, collections.OrderedDict) and four in pytorch_model.bin (the same three plus torch.LongStorage).
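
The repository ships the weights in two layouts: an OpenCLIP-native checkpoint (open_clip_pytorch_model.bin) and a Transformers-style layout (pytorch_model.bin with config.json, preprocessor_config.json, and the tokenizer files). A minimal loading sketch with the open_clip library is shown below; it assumes the open_clip_torch and Pillow packages are installed, that the repo resolves through open_clip's "hf-hub:" loader, and that "example.jpg" is a placeholder image path.

```python
# Minimal sketch: zero-shot image/text similarity with this checkpoint via open_clip.
# Assumes open_clip_torch and Pillow are installed; "example.jpg" is a placeholder.
import torch
import open_clip
from PIL import Image

repo_id = "hf-hub:laion/CLIP-ViT-g-14-laion2B-s12B-b42K"

# Load model weights and the matching image preprocessing transform from the Hub.
model, _, preprocess = open_clip.create_model_and_transforms(repo_id)
tokenizer = open_clip.get_tokenizer(repo_id)
model.eval()

image = preprocess(Image.open("example.jpg")).unsqueeze(0)       # (1, 3, H, W)
text = tokenizer(["a diagram", "a dog", "a cat"])                 # (3, context_len)

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    # Normalize embeddings before computing cosine-similarity logits.
    image_features = image_features / image_features.norm(dim=-1, keepdim=True)
    text_features = text_features / text_features.norm(dim=-1, keepdim=True)
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

print(probs)  # probability of each caption matching the image
```

Alternatively, since pytorch_model.bin, config.json, and the tokenizer/preprocessor configs follow the standard Transformers CLIP layout, loading through transformers' CLIPModel and CLIPProcessor with the repo id laion/CLIP-ViT-g-14-laion2B-s12B-b42K should also be possible, although that path is not spelled out in this file listing.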