AIDC-AI/Ovis-Clip-Qwen1_5-7B
Pipeline: Image-Text-to-Text
Library: Transformers (PyTorch)
Dataset: AIDC-AI/Ovis-dataset
Tags: ovis, text-generation, MLLM, conversational, custom_code
arXiv: 2405.20797
License: apache-2.0
Files and versions (1 contributor, 2 commits)
Latest commit: a2db297 "initial commit" by runninglsy, 5 months ago
File                                Size
.gitattributes                      1.52 kB
README.md                           6.19 kB
added_tokens.json                   80 Bytes
base_visual_tokenizer.py            5.87 kB
clip_visual_tokenizer.py            5.9 kB
config.json                         6.8 kB
configuration_ovis.py               1.9 kB
conversation_formatter.py           7.16 kB
generation_config.json              276 Bytes
merges.txt                          1.67 MB
modeling_ovis.py                    15.9 kB
preprocessor_config.json            781 Bytes
pytorch_model-00001-of-00004.bin    4.99 GB   (pickle, LFS)
pytorch_model-00002-of-00004.bin    4.98 GB   (pickle, LFS)
pytorch_model-00003-of-00004.bin    4.23 GB   (pickle, LFS)
pytorch_model-00004-of-00004.bin    3.19 GB   (pickle, LFS)
pytorch_model.bin.index.json        80.1 kB
special_tokens_map.json             367 Bytes
tokenizer.json                      7.03 MB
tokenizer_config.json               1.3 kB
utils.py                            920 Bytes
visual_tokenizer.py                 11.6 kB
vocab.json                          2.78 MB

All files were added in the initial commit, 5 months ago. The four pytorch_model-0000N-of-00004.bin shards are pickle-serialized checkpoints stored via Git LFS; the detected pickle imports in each are collections.OrderedDict, torch._utils._rebuild_tensor_v2, and torch.BFloat16Storage.