Latest commit: Update README.md (1695e45, verified)
Files (file names other than model.pt were not preserved in this listing):
- 1.52 kB     initial commit
- 5.36 kB     Upload folder using huggingface_hub (#1)
- 189 Bytes   Upload folder using huggingface_hub (#1)
- 0 Bytes     Update README.md
- model.pt    15.5 GB    Upload folder using huggingface_hub (#1)
  Detected Pickle imports (23):
  - __builtin__.set
  - torch.FloatStorage
  - transformers_modules.internlm.internlm2-chat-7b-sft.907bec6e73eedfbb5b81a833b185057d1169eb3c.configuration_internlm2.InternLM2Config
  - transformers_modules.internlm.internlm2-chat-7b-sft.907bec6e73eedfbb5b81a833b185057d1169eb3c.modeling_internlm2.InternLM2RMSNorm
  - transformers.generation.configuration_utils.GenerationConfig
  - torch.BFloat16Storage
  - transformers_modules.internlm.internlm2-chat-7b-sft.907bec6e73eedfbb5b81a833b185057d1169eb3c.modeling_internlm2.InternLM2Model
  - torch.nn.modules.activation.SiLU
  - torch.nn.modules.sparse.Embedding
  - transformers_modules.internlm.internlm2-chat-7b-sft.907bec6e73eedfbb5b81a833b185057d1169eb3c.modeling_internlm2.InternLM2DecoderLayer
  - transformers_modules.internlm.internlm2-chat-7b-sft.907bec6e73eedfbb5b81a833b185057d1169eb3c.modeling_internlm2.InternLM2ForCausalLM
  - quanto.nn.qlinear.QLinear
  - torch.bfloat16
  - torch._utils._rebuild_tensor_v2
  - torch.device
  - transformers_modules.internlm.internlm2-chat-7b-sft.907bec6e73eedfbb5b81a833b185057d1169eb3c.modeling_internlm2.InternLM2DynamicNTKScalingRotaryEmbedding
  - collections.OrderedDict
  - quanto.tensor.qtype.qtype
  - torch.nn.modules.container.ModuleList
  - transformers_modules.internlm.internlm2-chat-7b-sft.907bec6e73eedfbb5b81a833b185057d1169eb3c.modeling_internlm2.InternLM2Attention
  - torch.int8
  - transformers_modules.internlm.internlm2-chat-7b-sft.907bec6e73eedfbb5b81a833b185057d1169eb3c.modeling_internlm2.InternLM2MLP
  - torch._utils._rebuild_parameter
  (A hedged sketch of loading this pickled checkpoint follows the file listing.)
- 1.03 kB     Upload folder using huggingface_hub (#1)
- 713 Bytes   Upload folder using huggingface_hub (#1)
- 8.81 kB     Upload folder using huggingface_hub (#1)
- 7.81 kB     Upload folder using huggingface_hub (#1)
- 5.79 MB     Upload folder using huggingface_hub (#1)
- 1.48 MB     Upload folder using huggingface_hub (#1)
- 37.3 kB     Upload folder using huggingface_hub (#1)
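
The pickle imports above indicate that model.pt is a full pickled nn.Module: a quanto-quantized InternLM2ForCausalLM built from the remote code of internlm/internlm2-chat-7b-sft at revision 907bec6e73eedfbb5b81a833b185057d1169eb3c, with QLinear layers and int8/bfloat16 storage. Below is a minimal loading sketch, not a documented recipe: the repo id and revision are inferred from those import paths, the prompt and generation settings are placeholders, and a compatible quanto package is assumed to be installed. Because the file stores a whole model object rather than a plain state_dict, unpickling it executes code from the checkpoint, so it should only be loaded if the repository is trusted.

```python
# Hedged loading sketch for model.pt (repo id/revision inferred from the pickle paths).
import torch
import quanto  # noqa: F401  -- provides quanto.nn.qlinear.QLinear and quanto.tensor.qtype.qtype
from transformers import AutoTokenizer
from transformers.dynamic_module_utils import get_class_from_dynamic_module

BASE_REPO = "internlm/internlm2-chat-7b-sft"
REVISION = "907bec6e73eedfbb5b81a833b185057d1169eb3c"  # commit hash embedded in the import paths

# Materialize the repo's remote code under transformers_modules.* so the pickled
# InternLM2 classes can be resolved when torch.load unpickles the model object.
get_class_from_dynamic_module(
    "modeling_internlm2.InternLM2ForCausalLM", BASE_REPO, revision=REVISION
)

# weights_only=False is required on newer torch versions (which default to True),
# because the file contains a full nn.Module, not just tensors. Only do this for
# checkpoints you trust.
model = torch.load("model.pt", map_location="cpu", weights_only=False)
model.eval()

tokenizer = AutoTokenizer.from_pretrained(BASE_REPO, revision=REVISION, trust_remote_code=True)
inputs = tokenizer("Hello, who are you?", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

If this succeeds, `model` should be an InternLM2ForCausalLM whose linear layers have been replaced by quanto QLinear modules holding int8 weights; in practice the quanto and transformers versions need to be close to the ones used when the checkpoint was saved, or unpickling may fail.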