Latest commit: Update README.md (3ca83cf, verified)

Files and versions (size, last commit message):
- 1.52 kB, initial commit
- 5.38 kB, Update README.md
- 0 Bytes, Update README.md
- model.pt: 5.6 GB, Upload folder using huggingface_hub (#1)
- 1.04 kB, Upload folder using huggingface_hub (#1)
- 441 Bytes, Upload folder using huggingface_hub (#1)
- 2.11 MB, Upload folder using huggingface_hub (#1)
- 4.85 kB, Upload folder using huggingface_hub (#1)

Detected Pickle imports in model.pt (23):
- "torch.nn.modules.normalization.LayerNorm",
- "torch._utils._rebuild_tensor_v2",
- "torch.BFloat16Storage",
- "transformers_modules.llmware.bling-stable-lm-3b-4e1t-v0.1b1d3bf057a6e98c875a55aa6062cdfadc32ba8e.modeling_stablelm_epoch.Attention",
- "torch._utils._rebuild_parameter",
- "transformers_modules.llmware.bling-stable-lm-3b-4e1t-v0.1b1d3bf057a6e98c875a55aa6062cdfadc32ba8e.modeling_stablelm_epoch.StableLMEpochForCausalLM",
- "__builtin__.set",
- "collections.OrderedDict",
- "torch.nn.modules.container.ModuleList",
- "transformers.generation.configuration_utils.GenerationConfig",
- "transformers_modules.llmware.bling-stable-lm-3b-4e1t-v0.1b1d3bf057a6e98c875a55aa6062cdfadc32ba8e.configuration_stablelm_epoch.StableLMEpochConfig",
- "transformers_modules.llmware.bling-stable-lm-3b-4e1t-v0.1b1d3bf057a6e98c875a55aa6062cdfadc32ba8e.modeling_stablelm_epoch.DecoderLayer",
- "transformers_modules.llmware.bling-stable-lm-3b-4e1t-v0.1b1d3bf057a6e98c875a55aa6062cdfadc32ba8e.modeling_stablelm_epoch.RotaryEmbedding",
- "quanto.tensor.qtype.qtype",
- "torch.float8_e4m3fn",
- "torch.nn.modules.activation.SiLU",
- "torch.device",
- "transformers_modules.llmware.bling-stable-lm-3b-4e1t-v0.1b1d3bf057a6e98c875a55aa6062cdfadc32ba8e.modeling_stablelm_epoch.StableLMEpochModel",
- "torch.nn.modules.sparse.Embedding",
- "quanto.nn.qlinear.QLinear",
- "torch.bfloat16",
- "torch.FloatStorage",
- "transformers_modules.llmware.bling-stable-lm-3b-4e1t-v0.1b1d3bf057a6e98c875a55aa6062cdfadc32ba8e.modeling_stablelm_epoch.MLP"