Latest commit: "Remove library name" (28c3ed7, verified)

(file name not shown), 1.52 kB, commit "initial commit"
(file name not shown), 3.72 kB, commit "Remove library name"

aadb_charm.pth
Detected Pickle imports (28):
- "torch.float32",
- "torch.nn.modules.normalization.LayerNorm",
- "model.MlpHead",
- "torch.nn.modules.linear.Identity",
- "model.Transformer",
- "torch.nn.modules.conv.Conv2d",
- "transformers.models.dinov2.modeling_dinov2.Dinov2LayerScale",
- "torch._utils._rebuild_parameter",
- "torch._utils._rebuild_tensor_v2",
- "transformers.models.dinov2.modeling_dinov2.Dinov2Attention",
- "transformers.models.dinov2.modeling_dinov2.Dinov2SelfAttention",
- "transformers.models.dinov2.configuration_dinov2.Dinov2Config",
- "model.Model",
- "torch.FloatStorage",
- "model.PatchEmbeddings",
- "transformers.models.dinov2.modeling_dinov2.Dinov2Layer",
- "transformers.models.dinov2.modeling_dinov2.Dinov2Encoder",
- "collections.OrderedDict",
- "__builtin__.set",
- "torch.nn.modules.dropout.Dropout",
- "torch.nn.modules.container.ModuleList",
- "torch.nn.modules.activation.Sigmoid",
- "ml_collections.config_dict.config_dict.ConfigDict",
- "transformers.models.dinov2.modeling_dinov2.Dinov2SelfOutput",
- "torch._C._nn.gelu",
- "transformers.models.dinov2.modeling_dinov2.Dinov2MLP",
- "torch.nn.modules.linear.Linear",
- "transformers.activations.GELUActivation"
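A scan like the "Detected Pickle imports" list above can be produced without unpickling anything: a pickle stream records a GLOBAL opcode for every class or function the unpickler must import to rebuild the object, and listing those opcodes reveals what code loading the file would pull in. A minimal sketch (not the exact scanner used here), using a plain `OrderedDict` as a stand-in payload and pickle protocol 2, which is also `torch.save`'s default:

```python
import pickle
import pickletools
from collections import OrderedDict

# Stand-in payload: pickling any object embeds GLOBAL opcodes naming every
# class/function needed to rebuild it. Protocol 2 is used so globals appear
# as GLOBAL opcodes (protocols >= 4 use STACK_GLOBAL instead).
blob = pickle.dumps(OrderedDict(weight=[0.1, 0.2]), protocol=2)

# pickletools.genops reports each GLOBAL's "module name" pair space-joined;
# rejoin with "." to get the dotted import path the scan lists.
imports = sorted(
    arg.replace(" ", ".")
    for opcode, arg, _ in pickletools.genops(blob)
    if opcode.name == "GLOBAL"
)
# "collections.OrderedDict" -- one of the entries flagged above -- shows up here
```

This is static inspection only: `pickletools.genops` walks the opcode stream and never executes the pickle, which is what makes it safe to run on an untrusted checkpoint.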
(aadb_charm.pth: 86.3 MB, commit "Upload pretrained models using Charm")

baid_charm.pth
Detected Pickle imports (28): identical to the aadb_charm.pth list above
(baid_charm.pth: 86.3 MB, commit "Upload pretrained models using Charm")

dino_small_pos.pt
Detected Pickle imports (4):
- "collections.OrderedDict",
- "torch._utils._rebuild_tensor_v2",
- "torch._utils._rebuild_parameter",
- "torch.FloatStorage"
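Unlike the `*_charm.pth` files, this list contains only the four primitives pickle needs to rebuild raw tensors (the `OrderedDict` container, the tensor/parameter rebuild helpers, and float storage), which suggests the file holds plain weights rather than pickled model classes. Such a file should load under `torch.load`'s restricted mode. A sketch with a stand-in file; the tensor name and shape below are invented for illustration, not read from dino_small_pos.pt:

```python
import os
import tempfile

import torch

# Stand-in for a tensors-only checkpoint like dino_small_pos.pt
# (name "pos_embed" and the shape are assumptions, not the real contents).
path = os.path.join(tempfile.mkdtemp(), "tensors_only.pt")
torch.save({"pos_embed": torch.zeros(1, 16, 384)}, path)

# weights_only=True restricts unpickling to an allowlist of tensor-rebuilding
# primitives, so files whose scan shows only those four imports load without
# executing arbitrary pickle code.
ckpt = torch.load(path, weights_only=True)
</imports>
</imports>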
(dino_small_pos.pt: 2.11 MB, commit "Upload dino_small_pos.pt")

koniq10k_charm.pth
Detected Pickle imports (28): identical to the aadb_charm.pth list above
(koniq10k_charm.pth: 86.3 MB, commit "Upload pretrained models using Charm")

para_charm.pth
Detected Pickle imports (28): same as the aadb_charm.pth list above, but with "torch.nn.modules.activation.Softmax" in place of "torch.nn.modules.activation.Sigmoid"
(para_charm.pth: 86.3 MB, commit "Upload pretrained models using Charm")

spaq_charm.pth
Detected Pickle imports (28): identical to the aadb_charm.pth list above
(spaq_charm.pth: 86.3 MB, commit "Upload pretrained models using Charm")

tad66k_charm.pth
Detected Pickle imports (28): same as the aadb_charm.pth list above, but with "torch.nn.modules.activation.ReLU" in place of "torch.nn.modules.activation.Sigmoid"
(tad66k_charm.pth: 86.3 MB, commit "Upload pretrained models using Charm")
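The `*_charm.pth` scans flag application classes (model.Model, Dinov2Encoder, ConfigDict, and so on) because the entire module object was pickled, which is why loading these files requires trusting their source. One common remedy, sketched here on a stand-in model (this is not the Charm release process; converting to safetensors is the other usual fix): load once from a trusted copy, then re-save only the state dict, which passes the `weights_only` check on every later load.

```python
import os
import tempfile

import torch

d = tempfile.mkdtemp()

# Stand-in for a checkpoint that pickles the whole model object,
# including its class definitions.
model = torch.nn.Sequential(torch.nn.Linear(4, 4), torch.nn.Sigmoid())
torch.save(model, os.path.join(d, "full_model.pth"))

# One-time conversion: unpickle from a *trusted* copy, keep only the tensors.
full = torch.load(os.path.join(d, "full_model.pth"),
                  weights_only=False)  # executes pickle; trusted source only
torch.save(full.state_dict(), os.path.join(d, "weights.pth"))

# Every subsequent load can now refuse arbitrary pickle code.
state = torch.load(os.path.join(d, "weights.pth"), weights_only=True)
```

The converted file contains only tensors and an `OrderedDict`, so its import scan would shrink to the same four baseline entries seen for dino_small_pos.pt above.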