pair-B1.pt — Detected Pickle imports (38):
- "model_vssp.Encoder",
- "_codecs.encode",
- "torch.nn.modules.dropout.Dropout",
- "model_vssp.FeatureWiseAffine",
- "MHTransformer_dis.MlpTransformer",
- "MHTransformer_dis.Transformer",
- "open_clip.transformer.LayerNorm",
- "torch._utils._rebuild_parameter",
- "collections.OrderedDict",
- "torch.nn.modules.container.Sequential",
- "torch.nn.modules.activation.Softmax",
- "open_clip.transformer.Transformer",
- "torch.nn.modules.conv.Conv2d",
- "open_clip.transformer.VisionTransformer",
- "regex._regex.compile",
- "MHTransformer.TransformerLayer",
- "torch.nn.modules.activation.MultiheadAttention",
- "open_clip.tokenizer._clean_lower",
- "open_clip.model.CLIP",
- "torch.nn.modules.linear.NonDynamicallyQuantizableLinear",
- "model_vssp.Backbone",
- "torch.nn.modules.sparse.Embedding",
- "torch.FloatStorage",
- "open_clip.transformer.ResidualAttentionBlock",
- "__builtin__.set",
- "MHTransformer.MultiHeadAttention",
- "torch.nn.modules.linear.Linear",
- "MHTransformer_dis.TransformerLayer",
- "torch._utils._rebuild_tensor_v2",
- "torch.nn.functional.relu",
- "MHTransformer_dis.MultiHeadAttention",
- "torch.nn.modules.linear.Identity",
- "torch.nn.modules.normalization.LayerNorm",
- "MHTransformer.MlpTransformer",
- "MHTransformer.Transformer",
- "torch.nn.modules.container.ModuleList",
- "torch.nn.modules.activation.GELU",
- "open_clip.tokenizer.SimpleTokenizer"
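The list above is what a static pickle scan reports: the `module.attr` globals the checkpoint's pickle stream would import when loaded. You can reproduce such a scan yourself without unpickling (and therefore without executing) anything, using only the standard library. The sketch below assumes the modern zip-based `.pt` format, whose archive contains a `data.pkl` entry; the `STACK_GLOBAL` handling is a heuristic (tracking the last two pushed strings), not full opcode-stack emulation, which is enough for typical torch checkpoints.

```python
import pickletools
import zipfile

def list_pickle_imports(pt_path):
    """Enumerate the module.attr globals a zip-format .pt checkpoint's
    pickle stream would import, without unpickling (and thus without
    running) any of it."""
    with zipfile.ZipFile(pt_path) as zf:
        # Modern torch checkpoints store the object graph in */data.pkl.
        pkl_name = next(n for n in zf.namelist() if n.endswith("data.pkl"))
        data = zf.read(pkl_name)

    imports, strings = set(), []
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name in ("SHORT_BINUNICODE", "BINUNICODE",
                           "BINUNICODE8", "UNICODE"):
            strings.append(arg)            # remember pushed string constants
        elif opcode.name == "GLOBAL":
            # protocol <= 3: arg is "module attr"
            imports.add(arg.replace(" ", ".", 1))
        elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
            # protocol 4+: module and attr were pushed as the two most
            # recent strings (heuristic, not full stack emulation)
            imports.add(f"{strings[-2]}.{strings[-1]}")
    return sorted(imports)
```

Any entry outside a small allow-list of tensor-rebuilding helpers (e.g. `torch._utils._rebuild_tensor_v2`, `collections.OrderedDict`) is a reason not to load the file with plain `torch.load`, since unpickling will import and call those globals.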
pair-B2.pt (695 MB) — same 38 detected Pickle imports as pair-B1.pt.
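The detected globals also show what unpickling these checkpoints would require: they reference entire module classes (`model_vssp.Encoder`, `MHTransformer.Transformer`, `open_clip.model.CLIP`, …), not just tensors, so loading only succeeds in an environment where every listed top-level package is importable. Grouping the names by their first component makes that dependency set explicit; the sketch below uses an abbreviated sample of the 38 entries:

```python
from collections import defaultdict

# A few of the 38 globals detected in pair-B1.pt (abbreviated sample)
detected = [
    "model_vssp.Encoder",
    "model_vssp.Backbone",
    "MHTransformer.Transformer",
    "MHTransformer_dis.Transformer",
    "open_clip.model.CLIP",
    "torch._utils._rebuild_tensor_v2",
    "collections.OrderedDict",
]

def required_packages(globals_list):
    """Group fully qualified pickle globals by their top-level module:
    each key names a package that must be importable before unpickling."""
    groups = defaultdict(list)
    for qualname in globals_list:
        groups[qualname.split(".", 1)[0]].append(qualname)
    return dict(groups)

print(sorted(required_packages(detected)))
# → ['MHTransformer', 'MHTransformer_dis', 'collections',
#    'model_vssp', 'open_clip', 'torch']
```

For this repository that means the custom sources (`model_vssp`, `MHTransformer`, `MHTransformer_dis`) must be on the path and `open_clip` installed, in addition to `torch` itself, before either `.pt` file can be deserialized.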