Latest commit: Update README.md (c3ff41c, verified)
- 1.52 kB initial commit
- 324 MB Long-CLIP SAE 248 tokens
- 932 MB Long-CLIP SAE 248 tokens
Long-ViT-L-14-GmP-SAE-pickle.pt (17 detected pickle imports):
- "__builtin__.set"
- "torch._utils._rebuild_tensor_v2"
- "torch.nn.modules.conv.Conv2d",
- "torch.FloatStorage",
- "model.model_longclip.LayerNorm",
- "model.model_longclip.ResidualAttentionBlock",
- "model.model_longclip.CLIP",
- "collections.OrderedDict",
- "model.model_longclip.Transformer",
- "torch.nn.modules.container.Sequential",
- "torch.nn.modules.activation.MultiheadAttention",
- "model.model_longclip.QuickGELU",
- "torch.nn.modules.linear.NonDynamicallyQuantizableLinear",
- "torch.nn.modules.sparse.Embedding",
- "model.model_longclip.VisionTransformer",
- "torch._utils._rebuild_parameter",
- "torch.nn.modules.linear.Linear"
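The import list above is why the file is flagged: a full pickled model executes those classes when unpickled, so loading it runs arbitrary code from the repository. A minimal sketch of a safer loading pattern, assuming PyTorch 1.13+ (the `load_checkpoint` helper and `trust_repo` flag are illustrative, not part of this repo):

```python
import torch

def load_checkpoint(path: str, trust_repo: bool = False):
    """Try the restricted unpickler first; fall back to full unpickling
    only when the caller explicitly trusts the repository."""
    try:
        # weights_only=True refuses to execute arbitrary pickle imports,
        # so classes like model.model_longclip.CLIP cannot run code here.
        return torch.load(path, map_location="cpu", weights_only=True)
    except Exception:
        if not trust_repo:
            raise
        # Unsafe fallback: executes the pickled imports listed above.
        return torch.load(path, map_location="cpu", weights_only=False)
```

For a plain state dict (tensors in standard containers) the safe path succeeds; only a fully pickled `nn.Module` needs the untrusted fallback, which is one reason safetensors checkpoints are preferred.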
- 1.71 GB Original PyTorch model / pickle
- 2.4 kB Update README.md
- 782 Bytes Add "max_position_embeddings": 248
- 525 kB Long-CLIP SAE 248 tokens
- 1.71 GB Long-CLIP SAE 248 tokens
- 335 Bytes Long-CLIP SAE 248 tokens
- 389 Bytes Long-CLIP SAE 248 tokens
- 2.22 MB Long-CLIP SAE 248 tokens
- 907 Bytes Long-CLIP SAE 248 tokens
- 961 kB Long-CLIP SAE 248 tokens
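One commit above adds `"max_position_embeddings": 248` to the model config, matching Long-CLIP's extended 248-token text context (standard CLIP stops at 77). A sketch of what that fragment looks like in a Hugging Face-style text config; the surrounding keys are illustrative, not copied from the actual file:

```json
{
  "max_position_embeddings": 248
}
```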