PerceptCLIP/PerceptCLIP_Memorability
Tags: English · memorability · computer_vision · perceptual_tasks · CLIP · LaMem · THINGS · arXiv:2503.13260
Files and versions (revision 1e38b55) · 1.23 GB
1 contributor · History: 5 commits
Latest commit: Amitz244 (verified) · Upload lamem_all_clip_Lora_16.0R_8.0alphaLora_32_batch_0.00005_lossmse_headmlp.pth · 1e38b55 · about 1 year ago
.gitattributes · Safe · 1.52 kB · initial commit · about 1 year ago
README.md · 2.31 kB · Create README.md · about 1 year ago
lamem_all_clip_Lora_16.0R_8.0alphaLora_32_batch_0.00005_lossmse_headmlp.pth · pickle
Detected Pickle imports (31): "transformers.models.clip.modeling_clip.CLIPEncoderLayer", "transformers.models.clip.modeling_clip.CLIPAttention", "transformers.models.clip.modeling_clip.CLIPVisionEmbeddings", "torch.LongStorage", "torch.nn.modules.conv.Conv2d", "peft.tuners.lora.config.LoraConfig", "peft.tuners.lora.config.LoraRuntimeConfig", "torch.nn.modules.container.ModuleDict", "transformers.activations.QuickGELUActivation", "__builtin__.set", "torch._utils._rebuild_tensor_v2", "transformers.models.clip.modeling_clip.CLIPEncoder", "peft.tuners.lora.model.LoraModel", "peft.tuners.lora.layer.Linear", "transformers.models.clip.modeling_clip.CLIPMLP", "peft.utils.peft_types.PeftType", "peft.peft_model.PeftModel", "BaseTask.lora_model", "BaseTask.MLP", "torch.nn.modules.dropout.Dropout", "transformers.models.clip.modeling_clip.CLIPVisionTransformer", "torch._utils._rebuild_parameter", "torch.nn.modules.linear.Linear", "torch.nn.modules.container.ParameterDict", "collections.OrderedDict", "torch.nn.modules.sparse.Embedding", "torch.nn.modules.container.ModuleList", "torch.nn.modules.normalization.LayerNorm", "transformers.models.clip.configuration_clip.CLIPVisionConfig", "torch.nn.modules.activation.ReLU", "torch.FloatStorage"
1.23 GB · Upload lamem_all_clip_Lora_16.0R_8.0alphaLora_32_batch_0.00005_lossmse_headmlp.pth · about 1 year ago
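Import lists like the one above come from scanning the pickle opcode stream for the globals it references, without executing the payload. A minimal sketch of that idea using Python's standard `pickletools` (a simplified heuristic, not Hugging Face's actual scanner — in particular, `STACK_GLOBAL` operands pulled from the memo are not resolved here):

```python
import pickle
import pickletools
import collections

def detect_pickle_imports(data: bytes) -> set[str]:
    """List the module.name globals a pickle stream references,
    by walking its opcodes instead of unpickling it."""
    imports: set[str] = set()
    recent_strings: list[str] = []  # operand history, for STACK_GLOBAL
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # Protocol <= 3: argument is "module name" in one string.
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif opcode.name == "STACK_GLOBAL":
            # Protocol >= 4: module and name were pushed as the two
            # preceding string operands (simplification: assumes they
            # appear literally, not via memo lookups).
            if len(recent_strings) >= 2:
                imports.add(f"{recent_strings[-2]}.{recent_strings[-1]}")
        if isinstance(arg, str):
            recent_strings.append(arg)
    return imports

# Pickling an OrderedDict references exactly one global.
blob = pickle.dumps(collections.OrderedDict(a=1))
print(detect_pickle_imports(blob))  # → {'collections.OrderedDict'}
```

Because unpickling resolves and calls these globals (including the repo-specific `BaseTask.lora_model` and `BaseTask.MLP` classes), a checkpoint like this should only be loaded from a trusted source, and loading it requires those custom classes to be importable in the loading environment.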