Upload 9 files (commit 6a1f80c, verified)
google_gemma-2b_grad_0.015lr_60.0pct_2500iterations.pt: detected pickle imports (22):
- "transformers.models.gemma.modeling_gemma.GemmaModel",
- "transformers.activations.GELUActivation",
- "transformers.generation.configuration_utils.GenerationConfig",
- "transformers.modeling_rope_utils._compute_default_rope_parameters",
- "transformers.models.gemma.modeling_gemma.GemmaMLP",
- "torch.nn.modules.container.ModuleList",
- "collections.OrderedDict",
- "torch._utils._rebuild_parameter",
- "__builtin__.set",
- "transformers.generation.configuration_utils.CompileConfig",
- "transformers.models.gemma.configuration_gemma.GemmaConfig",
- "torch._utils._rebuild_tensor_v2",
- "torch.float32",
- "torch.nn.modules.linear.Linear",
- "torch._C._nn.gelu",
- "transformers.models.gemma.modeling_gemma.GemmaRMSNorm",
- "torch.FloatStorage",
- "torch.nn.modules.sparse.Embedding",
- "transformers.models.gemma.modeling_gemma.GemmaForCausalLM",
- "transformers.models.gemma.modeling_gemma.GemmaDecoderLayer",
- "transformers.models.gemma.modeling_gemma.GemmaAttention",
- "transformers.models.gemma.modeling_gemma.GemmaRotaryEmbedding"