Detected Pickle imports in best.pt (35):
- "ultralytics.nn.modules.block.Bottleneck",
- "numpy.dtype",
- "torch.Size",
- "torch._utils._rebuild_parameter",
- "torch.nn.modules.container.ModuleList",
- "torch.nn.modules.pooling.MaxPool2d",
- "ultralytics.nn.modules.conv.Concat",
- "__builtin__.set",
- "ultralytics.nn.tasks.SegmentationModel",
- "_codecs.encode",
- "torch.nn.modules.conv.ConvTranspose2d",
- "ultralytics.nn.modules.block.Proto",
- "torch.nn.modules.activation.SiLU",
- "torch.device",
- "torch.nn.modules.conv.Conv2d",
- "ultralytics.nn.modules.conv.Conv",
- "ultralytics.utils.loss.DFLoss",
- "ultralytics.nn.modules.head.Segment",
- "torch.nn.modules.loss.BCEWithLogitsLoss",
- "torch.nn.modules.container.Sequential",
- "torch.HalfStorage",
- "ultralytics.utils.loss.v8SegmentationLoss",
- "ultralytics.utils.tal.TaskAlignedAssigner",
- "numpy.core.multiarray.scalar",
- "ultralytics.utils.IterableSimpleNamespace",
- "torch.nn.modules.batchnorm.BatchNorm2d",
- "torch.nn.modules.upsampling.Upsample",
- "torch.LongStorage",
- "ultralytics.nn.modules.block.DFL",
- "ultralytics.nn.modules.block.C2f",
- "torch.FloatStorage",
- "ultralytics.nn.modules.block.SPPF",
- "torch._utils._rebuild_tensor_v2",
- "collections.OrderedDict",
- "ultralytics.utils.loss.BboxLoss"
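A list like the one above can be produced statically, without ever executing the pickle: the scanner only needs to walk the pickle's opcode stream and collect the `GLOBAL` / `STACK_GLOBAL` references, which name every `module.attribute` the file would import on load. The sketch below shows the idea using only the standard library; `detect_pickle_imports` is an illustrative helper name, not an actual Hub API, and the `STACK_GLOBAL` handling is simplified (it assumes the module and name strings are pushed as literals right before the opcode, which holds for typical pickles such as PyTorch checkpoints).

```python
import pickletools

def detect_pickle_imports(data: bytes) -> set[str]:
    """Statically list the module.attribute pairs a pickle would import.

    Walks the opcode stream without executing the pickle, collecting
    GLOBAL (protocols 0-3) and STACK_GLOBAL (protocol 4+) references.
    """
    # Drop bookkeeping opcodes so the two string pushes sit directly
    # before their STACK_GLOBAL (simplified assumption, see lead-in).
    ops = [(op.name, arg) for op, arg, _pos in pickletools.genops(data)
           if op.name not in ("MEMOIZE", "FRAME")]
    imports = set()
    for i, (name, arg) in enumerate(ops):
        if name == "GLOBAL":
            # arg is "module attribute", space separated
            module, attr = arg.split(" ", 1)
            imports.add(f"{module}.{attr}")
        elif name == "STACK_GLOBAL":
            # the two preceding ops pushed the module and attribute strings
            imports.add(f"{ops[i - 2][1]}.{ops[i - 1][1]}")
    return imports

if __name__ == "__main__":
    # Demo on a small pickle that references collections.OrderedDict.
    import pickle
    from collections import OrderedDict
    data = pickle.dumps(OrderedDict(a=1))
    print(sorted(detect_pickle_imports(data)))  # ['collections.OrderedDict']
```

This mirrors why the imports above are only a warning, not proof of safety: the scan sees which callables the pickle references, but unpickling still executes them, so an untrusted `.pt` file should be loaded in a sandbox or with a restricted unpickler rather than trusted on the strength of a clean-looking import list.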