ilessio-aiflowlab committed
Commit 3cad919 · verified · 1 Parent(s): c4c80fa

Upload folder using huggingface_hub
.gitattributes CHANGED
@@ -36,3 +36,6 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  exports/project_fenris_v0.1.0_fp32.engine filter=lfs diff=lfs merge=lfs -text
  exports/project_fenris_v0.1.0_fp16.engine filter=lfs diff=lfs merge=lfs -text
  exports/project_fenris_v0.1.0.onnx.data filter=lfs diff=lfs merge=lfs -text
+ onnx/fenris_v0.1.0.onnx.data filter=lfs diff=lfs merge=lfs -text
+ tensorrt/fenris_v0.1.0_fp16.engine filter=lfs diff=lfs merge=lfs -text
+ tensorrt/fenris_v0.1.0_fp32.engine filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,130 @@
---
tags:
- robotics
- anima
- fenris
- infrared
- object-detection
- yolo
- irstd
- robot-flow-labs
library_name: pytorch
pipeline_tag: object-detection
license: apache-2.0
---

# FENRIS -- AA-YOLO: Anomaly-Aware YOLO for Infrared Small Target Detection

Part of the [ANIMA Perception Suite](https://robotflowlabs.com) by Robot Flow Labs.

## Paper

**AA-YOLO: Anomaly-Aware YOLO for Infrared Small Target Detection**
arXiv: [2510.04741](https://arxiv.org/abs/2510.04741)

## Architecture

AA-YOLO introduces an Anomaly-Aware Detection Head (AADH) that reformulates IR target detection as a statistical anomaly detection problem. The AADH adds only 0.2M parameters and 5% FLOPs to any YOLO backbone. Key components:

1. **Spatial Filtering Block** -- Two convolutional layers (3x3 kernels) with BatchNorm/ReLU, compressing to C=8 channels
2. **Statistical Anomaly Test** -- Multivariate hypothesis test computing `-ln(F_mu2)` anomaly scores
3. **Scaled Sigmoid Activation** -- Temperature-scaled sigmoid (alpha=0.001) for a sharp anomaly response
4. **Backbone-agnostic** -- Drop-in head for YOLOv5/v7/v9 backbones

This checkpoint uses the **AA-YOLOv7t** backbone (YOLOv8n architecture as a v7-tiny proxy).
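The temperature-scaled sigmoid in component 3 can be sketched as follows. This is a minimal illustration using the `alpha=0.001` value listed above, not the released implementation; the exact parameterisation in the paper may differ.

```python
import numpy as np

def scaled_sigmoid(scores: np.ndarray, alpha: float = 0.001) -> np.ndarray:
    """Temperature-scaled sigmoid: dividing by a small alpha sharpens the
    transition, so anomaly scores map almost step-like into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-scores / alpha))

# Scores slightly above zero saturate toward 1, slightly below toward 0.
probs = scaled_sigmoid(np.array([-0.01, 0.0, 0.01]))
```

With `alpha` this small, the head behaves close to a hard threshold at score 0 while remaining differentiable for training.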

## Training Summary

| Parameter | Value |
|-----------|-------|
| Dataset | SIRST (427 images: 256 train / 85 val / 86 test) |
| Backbone | AA-YOLOv7t (YOLOv8n + AADH) |
| Epochs | 100 |
| Optimizer | SGD (lr=0.01, momentum=0.937, weight_decay=0.0005) |
| Scheduler | Cosine annealing, 3-epoch warmup |
| Precision | FP16 mixed precision |
| Best val_loss | 1.0874 (epoch 73) |
| Final train_loss | 1.0715 |
| Backbone freeze | First 10 epochs |
| Hardware | NVIDIA L4 (23GB) |
| Training time | ~2 min (100 epochs) |

### Loss Curve

- Epoch 0: train=2.60, val=2.50
- Epoch 25: train=1.14, val=1.15
- Epoch 50: train=1.09, val=1.10
- Epoch 73 (best): train=1.08, val=1.09
- Epoch 99: train=1.07, val=1.09

## Exported Formats

| Format | File | Size | Use Case |
|--------|------|------|----------|
| PyTorch (.pth) | `pytorch/fenris_v0.1.0.pth` | 12 MB | Training, fine-tuning |
| SafeTensors | `pytorch/fenris_v0.1.0.safetensors` | 12 MB | Fast, safe loading |
| ONNX (opset 18) | `onnx/fenris_v0.1.0_allinone.onnx` | 12.3 MB | Cross-platform inference |
| TensorRT FP32 | `tensorrt/fenris_v0.1.0_fp32.engine` | ~17 MB | Full-precision inference |
| TensorRT FP16 | `tensorrt/fenris_v0.1.0_fp16.engine` | ~8 MB | Edge deployment (Jetson/L4) |
| Training checkpoint | `checkpoints/best.pth` | 24 MB | Resume training (includes optimizer state) |

## Quick Start

```python
import torch
from safetensors.torch import load_file

# Load SafeTensors (recommended)
state_dict = load_file("pytorch/fenris_v0.1.0.safetensors")

# Or load the PyTorch weights
state_dict = torch.load("pytorch/fenris_v0.1.0.pth", map_location="cpu")

# Either way, a matching AA-YOLO model definition is required to
# consume the weights via model.load_state_dict(state_dict).
```

### ONNX Inference

```python
import onnxruntime as ort
import numpy as np

session = ort.InferenceSession("onnx/fenris_v0.1.0_allinone.onnx")
input_data = np.random.randn(1, 3, 256, 256).astype(np.float32)
outputs = session.run(None, {"ir_input": input_data})
```
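The session above expects a `1x3x256x256` float32 tensor. A minimal, dependency-free preprocessing sketch for a single-channel IR frame is shown below; the 0-255 grayscale input and the [0, 1] normalization are assumptions, not taken from the repository.

```python
import numpy as np

def preprocess_ir_frame(frame: np.ndarray) -> np.ndarray:
    """Turn an HxW uint8 IR frame into the 1x3x256x256 float32 tensor the
    exported ONNX model expects. Resizing is nearest-neighbor via index
    sampling to keep the sketch free of image-library dependencies."""
    h, w = frame.shape
    ys = (np.arange(256) * h // 256).clip(0, h - 1)
    xs = (np.arange(256) * w // 256).clip(0, w - 1)
    resized = frame[np.ix_(ys, xs)].astype(np.float32) / 255.0  # 256x256 in [0, 1]
    chw = np.repeat(resized[None, :, :], 3, axis=0)             # replicate to 3 channels
    return chw[None, ...]                                       # add batch dim

tensor = preprocess_ir_frame(np.random.randint(0, 256, (480, 640), dtype=np.uint8))
```

In practice you would resize with OpenCV or PIL and match whatever normalization the training pipeline used.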

### TensorRT Inference

```python
import tensorrt as trt

runtime = trt.Runtime(trt.Logger(trt.Logger.WARNING))
with open("tensorrt/fenris_v0.1.0_fp16.engine", "rb") as f:
    engine = runtime.deserialize_cuda_engine(f.read())
```

## Datasets

- **SIRST** (427 images, 256x256) -- Primary benchmark
- **IRSTD-1k** (1,000 images, 512x512) -- Extended evaluation
- **VEDAI** (1,200 images, IR satellite) -- Cross-domain evaluation

## Target Metrics (Paper)

| Metric | Target | Dataset |
|--------|--------|---------|
| F1 | >= 98.1% | SIRST |
| AP_s | >= 79.8% | SIRST |
| AP | >= 72.0% | IRSTD-1k |
| AP | >= 65.0% | VEDAI |

## ANIMA Module

- **Module name:** fenris
- **Category:** perception.foundation
- **Container:** `ghcr.io/robotflow-labs/anima-fenris:0.1.0`
- **API port:** 8080
- **Health:** `/health`, `/ready`, `/info`

## License

Apache 2.0 -- Robot Flow Labs / AIFLOW LABS LIMITED
TRAINING_REPORT.md ADDED
@@ -0,0 +1,86 @@
# TRAINING REPORT -- FENRIS v0.1.0

## Summary

| Field | Value |
|-------|-------|
| Module | FENRIS (AA-YOLO) |
| Version | 0.1.0 |
| Date | 2026-04-02 |
| Status | Training Complete |

## Configuration

| Parameter | Value |
|-----------|-------|
| Model | AA-YOLOv7t (YOLOv8n backbone + AADH heads) |
| Dataset | SIRST (427 images: 256 train / 85 val / 86 test) |
| Resolution | 256x256 |
| Epochs | 100 |
| Optimizer | SGD (lr=0.01, momentum=0.937, wd=0.0005) |
| Scheduler | Cosine annealing, 3-epoch linear warmup |
| Precision | FP16 mixed precision (CUDA) |
| Gradient clip | max_norm=10.0 |
| Batch size | 16 |
| Backbone freeze | First 10 epochs |
| Augmentation | Mosaic, HSV (h=0.015, v=0.4), horizontal flip (0.5) |
| Seed | 42 |
| Pretrained | COCO weights |
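The schedule in the table (3-epoch linear warmup into cosine annealing over 100 epochs) can be sketched as below. This is an illustration of the stated hyperparameters only; the actual per-parameter-group behavior (backbone vs. head) in the training logs may differ.

```python
import math

def lr_at_epoch(epoch: int, base_lr: float = 0.01,
                warmup_epochs: int = 3, total_epochs: int = 100) -> float:
    """Linear warmup toward base_lr, then cosine annealing toward 0."""
    if epoch < warmup_epochs:
        return base_lr * (epoch + 1) / warmup_epochs
    progress = (epoch - warmup_epochs) / (total_epochs - warmup_epochs)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

schedule = [lr_at_epoch(e) for e in range(100)]
```

The learning rate peaks at `base_lr` when warmup ends and decays smoothly to near zero by the final epoch.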

## Hardware

| Resource | Spec |
|----------|------|
| GPU | NVIDIA L4 (23GB VRAM) |
| CUDA | 12.4 |
| PyTorch | 2.6.0 |
| Training time | ~2 minutes (100 epochs) |

## Results

### Training Metrics

| Epoch | train_loss | val_loss | lr_backbone | lr_head |
|-------|------------|----------|-------------|---------|
| 0 | 2.604 | 2.501 | 0.001 | 0.010 |
| 10 | 1.167 | 1.184 | 0.010 | 0.010 |
| 25 | 1.110 | 1.118 | 0.009 | 0.009 |
| 50 | 1.094 | 1.102 | 0.005 | 0.005 |
| 73 (best) | 1.077 | 1.087 | 0.000 | 0.002 |
| 99 (last) | 1.072 | 1.090 | 0.000 | 0.000 |

### Loss Decomposition (Best Epoch 73)

| Component | Train | Val |
|-----------|-------|-----|
| obj_loss | 0.0103 | 0.0101 |
| cls_loss | 0.0103 | 0.0102 |
| reg_loss | 1.0567 | 1.0670 |
| **total** | **1.077** | **1.087** |

## Export Pipeline

| Format | File | Size | Status |
|--------|------|------|--------|
| PyTorch (.pth) | fenris_v0.1.0.pth | 12.0 MB | Done |
| SafeTensors | fenris_v0.1.0.safetensors | 11.9 MB | Done |
| ONNX (opset 18) | fenris_v0.1.0_allinone.onnx | 12.3 MB | Done |
| TensorRT FP32 | fenris_v0.1.0_fp32.engine | ~17 MB | Done |
| TensorRT FP16 | fenris_v0.1.0_fp16.engine | ~8 MB | Done |

## Artifact Locations

```
/mnt/artifacts-datai/checkpoints/project_fenris/best.pt  -- best checkpoint (epoch 73)
/mnt/artifacts-datai/checkpoints/project_fenris/last.pt  -- last checkpoint (epoch 99)
/mnt/artifacts-datai/models/project_fenris/              -- pth + safetensors
/mnt/artifacts-datai/exports/project_fenris/             -- ONNX + TRT engines
/mnt/artifacts-datai/logs/project_fenris/                -- training logs
```

## Next Steps

1. Evaluate on the TEST split (86 images) -- compute F1 and AP_s vs. paper targets
2. Train additional backbones (YOLOv5n, YOLOv5s, YOLOv9t)
3. Download VEDAI + IRSTD-1k for multi-dataset evaluation
4. Frugal learning experiment (25 images)
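For step 1, the F1 target can be checked with a standard precision/recall computation over matched detections. This is a generic sketch; the matching criterion (e.g. an IoU or pixel-distance threshold for counting a detection as a true positive) is an assumption and must follow the paper's protocol.

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 from detection counts: harmonic mean of precision and recall."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```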
anima_module.yaml CHANGED
@@ -1,57 +1,37 @@
- schema_version: "1.0"
-
  module:
    name: fenris
    version: 0.1.0
-   display_name: "FENRIS — AA-YOLO Anomaly-Aware Infrared Small Target Detection"
-   description: >
-     Anomaly-aware YOLO for infrared small target detection with frugal
-     field adaptation (25 images minimum). Deployable on edge hardware.
-   category: perception.detection_ir
-   license: Apache-2.0
-   paper: "https://arxiv.org/abs/2510.04741"
-
  capabilities:
    provides:
-     - type: detection
-       subtype: infrared_small_target
-     - type: anomaly_detection
-       subtype: thermal_anomaly
-
  interface:
-   inputs:
-     - name: ir_frame
-       ros2_type: sensor_msgs/msg/Image
-   outputs:
-     - name: detections
-       ros2_type: std_msgs/msg/String
-     - name: anomaly_map
-       ros2_type: sensor_msgs/msg/Image
-
  hardware:
    platforms:
-     - name: linux_x86
-       backends: [onnxruntime_cuda, onnxruntime_cpu]
-     - name: darwin_arm64
-       backends: [onnxruntime_cpu, mlx]
-
  performance:
    profiles:
-     - platform: linux_x86
-       model: fenris-aayolo
-       backend: onnxruntime_cuda
-       fps: 30
-       latency_p50_ms: 33
-
  safety:
    failure_mode: returns_empty
-   timeout_ms: 500
    health_topic: /anima/fenris/health
-
  composability:
    pairs_well_with: []
    conflicts_with: []
-
  container:
    image: ghcr.io/robotflow-labs/anima-fenris:0.1.0
    port: 8080

+ schema_version: '1.0'
  module:
    name: fenris
    version: 0.1.0
+   display_name: FENRIS
+   description: FENRIS — Foundation perception module for the ANIMA robotics stack
+   category: perception.foundation
+   license: MIT
  capabilities:
    provides:
+     - type: perception
+       subtype: foundation
  interface:
+   inputs: []
+   outputs: []
  hardware:
    platforms:
+     - name: x86_64
+       backends:
+         - cuda
  performance:
    profiles:
+     - platform: x86_64_cuda
+       model: fenris-v1
+       backend: cuda
+       fps: 10
+       latency_p50_ms: 100
  safety:
    failure_mode: returns_empty
+   timeout_ms: 5000
    health_topic: /anima/fenris/health
  composability:
    pairs_well_with: []
    conflicts_with: []
  container:
    image: ghcr.io/robotflow-labs/anima-fenris:0.1.0
    port: 8080
checkpoints/best.pth ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:633c88da1177b276b24229d2515b85f03938f3567397b87743244b86dd07603e
size 25009139
configs/base.yaml ADDED
@@ -0,0 +1,16 @@
project:
  name: "fenris"
  version: "0.1.0"
  paper: "AA-YOLO: Anomaly-Aware YOLO for Infrared Small Target Detection"

paths:
  data_root: "data"
  outputs_root: "outputs"
  weights_root: "weights"
  shared_datasets_root: "/mnt/forge-data/datasets"

seed: 42

defaults:
  model: "configs/model/aa_yolov7t.yaml"
  train: "configs/train/default.yaml"
  deploy: "configs/deploy/jetson.yaml"
configs/cuda_server.yaml ADDED
@@ -0,0 +1,9 @@
training:
  device: "cuda"
  precision: "fp16"
  batch_size: 8
  num_workers: 4
  gradient_clip_norm: 10.0
  checkpoint_interval: 1
  val_interval: 1
  log_interval: 10
configs/training.yaml ADDED
@@ -0,0 +1,31 @@
training:
  dataset: "sirst"
  resolution: 256
  optimizer: "sgd"
  lr: 0.01
  momentum: 0.937
  weight_decay: 0.0005
  scheduler: "cosine"
  warmup_epochs: 3
  epochs: 100
  batch_size: 16
  freeze_backbone_epochs: 10
  precision: "fp16"
  device: "auto"
  gradient_clip_norm: 10.0
  num_workers: 4
  checkpoint_interval: 1
  val_interval: 1
  log_interval: 10
  pretrain: "coco"
  augmentation:
    mosaic: true
    mixup: 0.0
    hsv_h: 0.015
    hsv_s: 0.0
    hsv_v: 0.4
    flip_lr: 0.5
  frugal_learning:
    enabled: false
    num_images: 25
    strategy: "diverse"
logs/training_history.json ADDED
@@ -0,0 +1,1402 @@
[
  {
    "train_loss": 2.6039695739746094,
    "train_obj": 0.7111365795135498,
    "train_cls": 0.7091652899980545,
    "train_reg": 1.1836676597595215,
    "train_duration_s": 2.1187828250112943,
    "val_loss": 2.5012197494506836,
    "val_obj": 0.6826423108577728,
    "val_cls": 0.6797782778739929,
    "val_reg": 1.1387991905212402,
    "val_duration_s": 0.7463133980054408,
    "lr_backbone": 0.0009997779521645793,
    "lr_head": 0.009997557473810372
  },
  {
    "train_loss": 2.4343828558921814,
    "train_obj": 0.630094513297081,
    "train_cls": 0.6232971101999283,
    "train_reg": 1.180991291999817,
    "train_duration_s": 0.5813549190061167,
    "val_loss": 2.450488567352295,
    "val_obj": 0.6570659875869751,
    "val_cls": 0.6556430757045746,
    "val_reg": 1.1377795338630676,
    "val_duration_s": 0.2805283479974605,
    "lr_backbone": 0.0009991120277927223,
    "lr_head": 0.009990232305719946
  },
  {
    "train_loss": 2.194762885570526,
    "train_obj": 0.514272429049015,
    "train_cls": 0.503182590007782,
    "train_reg": 1.1773078739643097,
    "train_duration_s": 0.5663400460034609,
    "val_loss": 2.369229793548584,
    "val_obj": 0.6161900460720062,
    "val_cls": 0.6164024174213409,
    "val_reg": 1.136637270450592,
    "val_duration_s": 0.253422261972446,
    "lr_backbone": 0.000998002884071386,
    "lr_head": 0.009978031724785247
  },
  {
    "train_loss": 1.963762640953064,
    "train_obj": 0.40183231979608536,
    "train_cls": 0.39030054211616516,
    "train_reg": 1.1716298162937164,
    "train_duration_s": 0.5694993390352465,
    "val_loss": 2.2533944845199585,
    "val_obj": 0.5588586926460266,
    "val_cls": 0.5596922636032104,
    "val_reg": 1.1348434686660767,
    "val_duration_s": 0.25666617101524025,
    "lr_backbone": 0.000996451615591515,
    "lr_head": 0.009960967771506667
  },
  {
    "train_loss": 1.7730705738067627,
    "train_obj": 0.3090578243136406,
    "train_cls": 0.29907427728176117,
    "train_reg": 1.1649384796619415,
    "train_duration_s": 0.6057713080081157,
    "val_loss": 2.1021698713302612,
    "val_obj": 0.48453380167484283,
    "val_cls": 0.4845801740884781,
    "val_reg": 1.1330559849739075,
    "val_duration_s": 0.2614099200000055,
    "lr_backbone": 0.000994459753267812,
    "lr_head": 0.009939057285945933
  },
  {
    "train_loss": 1.6270982921123505,
    "train_obj": 0.23736201226711273,
    "train_cls": 0.22961698845028877,
    "train_reg": 1.1601192653179169,
    "train_duration_s": 0.5821349939797074,
    "val_loss": 1.9252700209617615,
    "val_obj": 0.3977854996919632,
    "val_cls": 0.3976485878229141,
    "val_reg": 1.1298359632492065,
    "val_duration_s": 0.26314096996793523,
    "lr_backbone": 0.0009920292628279098,
    "lr_head": 0.009912321891107011
  },
  {
    "train_loss": 1.5165069699287415,
    "train_obj": 0.18437671288847923,
    "train_cls": 0.1786254420876503,
    "train_reg": 1.1535048186779022,
    "train_duration_s": 0.5722851819591597,
    "val_loss": 1.7437663674354553,
    "val_obj": 0.3092353790998459,
    "val_cls": 0.30843231081962585,
    "val_reg": 1.1260986328125,
    "val_duration_s": 0.3186280700028874,
    "lr_backbone": 0.000989162542872436,
    "lr_head": 0.009880787971596802
  },
  {
    "train_loss": 1.435431033372879,
    "train_obj": 0.14593731611967087,
    "train_cls": 0.14189112931489944,
    "train_reg": 1.1476025879383087,
    "train_duration_s": 0.5517169729573652,
    "val_loss": 1.5804829597473145,
    "val_obj": 0.2306409776210785,
    "val_cls": 0.22930587828159332,
    "val_reg": 1.1205361485481262,
    "val_duration_s": 0.3158288359991275,
    "lr_backbone": 0.0009858624225078836,
    "lr_head": 0.009844486647586725
  },
  {
    "train_loss": 1.376274436712265,
    "train_obj": 0.11804588697850704,
    "train_cls": 0.11540699563920498,
    "train_reg": 1.1428215503692627,
    "train_duration_s": 0.5548694359604269,
    "val_loss": 1.4541865587234497,
    "val_obj": 0.17012722045183182,
    "val_cls": 0.1688792034983635,
    "val_reg": 1.1151801347732544,
    "val_duration_s": 0.292415616044309,
    "lr_backbone": 0.000982132158554624,
    "lr_head": 0.00980345374410087
  },
  {
    "train_loss": 1.3296509683132172,
    "train_obj": 0.09697667136788368,
    "train_cls": 0.09518053196370602,
    "train_reg": 1.137493759393692,
    "train_duration_s": 0.6233892000163905,
    "val_loss": 1.3660516738891602,
    "val_obj": 0.12791810184717178,
    "val_cls": 0.12692993134260178,
    "val_reg": 1.111203670501709,
    "val_duration_s": 0.2539145110058598,
    "lr_backbone": 0.0009779754323328187,
    "lr_head": 0.009757729755661011
  },
  {
    "train_loss": 1.293496310710907,
    "train_obj": 0.08113010600209236,
    "train_cls": 0.08010029047727585,
    "train_reg": 1.1322659254074097,
    "train_duration_s": 0.7129356930381618,
    "val_loss": 1.3050620555877686,
    "val_obj": 0.1001579686999321,
    "val_cls": 0.09942160546779633,
    "val_reg": 1.1054824590682983,
    "val_duration_s": 0.25578066502930596,
    "lr_backbone": 0.0009733963460294011,
    "lr_head": 0.009707359806323416
  },
  {
    "train_loss": 1.2654916942119598,
    "train_obj": 0.0688808262348175,
    "train_cls": 0.06825098022818565,
    "train_reg": 1.1283598840236664,
    "train_duration_s": 0.6933196820318699,
    "val_loss": 1.2663345336914062,
    "val_obj": 0.08232547342777252,
    "val_cls": 0.0813278779387474,
    "val_reg": 1.1026811599731445,
    "val_duration_s": 0.2591131550143473,
    "lr_backbone": 0.0009683994186497127,
    "lr_head": 0.009652393605146845
  },
  {
    "train_loss": 1.2435141503810883,
    "train_obj": 0.05926228687167168,
    "train_cls": 0.058954983949661255,
    "train_reg": 1.1252968907356262,
    "train_duration_s": 0.7026166480500251,
    "val_loss": 1.2399787306785583,
    "val_obj": 0.07086878269910812,
    "val_cls": 0.06995890662074089,
    "val_reg": 1.099151074886322,
    "val_duration_s": 0.2751111130346544,
    "lr_backbone": 0.000962989581557791,
    "lr_head": 0.009592885397135705
  },
  {
    "train_loss": 1.2249413430690765,
    "train_obj": 0.051921211183071136,
    "train_cls": 0.05172533541917801,
    "train_reg": 1.1212947964668274,
    "train_duration_s": 0.6468698550015688,
    "val_loss": 1.2205684185028076,
    "val_obj": 0.06252048537135124,
    "val_cls": 0.06160087697207928,
    "val_reg": 1.0964470505714417,
    "val_duration_s": 0.26240126602351665,
    "lr_backbone": 0.0009571721736097084,
    "lr_head": 0.009528893909706796
  },
  {
    "train_loss": 1.209276795387268,
    "train_obj": 0.04592496156692505,
    "train_cls": 0.04580479208379984,
    "train_reg": 1.1175470352172852,
    "train_duration_s": 0.6777828430058435,
    "val_loss": 1.2017838954925537,
    "val_obj": 0.05445696413516998,
    "val_cls": 0.05391830392181873,
    "val_reg": 1.0934085845947266,
    "val_duration_s": 0.2583821049774997,
    "lr_backbone": 0.0009509529358847651,
    "lr_head": 0.00946048229473242
  },
  {
    "train_loss": 1.1965500116348267,
    "train_obj": 0.0412014601752162,
    "train_cls": 0.04116522986441851,
    "train_reg": 1.1141833364963531,
    "train_duration_s": 0.669373465992976,
    "val_loss": 1.187636137008667,
    "val_obj": 0.04838640056550503,
    "val_cls": 0.048027750104665756,
    "val_reg": 1.0912219882011414,
    "val_duration_s": 0.24920378800015897,
    "lr_backbone": 0.0009443380060197382,
    "lr_head": 0.009387718066217124
  },
  {
    "train_loss": 1.1862647533416748,
    "train_obj": 0.03731937147676945,
    "train_cls": 0.03731080424040556,
    "train_reg": 1.111634612083435,
    "train_duration_s": 0.5997722870088182,
    "val_loss": 1.1731414198875427,
    "val_obj": 0.04178275540471077,
    "val_cls": 0.04168728552758694,
    "val_reg": 1.0896714329719543,
    "val_duration_s": 0.24751218897290528,
    "lr_backbone": 0.0009373339121517743,
    "lr_head": 0.00931067303366952
  },
  {
    "train_loss": 1.176796555519104,
    "train_obj": 0.034238395281136036,
    "train_cls": 0.0342704001814127,
    "train_reg": 1.1082877814769745,
    "train_duration_s": 0.6568289449787699,
    "val_loss": 1.1596794724464417,
    "val_obj": 0.0364920049905777,
    "val_cls": 0.03638902120292187,
    "val_reg": 1.0867984890937805,
    "val_duration_s": 0.2569818199845031,
    "lr_backbone": 0.0009299475664759064,
    "lr_head": 0.009229423231234974
  },
  {
    "train_loss": 1.1702686250209808,
    "train_obj": 0.031543857883661985,
    "train_cls": 0.031622535083442926,
    "train_reg": 1.1071022152900696,
    "train_duration_s": 0.5911949379951693,
    "val_loss": 1.1497790217399597,
    "val_obj": 0.03231072053313255,
    "val_cls": 0.032287606969475746,
    "val_reg": 1.085180640220642,
    "val_duration_s": 0.260109280992765,
    "lr_backbone": 0.0009221862584235525,
    "lr_head": 0.009144048842659082
  },
  {
    "train_loss": 1.1634765565395355,
    "train_obj": 0.029292386956512928,
    "train_cls": 0.029363614041358232,
    "train_reg": 1.1048205494880676,
    "train_duration_s": 0.6729130910243839,
    "val_loss": 1.1422107815742493,
    "val_obj": 0.02840624563395977,
    "val_cls": 0.028462432324886322,
    "val_reg": 1.0853421092033386,
    "val_duration_s": 0.2914826960186474,
    "lr_backbone": 0.000914057647468726,
    "lr_head": 0.00905463412215599
  },
  {
    "train_loss": 1.1571755409240723,
    "train_obj": 0.027506173588335514,
    "train_cls": 0.027546332217752934,
    "train_reg": 1.1021230220794678,
    "train_duration_s": 0.8187461510533467,
    "val_loss": 1.1355379819869995,
    "val_obj": 0.026020457036793232,
    "val_cls": 0.0261111157014966,
    "val_reg": 1.0834064483642578,
    "val_duration_s": 0.2657964060199447,
    "lr_backbone": 0.0009055697555690602,
    "lr_head": 0.008961267311259668
  },
  {
    "train_loss": 1.1517212688922882,
    "train_obj": 0.025889192707836628,
    "train_cls": 0.025969749316573143,
    "train_reg": 1.0998623073101044,
    "train_duration_s": 0.6456451660487801,
    "val_loss": 1.130097210407257,
    "val_obj": 0.023846335709095,
    "val_cls": 0.024003099650144577,
    "val_reg": 1.0822477340698242,
    "val_duration_s": 0.27159386803396046,
    "lr_backbone": 0.0008967309592491047,
    "lr_head": 0.008864040551740157
  },
  {
    "train_loss": 1.1463713943958282,
    "train_obj": 0.02448782743886113,
    "train_cls": 0.02456629741936922,
    "train_reg": 1.09731724858284,
    "train_duration_s": 0.713437256985344,
    "val_loss": 1.1271527409553528,
    "val_obj": 0.022731048986315727,
    "val_cls": 0.022902928292751312,
    "val_reg": 1.0815187692642212,
    "val_duration_s": 0.2560480340034701,
    "lr_backbone": 0.0008875499813337064,
    "lr_head": 0.008763049794670776
  },
  {
    "train_loss": 1.1435035169124603,
    "train_obj": 0.023242094088345766,
    "train_cls": 0.02333394857123494,
    "train_reg": 1.0969274640083313,
    "train_duration_s": 0.7019905269844458,
    "val_loss": 1.1234171390533447,
    "val_obj": 0.021137049421668053,
    "val_cls": 0.02136086765676737,
    "val_reg": 1.0809192061424255,
    "val_duration_s": 0.2609337499598041,
    "lr_backbone": 0.0008780358823396348,
    "lr_head": 0.008658394705735989
  },
  {
    "train_loss": 1.1391916871070862,
    "train_obj": 0.022210986353456974,
    "train_cls": 0.022295547649264336,
    "train_reg": 1.0946851670742035,
    "train_duration_s": 0.6334107420407236,
    "val_loss": 1.1215146780014038,
    "val_obj": 0.020163604989647865,
    "val_cls": 0.020295817404985428,
    "val_reg": 1.0810552835464478,
    "val_duration_s": 0.26535065000643954,
    "lr_backbone": 0.000868198051533946,
    "lr_head": 0.008550178566873411
  },
  {
    "train_loss": 1.135678470134735,
    "train_obj": 0.021255603525787592,
    "train_cls": 0.021334389690309763,
    "train_reg": 1.0930884778499603,
    "train_duration_s": 0.6903555839671753,
    "val_loss": 1.1178792715072632,
    "val_obj": 0.019626340828835964,
    "val_cls": 0.019849903881549835,
    "val_reg": 1.0784029960632324,
    "val_duration_s": 0.2606689669773914,
    "lr_backbone": 0.0008580461976679096,
    "lr_head": 0.00843850817434701
  },
  {
    "train_loss": 1.1331453323364258,
    "train_obj": 0.020402798429131508,
    "train_cls": 0.02049716142937541,
    "train_reg": 1.0922453701496124,
    "train_duration_s": 0.595257840002887,
    "val_loss": 1.1161731481552124,
    "val_obj": 0.01875163894146681,
    "val_cls": 0.01894507184624672,
    "val_reg": 1.0784764289855957,
    "val_duration_s": 0.2604646310210228,
    "lr_backbone": 0.000847590339395643,
    "lr_head": 0.008323493733352078
  },
  {
    "train_loss": 1.1292453408241272,
    "train_obj": 0.019646778237074614,
    "train_cls": 0.019741815980523825,
    "train_reg": 1.0898567736148834,
    "train_duration_s": 0.7415783550241031,
    "val_loss": 1.114890456199646,
    "val_obj": 0.018427678383886814,
    "val_cls": 0.018643934279680252,
    "val_reg": 1.0778188705444336,
    "val_duration_s": 0.2708348269807175,
    "lr_backbone": 0.0008368407953869101,
    "lr_head": 0.008205248749256017
  },
  {
    "train_loss": 1.1263790726661682,
    "train_obj": 0.01894296845421195,
    "train_cls": 0.019019571598619223,
    "train_reg": 1.0884165167808533,
    "train_duration_s": 0.7149691899539903,
    "val_loss": 1.1108486652374268,
    "val_obj": 0.01785500906407833,
    "val_cls": 0.01812076009809971,
    "val_reg": 1.07487291097641,
    "val_duration_s": 0.29119767202064395,
    "lr_backbone": 0.0008258081741438392,
    "lr_head": 0.008083889915582237
  },
  {
    "train_loss": 1.1243026554584503,
    "train_obj": 0.01837206445634365,
    "train_cls": 0.018441373482346535,
    "train_reg": 1.0874892473220825,
    "train_duration_s": 0.6409165539662354,
    "val_loss": 1.1089054942131042,
    "val_obj": 0.017330551519989967,
    "val_cls": 0.017551208846271038,
    "val_reg": 1.074023723602295,
    "val_duration_s": 0.293151288991794,
    "lr_backbone": 0.0008145033635316127,
    "lr_head": 0.007959536998847746
  },
  {
    "train_loss": 1.1221804916858673,
    "train_obj": 0.017732497304677963,
    "train_cls": 0.017798212356865406,
    "train_reg": 1.086649775505066,
    "train_duration_s": 0.6455984530039132,
    "val_loss": 1.1115541458129883,
    "val_obj": 0.01705250423401594,
    "val_cls": 0.01717187464237213,
    "val_reg": 1.0773298144340515,
    "val_duration_s": 0.29476229898864403,
    "lr_backbone": 0.0008029375200334586,
    "lr_head": 0.00783231272036805
  },
  {
    "train_loss": 1.119312822818756,
    "train_obj": 0.01723350491374731,
    "train_cls": 0.017300761304795742,
    "train_reg": 1.0847785472869873,
    "train_duration_s": 0.6360316770151258,
    "val_loss": 1.1099046468734741,
    "val_obj": 0.016366194933652878,
    "val_cls": 0.016478626057505608,
    "val_reg": 1.0770598649978638,
    "val_duration_s": 0.2638561730273068,
    "lr_backbone": 0.0007911220577405481,
    "lr_head": 0.007702342635146035
  },
  {
    "train_loss": 1.1168349385261536,
    "train_obj": 0.016793465241789818,
    "train_cls": 0.0168696534819901,
    "train_reg": 1.0831718146800995,
    "train_duration_s": 0.649400313035585,
    "val_loss": 1.1068586111068726,
    "val_obj": 0.015727512072771788,
    "val_cls": 0.015819577034562826,
    "val_reg": 1.0753114819526672,
    "val_duration_s": 0.25657067802967504,
    "lr_backbone": 0.0007790686370876667,
    "lr_head": 0.00756975500796434
  },
  {
    "train_loss": 1.1161218881607056,
    "train_obj": 0.016352036036551,
    "train_cls": 0.016421268228441477,
    "train_reg": 1.0833486020565033,
    "train_duration_s": 0.6371903790277429,
    "val_loss": 1.10858154296875,
    "val_obj": 0.015461993869394064,
    "val_cls": 0.015547101851552725,
    "val_reg": 1.0775724649429321,
    "val_duration_s": 0.25685583701124415,
    "lr_backbone": 0.0007667891533457716,
    "lr_head": 0.0074346806868034925
  },
  {
    "train_loss": 1.1138735115528107,
    "train_obj": 0.01592441461980343,
    "train_cls": 0.01599123142659664,
    "train_reg": 1.081957846879959,
    "train_duration_s": 0.6477988329716027,
    "val_loss": 1.1069831252098083,
    "val_obj": 0.015292371157556772,
    "val_cls": 0.015360193327069283,
    "val_reg": 1.076330542564392,
    "val_duration_s": 0.269641408987809,
    "lr_backbone": 0.0007542957248827957,
    "lr_head": 0.0072972529737107585
  },
  {
    "train_loss": 1.1117351353168488,
    "train_obj": 0.01553646707907319,
    "train_cls": 0.015595772303640842,
    "train_reg": 1.080602914094925,
    "train_duration_s": 0.5979714900022373,
    "val_loss": 1.102060079574585,
    "val_obj": 0.014808832202106714,
    "val_cls": 0.014897067565470934,
    "val_reg": 1.0723541975021362,
    "val_duration_s": 0.26181058498332277,
    "lr_backbone": 0.0007416006812042824,
    "lr_head": 0.007157607493247111
  },
  {
    "train_loss": 1.1095592975616455,
    "train_obj": 0.015203074552118778,
    "train_cls": 0.015272679040208459,
    "train_reg": 1.0790835320949554,
511
+ "train_duration_s": 0.706163518014364,
512
+ "val_loss": 1.1023824214935303,
513
+ "val_obj": 0.014575902372598648,
514
+ "val_cls": 0.014732892625033855,
515
+ "val_reg": 1.0730735659599304,
516
+ "val_duration_s": 0.26808085903758183,
517
+ "lr_backbone": 0.0007287165507856509,
518
+ "lr_head": 0.007015882058642166
519
+ },
520
+ {
521
+ "train_loss": 1.1087448596954346,
522
+ "train_obj": 0.014874656917527318,
523
+ "train_cls": 0.014934216160327196,
524
+ "train_reg": 1.078935980796814,
525
+ "train_duration_s": 0.6698619319940917,
526
+ "val_loss": 1.1041606068611145,
527
+ "val_obj": 0.014302242547273636,
528
+ "val_cls": 0.014425541274249554,
529
+ "val_reg": 1.0754327774047852,
530
+ "val_duration_s": 0.26492691203020513,
531
+ "lr_backbone": 0.0007156560487081049,
532
+ "lr_head": 0.006872216535789159
533
+ },
534
+ {
535
+ "train_loss": 1.1066694259643555,
536
+ "train_obj": 0.014565957942977548,
537
+ "train_cls": 0.014625045703724027,
538
+ "train_reg": 1.0774784088134766,
539
+ "train_duration_s": 0.6183141159708612,
540
+ "val_loss": 1.0991783142089844,
541
+ "val_obj": 0.013941123150289059,
542
+ "val_cls": 0.0140526806935668,
543
+ "val_reg": 1.071184515953064,
544
+ "val_duration_s": 0.26345953898271546,
545
+ "lr_backbone": 0.0007024320641103809,
546
+ "lr_head": 0.006726752705214195
547
+ },
548
+ {
549
+ "train_loss": 1.1053761541843414,
550
+ "train_obj": 0.014284016564488411,
551
+ "train_cls": 0.014327086973935366,
552
+ "train_reg": 1.076765090227127,
553
+ "train_duration_s": 0.6633686930290423,
554
+ "val_loss": 1.096693992614746,
555
+ "val_obj": 0.01376007916405797,
556
+ "val_cls": 0.013925152830779552,
557
+ "val_reg": 1.0690087676048279,
558
+ "val_duration_s": 0.2952593939844519,
559
+ "lr_backbone": 0.0006890576474687261,
560
+ "lr_head": 0.006579634122155992
561
+ },
562
+ {
563
+ "train_loss": 1.1040804982185364,
564
+ "train_obj": 0.013983281562104821,
565
+ "train_cls": 0.014028869336470962,
566
+ "train_reg": 1.0760683417320251,
567
+ "train_duration_s": 0.6530306320055388,
568
+ "val_loss": 1.100983440876007,
569
+ "val_obj": 0.01349617913365364,
570
+ "val_cls": 0.013564884662628174,
571
+ "val_reg": 1.0739223957061768,
572
+ "val_duration_s": 0.26189195999177173,
573
+ "lr_backbone": 0.0006755459977176529,
574
+ "lr_head": 0.006431005974894187
575
+ },
576
+ {
577
+ "train_loss": 1.1022354066371918,
578
+ "train_obj": 0.013759802794083953,
579
+ "train_cls": 0.013804371235892177,
580
+ "train_reg": 1.07467120885849,
581
+ "train_duration_s": 0.717769003007561,
582
+ "val_loss": 1.1016488671302795,
583
+ "val_obj": 0.013366122730076313,
584
+ "val_cls": 0.013427932281047106,
585
+ "val_reg": 1.074854850769043,
586
+ "val_duration_s": 0.262606100004632,
587
+ "lr_backbone": 0.0006619104492241843,
588
+ "lr_head": 0.0062810149414660316
589
+ },
590
+ {
591
+ "train_loss": 1.1011135876178741,
592
+ "train_obj": 0.01352710835635662,
593
+ "train_cls": 0.013579612132161856,
594
+ "train_reg": 1.0740068852901459,
595
+ "train_duration_s": 0.6559057739796117,
596
+ "val_loss": 1.1021131873130798,
597
+ "val_obj": 0.012921839021146297,
598
+ "val_cls": 0.01304272236302495,
599
+ "val_reg": 1.0761486291885376,
600
+ "val_duration_s": 0.25723185297101736,
601
+ "lr_backbone": 0.0006481644586284439,
602
+ "lr_head": 0.006129809044912888
603
+ },
604
+ {
605
+ "train_loss": 1.0994421243667603,
606
+ "train_obj": 0.013297662604600191,
607
+ "train_cls": 0.013349507935345173,
608
+ "train_reg": 1.0727949440479279,
609
+ "train_duration_s": 0.6866188669810072,
610
+ "val_loss": 1.0995287895202637,
611
+ "val_obj": 0.012860690709203482,
612
+ "val_cls": 0.01302369823679328,
613
+ "val_reg": 1.0736443996429443,
614
+ "val_duration_s": 0.2665874910308048,
615
+ "lr_backbone": 0.0006343215915635759,
616
+ "lr_head": 0.00597753750719934
617
+ },
618
+ {
619
+ "train_loss": 1.1004822850227356,
620
+ "train_obj": 0.013079148018732667,
621
+ "train_cls": 0.013128668069839478,
622
+ "train_reg": 1.0742745101451874,
623
+ "train_duration_s": 0.6200219679740258,
624
+ "val_loss": 1.0985276103019714,
625
+ "val_obj": 0.012564394157379866,
626
+ "val_cls": 0.01273611094802618,
627
+ "val_reg": 1.0732271075248718,
628
+ "val_duration_s": 0.25563636096194386,
629
+ "lr_backbone": 0.0006203955092681036,
630
+ "lr_head": 0.005824350601949145
631
+ },
632
+ {
633
+ "train_loss": 1.0982086956501007,
634
+ "train_obj": 0.012883494608104229,
635
+ "train_cls": 0.012929189717397094,
636
+ "train_reg": 1.0723960101604462,
637
+ "train_duration_s": 0.6565175089635886,
638
+ "val_loss": 1.096903681755066,
639
+ "val_obj": 0.012423050589859486,
640
+ "val_cls": 0.012553584761917591,
641
+ "val_reg": 1.0719270706176758,
642
+ "val_duration_s": 0.25993295601801947,
643
+ "lr_backbone": 0.0006063999551039368,
644
+ "lr_head": 0.005670399506143309
645
+ },
646
+ {
647
+ "train_loss": 1.0966042578220367,
648
+ "train_obj": 0.012703181942924857,
649
+ "train_cls": 0.012756945798173547,
650
+ "train_reg": 1.0711441040039062,
651
+ "train_duration_s": 0.6426404590019956,
652
+ "val_loss": 1.092538833618164,
653
+ "val_obj": 0.012324787676334381,
654
+ "val_cls": 0.01240294799208641,
655
+ "val_reg": 1.0678110718727112,
656
+ "val_duration_s": 0.2594450740143657,
657
+ "lr_backbone": 0.0005923487409933312,
658
+ "lr_head": 0.005515836150926648
659
+ },
660
+ {
661
+ "train_loss": 1.0957514941692352,
662
+ "train_obj": 0.012526726815849543,
663
+ "train_cls": 0.012576338136568666,
664
+ "train_reg": 1.070648431777954,
665
+ "train_duration_s": 0.6147687300108373,
666
+ "val_loss": 1.098196268081665,
667
+ "val_obj": 0.0122212003916502,
668
+ "val_cls": 0.012353878933936357,
669
+ "val_reg": 1.073621153831482,
670
+ "val_duration_s": 0.2527942130109295,
671
+ "lr_backbone": 0.0005782557337881908,
672
+ "lr_head": 0.005360813071670104
673
+ },
674
+ {
675
+ "train_loss": 1.094252109527588,
676
+ "train_obj": 0.012377958744764328,
677
+ "train_cls": 0.012425957014784217,
678
+ "train_reg": 1.0694482028484344,
679
+ "train_duration_s": 0.6785857279901393,
680
+ "val_loss": 1.0995318293571472,
681
+ "val_obj": 0.01210054662078619,
682
+ "val_cls": 0.01217495696619153,
683
+ "val_reg": 1.07525634765625,
684
+ "val_duration_s": 0.251013888977468,
685
+ "lr_backbone": 0.0005641348415851575,
686
+ "lr_head": 0.005205483257436737
687
+ },
688
+ {
689
+ "train_loss": 1.0935681462287903,
690
+ "train_obj": 0.01223357580602169,
691
+ "train_cls": 0.012275451328605413,
692
+ "train_reg": 1.0690591037273407,
693
+ "train_duration_s": 0.5970722349593416,
694
+ "val_loss": 1.0947916507720947,
695
+ "val_obj": 0.011603347025811672,
696
+ "val_cls": 0.011669113300740719,
697
+ "val_reg": 1.0715191960334778,
698
+ "val_duration_s": 0.2520985459559597,
699
+ "lr_backbone": 0.0005499999999999997,
700
+ "lr_head": 0.005050000000000002
701
+ },
702
+ {
703
+ "train_loss": 1.0919917225837708,
704
+ "train_obj": 0.012040137080475688,
705
+ "train_cls": 0.0120892443228513,
706
+ "train_reg": 1.067862331867218,
707
+ "train_duration_s": 0.5623678660485893,
708
+ "val_loss": 1.0910890102386475,
709
+ "val_obj": 0.011488612275570631,
710
+ "val_cls": 0.011575881857424974,
711
+ "val_reg": 1.0680245161056519,
712
+ "val_duration_s": 0.24487946403678507,
713
+ "lr_backbone": 0.000535865158414842,
714
+ "lr_head": 0.004894516742563268
715
+ },
716
+ {
717
+ "train_loss": 1.09165158867836,
718
+ "train_obj": 0.011925761355087161,
719
+ "train_cls": 0.01197828957810998,
720
+ "train_reg": 1.0677475035190582,
721
+ "train_duration_s": 0.6613560570403934,
722
+ "val_loss": 1.0951207280158997,
723
+ "val_obj": 0.011421271599829197,
724
+ "val_cls": 0.011516725644469261,
725
+ "val_reg": 1.0721827149391174,
726
+ "val_duration_s": 0.2617891860427335,
727
+ "lr_backbone": 0.0005217442662118087,
728
+ "lr_head": 0.004739186928329901
729
+ },
730
+ {
731
+ "train_loss": 1.090124636888504,
732
+ "train_obj": 0.011804590234532952,
733
+ "train_cls": 0.01185443508438766,
734
+ "train_reg": 1.0664656162261963,
735
+ "train_duration_s": 0.7332402160391212,
736
+ "val_loss": 1.098381221294403,
737
+ "val_obj": 0.011585013940930367,
738
+ "val_cls": 0.011710518971085548,
739
+ "val_reg": 1.075085699558258,
740
+ "val_duration_s": 0.25381539697991684,
741
+ "lr_backbone": 0.0005076512590066682,
742
+ "lr_head": 0.004584163849073356
743
+ },
744
+ {
745
+ "train_loss": 1.089239090681076,
746
+ "train_obj": 0.01168104913085699,
747
+ "train_cls": 0.01173282298259437,
748
+ "train_reg": 1.0658252239227295,
749
+ "train_duration_s": 0.721786139998585,
750
+ "val_loss": 1.0967810153961182,
751
+ "val_obj": 0.011446605902165174,
752
+ "val_cls": 0.011577938217669725,
753
+ "val_reg": 1.073756456375122,
754
+ "val_duration_s": 0.2623707629973069,
755
+ "lr_backbone": 0.0004936000448960629,
756
+ "lr_head": 0.0044296004938566965
757
+ },
758
+ {
759
+ "train_loss": 1.0874614119529724,
760
+ "train_obj": 0.011547031346708536,
761
+ "train_cls": 0.011599087854847312,
762
+ "train_reg": 1.064315289258957,
763
+ "train_duration_s": 0.6615807820344344,
764
+ "val_loss": 1.098645806312561,
765
+ "val_obj": 0.011310380883514881,
766
+ "val_cls": 0.011397925205528736,
767
+ "val_reg": 1.0759375095367432,
768
+ "val_duration_s": 0.25482997501967475,
769
+ "lr_backbone": 0.00047960449073189583,
770
+ "lr_head": 0.004275649398050859
771
+ },
772
+ {
773
+ "train_loss": 1.0874675512313843,
774
+ "train_obj": 0.011456360341981053,
775
+ "train_cls": 0.011506328359246254,
776
+ "train_reg": 1.0645048320293427,
777
+ "train_duration_s": 0.7092476990073919,
778
+ "val_loss": 1.0954411029815674,
779
+ "val_obj": 0.011209859512746334,
780
+ "val_cls": 0.011301718186587095,
781
+ "val_reg": 1.0729295015335083,
782
+ "val_duration_s": 0.25820010603638366,
783
+ "lr_backbone": 0.00046567840843642365,
784
+ "lr_head": 0.004122462492800665
785
+ },
786
+ {
787
+ "train_loss": 1.0867061614990234,
788
+ "train_obj": 0.011337194824591279,
789
+ "train_cls": 0.011388532817363739,
790
+ "train_reg": 1.0639804005622864,
791
+ "train_duration_s": 0.7600768760312349,
792
+ "val_loss": 1.092163324356079,
793
+ "val_obj": 0.011121332179754972,
794
+ "val_cls": 0.011220748536288738,
795
+ "val_reg": 1.0698212385177612,
796
+ "val_duration_s": 0.24854238500120118,
797
+ "lr_backbone": 0.0004518355413715557,
798
+ "lr_head": 0.003970190955087117
799
+ },
800
+ {
801
+ "train_loss": 1.0855147242546082,
802
+ "train_obj": 0.011251067044213414,
803
+ "train_cls": 0.01130642113275826,
804
+ "train_reg": 1.062957227230072,
805
+ "train_duration_s": 0.5924538800027221,
806
+ "val_loss": 1.0908886194229126,
807
+ "val_obj": 0.010713071562349796,
808
+ "val_cls": 0.01079885195940733,
809
+ "val_reg": 1.0693767070770264,
810
+ "val_duration_s": 0.2614563400275074,
811
+ "lr_backbone": 0.00043808955077581505,
812
+ "lr_head": 0.0038189850585339698
813
+ },
814
+ {
815
+ "train_loss": 1.0854755640029907,
816
+ "train_obj": 0.011138282716274261,
817
+ "train_cls": 0.011196570470929146,
818
+ "train_reg": 1.063140720129013,
819
+ "train_duration_s": 0.6342899229493923,
820
+ "val_loss": 1.0924379229545593,
821
+ "val_obj": 0.010924513451755047,
822
+ "val_cls": 0.010975581593811512,
823
+ "val_reg": 1.0705378651618958,
824
+ "val_duration_s": 0.25443152902880684,
825
+ "lr_backbone": 0.0004244540022823466,
826
+ "lr_head": 0.003668994025105817
827
+ },
828
+ {
829
+ "train_loss": 1.0838780403137207,
830
+ "train_obj": 0.011061613215133548,
831
+ "train_cls": 0.011112936306744814,
832
+ "train_reg": 1.0617035031318665,
833
+ "train_duration_s": 0.6520260230172426,
834
+ "val_loss": 1.0966288447380066,
835
+ "val_obj": 0.010903269052505493,
836
+ "val_cls": 0.010974057950079441,
837
+ "val_reg": 1.0747514963150024,
838
+ "val_duration_s": 0.25178048899397254,
839
+ "lr_backbone": 0.00041094235253127357,
840
+ "lr_head": 0.003520365877844013
841
+ },
842
+ {
843
+ "train_loss": 1.0844574570655823,
844
+ "train_obj": 0.010999046266078949,
845
+ "train_cls": 0.01103839767165482,
846
+ "train_reg": 1.0624200403690338,
847
+ "train_duration_s": 0.726587513985578,
848
+ "val_loss": 1.0942053198814392,
849
+ "val_obj": 0.010877885855734348,
850
+ "val_cls": 0.01097976602613926,
851
+ "val_reg": 1.072347640991211,
852
+ "val_duration_s": 0.2589392969966866,
853
+ "lr_backbone": 0.00039756793588961866,
854
+ "lr_head": 0.003373247294785809
855
+ },
856
+ {
857
+ "train_loss": 1.0838465988636017,
858
+ "train_obj": 0.010919145308434963,
859
+ "train_cls": 0.010967453708872199,
860
+ "train_reg": 1.0619600117206573,
861
+ "train_duration_s": 0.6386721750022843,
862
+ "val_loss": 1.0933923125267029,
863
+ "val_obj": 0.010830741375684738,
864
+ "val_cls": 0.010916871018707752,
865
+ "val_reg": 1.0716447234153748,
866
+ "val_duration_s": 0.2731027320260182,
867
+ "lr_backbone": 0.00038434395129189486,
868
+ "lr_head": 0.0032277834642108465
869
+ },
870
+ {
871
+ "train_loss": 1.0825149714946747,
872
+ "train_obj": 0.01084492146037519,
873
+ "train_cls": 0.01089848205447197,
874
+ "train_reg": 1.0607715547084808,
875
+ "train_duration_s": 0.5869391480227932,
876
+ "val_loss": 1.0929441452026367,
877
+ "val_obj": 0.010647550225257874,
878
+ "val_cls": 0.010688469745218754,
879
+ "val_reg": 1.0716081261634827,
880
+ "val_duration_s": 0.254797725006938,
881
+ "lr_backbone": 0.0003712834492143485,
882
+ "lr_head": 0.0030841179413578362
883
+ },
884
+ {
885
+ "train_loss": 1.0817044377326965,
886
+ "train_obj": 0.010774188674986362,
887
+ "train_cls": 0.010819253977388144,
888
+ "train_reg": 1.0601109862327576,
889
+ "train_duration_s": 0.6478097430081107,
890
+ "val_loss": 1.0884382724761963,
891
+ "val_obj": 0.010617390740662813,
892
+ "val_cls": 0.010685324668884277,
893
+ "val_reg": 1.0671355724334717,
894
+ "val_duration_s": 0.25446167401969433,
895
+ "lr_backbone": 0.0003583993187957171,
896
+ "lr_head": 0.0029423925067528912
897
+ },
898
+ {
899
+ "train_loss": 1.080708533525467,
900
+ "train_obj": 0.010708574671298265,
901
+ "train_cls": 0.010766215156763792,
902
+ "train_reg": 1.0592337250709534,
903
+ "train_duration_s": 0.6646836430300027,
904
+ "val_loss": 1.0874670147895813,
905
+ "val_obj": 0.01057868218049407,
906
+ "val_cls": 0.010633224621415138,
907
+ "val_reg": 1.0662551522254944,
908
+ "val_duration_s": 0.24431595200439915,
909
+ "lr_backbone": 0.00034570427511720377,
910
+ "lr_head": 0.0028027470262892446
911
+ },
912
+ {
913
+ "train_loss": 1.0805813074111938,
914
+ "train_obj": 0.010638549691066146,
915
+ "train_cls": 0.0106942153070122,
916
+ "train_reg": 1.05924853682518,
917
+ "train_duration_s": 0.6282756889704615,
918
+ "val_loss": 1.0960173606872559,
919
+ "val_obj": 0.01041469257324934,
920
+ "val_cls": 0.01053499523550272,
921
+ "val_reg": 1.0750676989555359,
922
+ "val_duration_s": 0.2510881240013987,
923
+ "lr_backbone": 0.00033321084665422786,
924
+ "lr_head": 0.0026653193131965097
925
+ },
926
+ {
927
+ "train_loss": 1.0813067555427551,
928
+ "train_obj": 0.010572672821581364,
929
+ "train_cls": 0.010619609151035547,
930
+ "train_reg": 1.060114473104477,
931
+ "train_duration_s": 0.5874384730122983,
932
+ "val_loss": 1.0937111973762512,
933
+ "val_obj": 0.010667934082448483,
934
+ "val_cls": 0.01077366154640913,
935
+ "val_reg": 1.0722696185112,
936
+ "val_duration_s": 0.24807530600810423,
937
+ "lr_backbone": 0.0003209313629123327,
938
+ "lr_head": 0.002530244992035663
939
+ },
940
+ {
941
+ "train_loss": 1.0799786746501923,
942
+ "train_obj": 0.010522706666961312,
943
+ "train_cls": 0.010575430700555444,
944
+ "train_reg": 1.0588805079460144,
945
+ "train_duration_s": 0.5833320490201004,
946
+ "val_loss": 1.0950977206230164,
947
+ "val_obj": 0.010467872489243746,
948
+ "val_cls": 0.010559740476310253,
949
+ "val_reg": 1.074070155620575,
950
+ "val_duration_s": 0.27491738699609414,
951
+ "lr_backbone": 0.00030887794225945123,
952
+ "lr_head": 0.0023976573648539666
953
+ },
954
+ {
955
+ "train_loss": 1.0784000158309937,
956
+ "train_obj": 0.010487049585208297,
957
+ "train_cls": 0.010533560067415237,
958
+ "train_reg": 1.0573793947696686,
959
+ "train_duration_s": 0.6865631359978579,
960
+ "val_loss": 1.0941764116287231,
961
+ "val_obj": 0.010453047696501017,
962
+ "val_cls": 0.010582357179373503,
963
+ "val_reg": 1.0731409788131714,
964
+ "val_duration_s": 0.2480600299895741,
965
+ "lr_backbone": 0.000297062479966541,
966
+ "lr_head": 0.0022676872796319543
967
+ },
968
+ {
969
+ "train_loss": 1.0783608257770538,
970
+ "train_obj": 0.010425732005387545,
971
+ "train_cls": 0.010479548247531056,
972
+ "train_reg": 1.0574555397033691,
973
+ "train_duration_s": 0.5828883610083722,
974
+ "val_loss": 1.0952659845352173,
975
+ "val_obj": 0.010551390703767538,
976
+ "val_cls": 0.01067020883783698,
977
+ "val_reg": 1.074044406414032,
978
+ "val_duration_s": 0.3004449989530258,
979
+ "lr_backbone": 0.000285496636468387,
980
+ "lr_head": 0.0021404630011522597
981
+ },
982
+ {
983
+ "train_loss": 1.0789496004581451,
984
+ "train_obj": 0.01039236900396645,
985
+ "train_cls": 0.010446090484037995,
986
+ "train_reg": 1.0581111311912537,
987
+ "train_duration_s": 0.636702542018611,
988
+ "val_loss": 1.0927815437316895,
989
+ "val_obj": 0.010326582007110119,
990
+ "val_cls": 0.010452665854245424,
991
+ "val_reg": 1.0720022916793823,
992
+ "val_duration_s": 0.286596956953872,
993
+ "lr_backbone": 0.0002741918258561604,
994
+ "lr_head": 0.002016110084417767
995
+ },
996
+ {
997
+ "train_loss": 1.0784678757190704,
998
+ "train_obj": 0.010355423670262098,
999
+ "train_cls": 0.010415055323392153,
1000
+ "train_reg": 1.0576973855495453,
1001
+ "train_duration_s": 0.670749606040772,
1002
+ "val_loss": 1.0917505025863647,
1003
+ "val_obj": 0.010243823751807213,
1004
+ "val_cls": 0.0102810924872756,
1005
+ "val_reg": 1.0712255835533142,
1006
+ "val_duration_s": 0.28577073698397726,
1007
+ "lr_backbone": 0.0002631592046130895,
1008
+ "lr_head": 0.0018947512507439866
1009
+ },
1010
+ {
1011
+ "train_loss": 1.0774260759353638,
1012
+ "train_obj": 0.010301690315827727,
1013
+ "train_cls": 0.010350615251809359,
1014
+ "train_reg": 1.0567737817764282,
1015
+ "train_duration_s": 0.6408779509947635,
1016
+ "val_loss": 1.0891329050064087,
1017
+ "val_obj": 0.010188950225710869,
1018
+ "val_cls": 0.010258184745907784,
1019
+ "val_reg": 1.06868577003479,
1020
+ "val_duration_s": 0.25149831897579134,
1021
+ "lr_backbone": 0.00025240966060435663,
1022
+ "lr_head": 0.0017765062666479248
1023
+ },
1024
+ {
1025
+ "train_loss": 1.077297866344452,
1026
+ "train_obj": 0.010261506540700793,
1027
+ "train_cls": 0.010318043641746044,
1028
+ "train_reg": 1.0567183196544647,
1029
+ "train_duration_s": 0.5692864760058001,
1030
+ "val_loss": 1.087362289428711,
1031
+ "val_obj": 0.01012952160090208,
1032
+ "val_cls": 0.010204828344285488,
1033
+ "val_reg": 1.0670279264450073,
1034
+ "val_duration_s": 0.27686441596597433,
1035
+ "lr_backbone": 0.00024195380233208997,
1036
+ "lr_head": 0.0016614918256529917
1037
+ },
1038
+ {
1039
+ "train_loss": 1.076753944158554,
1040
+ "train_obj": 0.010230402229353786,
1041
+ "train_cls": 0.010283713694661856,
1042
+ "train_reg": 1.0562398135662079,
1043
+ "train_duration_s": 0.6017453559907153,
1044
+ "val_loss": 1.088329255580902,
1045
+ "val_obj": 0.01021224306896329,
1046
+ "val_cls": 0.010249018669128418,
1047
+ "val_reg": 1.0678679943084717,
1048
+ "val_duration_s": 0.2567237349576317,
1049
+ "lr_backbone": 0.0002318019484660536,
1050
+ "lr_head": 0.001549821433126591
1051
+ },
1052
+ {
1053
+ "train_loss": 1.0760108828544617,
1054
+ "train_obj": 0.010215932503342628,
1055
+ "train_cls": 0.01026985957287252,
1056
+ "train_reg": 1.0555251240730286,
1057
+ "train_duration_s": 0.6224485790007748,
1058
+ "val_loss": 1.0940824151039124,
1059
+ "val_obj": 0.009986888617277145,
1060
+ "val_cls": 0.010065140668302774,
1061
+ "val_reg": 1.0740303993225098,
1062
+ "val_duration_s": 0.29469223803607747,
1063
+ "lr_backbone": 0.00022196411766036485,
1064
+ "lr_head": 0.0014416052942640147
1065
+ },
1066
+ {
1067
+ "train_loss": 1.0762045085430145,
1068
+ "train_obj": 0.010187903651967645,
1069
+ "train_cls": 0.010244743432849646,
1070
+ "train_reg": 1.0557718575000763,
1071
+ "train_duration_s": 0.5926657950039953,
1072
+ "val_loss": 1.0940901637077332,
1073
+ "val_obj": 0.010075243655592203,
1074
+ "val_cls": 0.010163875762373209,
1075
+ "val_reg": 1.0738510489463806,
1076
+ "val_duration_s": 0.2580490759573877,
1077
+ "lr_backbone": 0.00021245001866629313,
1078
+ "lr_head": 0.0013369502053292258
1079
+ },
1080
+ {
1081
+ "train_loss": 1.0768308341503143,
1082
+ "train_obj": 0.010161776561290026,
1083
+ "train_cls": 0.01022589672356844,
1084
+ "train_reg": 1.0564431846141815,
1085
+ "train_duration_s": 0.6551358830183744,
1086
+ "val_loss": 1.0939911603927612,
1087
+ "val_obj": 0.010133659932762384,
1088
+ "val_cls": 0.0102345016784966,
1089
+ "val_reg": 1.07362300157547,
1090
+ "val_duration_s": 0.29113550001056865,
1091
+ "lr_backbone": 0.00020326904075089485,
1092
+ "lr_head": 0.0012359594482598444
1093
+ },
1094
+ {
1095
+ "train_loss": 1.0753932297229767,
1096
+ "train_obj": 0.010130809852853417,
1097
+ "train_cls": 0.010180694749578834,
1098
+ "train_reg": 1.055081695318222,
1099
+ "train_duration_s": 0.6152588650002144,
1100
+ "val_loss": 1.0921041369438171,
1101
+ "val_obj": 0.009994576219469309,
1102
+ "val_cls": 0.010113760363310575,
1103
+ "val_reg": 1.0719957947731018,
1104
+ "val_duration_s": 0.26981011603493243,
1105
+ "lr_backbone": 0.0001944302444309393,
1106
+ "lr_head": 0.0011387326887403332
1107
+ },
1108
+ {
1109
+ "train_loss": 1.075397551059723,
1110
+ "train_obj": 0.010107427136972547,
1111
+ "train_cls": 0.01016512792557478,
1112
+ "train_reg": 1.0551249980926514,
1113
+ "train_duration_s": 0.6341630900278687,
1114
+ "val_loss": 1.092411994934082,
1115
+ "val_obj": 0.010178622789680958,
1116
+ "val_cls": 0.01029387442395091,
1117
+ "val_reg": 1.071939468383789,
1118
+ "val_duration_s": 0.26076968701090664,
1119
+ "lr_backbone": 0.00018594235253127368,
1120
+ "lr_head": 0.0010453658778440112
1121
+ },
1122
+ {
1123
+ "train_loss": 1.0759019255638123,
1124
+ "train_obj": 0.010073260869830847,
1125
+ "train_cls": 0.010128507856279612,
1126
+ "train_reg": 1.0557001829147339,
1127
+ "train_duration_s": 0.6586457779631019,
1128
+ "val_loss": 1.0933113098144531,
1129
+ "val_obj": 0.010056169237941504,
1130
+ "val_cls": 0.010130063630640507,
1131
+ "val_reg": 1.0731250643730164,
1132
+ "val_duration_s": 0.2609155040117912,
1133
+ "lr_backbone": 0.0001778137415764472,
1134
+ "lr_head": 0.0009559511573409199
1135
+ },
1136
+ {
1137
+ "train_loss": 1.0736617147922516,
1138
+ "train_obj": 0.010056819068267941,
1139
+ "train_cls": 0.010112468153238297,
1140
+ "train_reg": 1.0534924268722534,
1141
+ "train_duration_s": 0.7188562179799192,
1142
+ "val_loss": 1.0909684300422668,
1143
+ "val_obj": 0.010229456704109907,
1144
+ "val_cls": 0.010270406492054462,
1145
+ "val_reg": 1.070468544960022,
1146
+ "val_duration_s": 0.2786150809843093,
1147
+ "lr_backbone": 0.00017005243352409327,
1148
+ "lr_head": 0.0008705767687650269
1149
+ },
1150
+ {
1151
+ "train_loss": 1.0741604566574097,
1152
+ "train_obj": 0.010056846775114536,
1153
+ "train_cls": 0.01011435384862125,
1154
+ "train_reg": 1.0539892315864563,
1155
+ "train_duration_s": 0.6007559039862826,
1156
+ "val_loss": 1.0907172560691833,
1157
+ "val_obj": 0.010114184580743313,
1158
+ "val_cls": 0.010199124924838543,
1159
+ "val_reg": 1.0704039335250854,
1160
+ "val_duration_s": 0.282732998020947,
1161
+ "lr_backbone": 0.00016266608784822528,
1162
+ "lr_head": 0.0007893269663304788
1163
+ },
1164
+ {
1165
+ "train_loss": 1.073331505060196,
1166
+ "train_obj": 0.010053273756057024,
1167
+ "train_cls": 0.010111166629940271,
1168
+ "train_reg": 1.053167074918747,
1169
+ "train_duration_s": 0.5945498479995877,
1170
+ "val_loss": 1.0940272212028503,
1171
+ "val_obj": 0.010090101510286331,
1172
+ "val_cls": 0.010209796950221062,
1173
+ "val_reg": 1.073727309703827,
1174
+ "val_duration_s": 0.27672431099927053,
1175
+ "lr_backbone": 0.00015566199398026135,
1176
+ "lr_head": 0.0007122819337828756
1177
+ },
1178
+ {
1179
+ "train_loss": 1.0739899277687073,
1180
+ "train_obj": 0.010033116675913334,
1181
+ "train_cls": 0.010085215559229255,
1182
+ "train_reg": 1.0538715720176697,
1183
+ "train_duration_s": 0.5826158979907632,
1184
+ "val_loss": 1.0947926044464111,
1185
+ "val_obj": 0.009957995265722275,
1186
+ "val_cls": 0.010084709618240595,
1187
+ "val_reg": 1.0747498869895935,
1188
+ "val_duration_s": 0.2896934269811027,
1189
+ "lr_backbone": 0.00014904706411523447,
1190
+ "lr_head": 0.0006395177052675797
1191
+ },
1192
+ {
1193
+ "train_loss": 1.0756808519363403,
1194
+ "train_obj": 0.010029536904767156,
1195
+ "train_cls": 0.010087540140375495,
1196
+ "train_reg": 1.0555637776851654,
1197
+ "train_duration_s": 0.6605965399648994,
1198
+ "val_loss": 1.0930689573287964,
1199
+ "val_obj": 0.009971830062568188,
1200
+ "val_cls": 0.010093713644891977,
1201
+ "val_reg": 1.0730034112930298,
1202
+ "val_duration_s": 0.2566853419994004,
1203
+ "lr_backbone": 0.00014282782639029126,
1204
+ "lr_head": 0.0005711060902932044
1205
+ },
1206
+ {
1207
+ "train_loss": 1.074019432067871,
1208
+ "train_obj": 0.009983490919694304,
1209
+ "train_cls": 0.010042589157819748,
1210
+ "train_reg": 1.0539933443069458,
1211
+ "train_duration_s": 0.6085525890230201,
1212
+ "val_loss": 1.093157708644867,
1213
+ "val_obj": 0.010002858936786652,
1214
+ "val_cls": 0.010100905783474445,
1215
+ "val_reg": 1.0730539560317993,
1216
+ "val_duration_s": 0.2617441299953498,
1217
+ "lr_backbone": 0.00013701041844220855,
1218
+ "lr_head": 0.0005071146028642946
1219
+ },
1220
+ {
1221
+ "train_loss": 1.0729943811893463,
1222
+ "train_obj": 0.009978290181607008,
1223
+ "train_cls": 0.010032006539404392,
1224
+ "train_reg": 1.0529840886592865,
1225
+ "train_duration_s": 0.5984370200312696,
1226
+ "val_loss": 1.0916684865951538,
1227
+ "val_obj": 0.009926843922585249,
1228
+ "val_cls": 0.010000049602240324,
1229
+ "val_reg": 1.0717415809631348,
1230
+ "val_duration_s": 0.2610543890041299,
1231
+ "lr_backbone": 0.00013160058135028687,
1232
+ "lr_head": 0.00044760639485315595
1233
+ },
1234
+ {
1235
+ "train_loss": 1.0726170241832733,
1236
+ "train_obj": 0.009990750113502145,
1237
+ "train_cls": 0.010056375758722425,
1238
+ "train_reg": 1.05256986618042,
1239
+ "train_duration_s": 0.6084673820296302,
1240
+ "val_loss": 1.0910411477088928,
1241
+ "val_obj": 0.009894676506519318,
1242
+ "val_cls": 0.009971363469958305,
1243
+ "val_reg": 1.0711750984191895,
1244
+ "val_duration_s": 0.2592308110324666,
1245
+ "lr_backbone": 0.00012660365397059852,
1246
+ "lr_head": 0.0003926401936765841
1247
+ },
1248
+ {
1249
+ "train_loss": 1.0729031264781952,
1250
+ "train_obj": 0.009963654447346926,
1251
+ "train_cls": 0.01002525701187551,
1252
+ "train_reg": 1.0529142022132874,
1253
+ "train_duration_s": 0.638382842997089,
1254
+ "val_loss": 1.092672049999237,
1255
+ "val_obj": 0.009945690631866455,
1256
+ "val_cls": 0.010007770732045174,
1257
+ "val_reg": 1.0727185606956482,
1258
+ "val_duration_s": 0.24983607500325888,
1259
+ "lr_backbone": 0.00012202456766718089,
1260
+ "lr_head": 0.0003422702443389901
1261
+ },
1262
+ {
1263
+ "train_loss": 1.0738952159881592,
1264
+ "train_obj": 0.009953690459951758,
1265
+ "train_cls": 0.010014323517680168,
1266
+ "train_reg": 1.0539271831512451,
1267
+ "train_duration_s": 0.6041064040036872,
1268
+ "val_loss": 1.0932231545448303,
1269
+ "val_obj": 0.009914373513311148,
1270
+ "val_cls": 0.010006637778133154,
1271
+ "val_reg": 1.073302149772644,
1272
+ "val_duration_s": 0.25684191699838266,
1273
+ "lr_backbone": 0.00011786784144537566,
1274
+ "lr_head": 0.00029654625589913247
1275
+ },
1276
+ {
1277
+ "train_loss": 1.07302787899971,
1278
+ "train_obj": 0.009954406414180994,
1279
+ "train_cls": 0.010016958694905043,
1280
+ "train_reg": 1.0530565083026886,
1281
+ "train_duration_s": 0.6564388199476525,
1282
+ "val_loss": 1.0898024439811707,
1283
+ "val_obj": 0.010050252545624971,
+ "val_cls": 0.010167492553591728,
+ "val_reg": 1.0695846676826477,
+ "val_duration_s": 0.2662258879863657,
+ "lr_backbone": 0.00011413757749211605,
+ "lr_head": 0.0002555133524132768
+ },
+ {
+ "train_loss": 1.0738294422626495,
+ "train_obj": 0.009950487175956368,
+ "train_cls": 0.01001494494266808,
+ "train_reg": 1.0538640022277832,
+ "train_duration_s": 0.6045228720176965,
+ "val_loss": 1.0913274884223938,
+ "val_obj": 0.010098685510456562,
+ "val_cls": 0.010210573207587004,
+ "val_reg": 1.0710182785987854,
+ "val_duration_s": 0.2637907830066979,
+ "lr_backbone": 0.00011083745712756369,
+ "lr_head": 0.0002192120284032008
+ },
+ {
+ "train_loss": 1.0736822187900543,
+ "train_obj": 0.009967819089069963,
+ "train_cls": 0.010026306379586458,
+ "train_reg": 1.0536880791187286,
+ "train_duration_s": 0.6839520380017348,
+ "val_loss": 1.090309739112854,
+ "val_obj": 0.01013380428776145,
+ "val_cls": 0.010297831613570452,
+ "val_reg": 1.069878101348877,
+ "val_duration_s": 0.2530603529885411,
+ "lr_backbone": 0.00010797073717209007,
+ "lr_head": 0.00018767810889299088
+ },
+ {
+ "train_loss": 1.0723851025104523,
+ "train_obj": 0.009945232421159744,
+ "train_cls": 0.009999534348025918,
+ "train_reg": 1.052440345287323,
+ "train_duration_s": 0.5901737670064904,
+ "val_loss": 1.0915802121162415,
+ "val_obj": 0.010108488146215677,
+ "val_cls": 0.010223244316875935,
+ "val_reg": 1.071248471736908,
+ "val_duration_s": 0.25922492099925876,
+ "lr_backbone": 0.00010554024673218805,
+ "lr_head": 0.00016094271405406863
+ },
+ {
+ "train_loss": 1.073385775089264,
+ "train_obj": 0.009947009617462754,
+ "train_cls": 0.01000955537892878,
+ "train_reg": 1.0534292161464691,
+ "train_duration_s": 0.5940516680129804,
+ "val_loss": 1.0927410125732422,
+ "val_obj": 0.010028076823800802,
+ "val_cls": 0.01014708075672388,
+ "val_reg": 1.0725658535957336,
+ "val_duration_s": 0.25309768196893856,
+ "lr_backbone": 0.000103548384408485,
+ "lr_head": 0.0001390322284933351
+ },
+ {
+ "train_loss": 1.0736308693885803,
+ "train_obj": 0.009940758347511292,
+ "train_cls": 0.009997991379350424,
+ "train_reg": 1.053692102432251,
+ "train_duration_s": 0.6786643419764005,
+ "val_loss": 1.089927077293396,
+ "val_obj": 0.010120156221091747,
+ "val_cls": 0.010192124638706446,
+ "val_reg": 1.069614827632904,
+ "val_duration_s": 0.24875205097487196,
+ "lr_backbone": 0.000101997115928614,
+ "lr_head": 0.00012196827521475404
+ },
+ {
+ "train_loss": 1.0726266205310822,
+ "train_obj": 0.009929253719747066,
+ "train_cls": 0.009985961951315403,
+ "train_reg": 1.052711397409439,
+ "train_duration_s": 0.6735394049901515,
+ "val_loss": 1.088631808757782,
+ "val_obj": 0.009968684520572424,
+ "val_cls": 0.010040625929832458,
+ "val_reg": 1.0686225295066833,
+ "val_duration_s": 0.27209188998676836,
+ "lr_backbone": 0.0001008879722072778,
+ "lr_head": 0.00010976769428005579
+ },
+ {
+ "train_loss": 1.0724263489246368,
+ "train_obj": 0.009923317935317755,
+ "train_cls": 0.009979999158531427,
+ "train_reg": 1.0525230169296265,
+ "train_duration_s": 0.6331688630161807,
+ "val_loss": 1.0892634987831116,
+ "val_obj": 0.010087566915899515,
+ "val_cls": 0.010170313064008951,
+ "val_reg": 1.0690056085586548,
+ "val_duration_s": 0.40267478697933257,
+ "lr_backbone": 0.00010022204783542078,
+ "lr_head": 0.00010244252618962859
+ },
+ {
+ "train_loss": 1.0715472996234894,
+ "train_obj": 0.00991839193738997,
+ "train_cls": 0.009972854517400265,
+ "train_reg": 1.0516560673713684,
+ "train_duration_s": 0.5961767269764096,
+ "val_loss": 1.090360939502716,
+ "val_obj": 0.009901531971991062,
+ "val_cls": 0.009973867796361446,
+ "val_reg": 1.070485532283783,
+ "val_duration_s": 0.2593897999613546,
+ "lr_backbone": 0.0001,
+ "lr_head": 0.0001
+ }
+ ]
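The records above are the tail of a per-epoch training history: each entry carries train/val loss components (`obj`, `cls`, `reg`), step durations, and the decaying per-group learning rates for backbone and head. As a minimal sketch (not part of the repo's tooling), selecting the checkpoint epoch by lowest `val_loss` from such a history can be done with the standard library alone:

```python
# Minimal sketch: pick the best epoch from a training-history list like
# the JSON array above. Only "val_loss" and "lr_head" are used here;
# the records in the actual log carry more keys.
def best_epoch(history):
    """Return (index, record) of the epoch with the lowest val_loss."""
    idx = min(range(len(history)), key=lambda i: history[i]["val_loss"])
    return idx, history[idx]

# Two records copied from the log above, trimmed to the relevant keys.
history = [
    {"val_loss": 1.0892634987831116, "lr_head": 0.00010244252618962859},
    {"val_loss": 1.090360939502716, "lr_head": 0.0001},
]
idx, rec = best_epoch(history)
print(idx, round(rec["val_loss"], 4))  # -> 0 1.0893
```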
onnx/fenris_v0.1.0.onnx ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:50dfccbb474b044a25c4ee03b26b7370c5acf5c13d9747708c3e0823057fcf6e
+ size 473293
onnx/fenris_v0.1.0.onnx.data ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:03b9b04232e89d80f2d00711201885fccda00d73ccc254856a45642bec96db9d
+ size 12386304
onnx/fenris_v0.1.0_allinone.onnx ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:aab42616dc3619e8b305d1b7960510e699f7b09053383605c3fa172b79d5ef36
+ size 12851926
pytorch/fenris_v0.1.0.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f43d531fd64b1f7470687b7056328547aa1f8aa35b957e3c24f2c8fac3bd8ddf
+ size 12571083
pytorch/fenris_v0.1.0.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3d7b62c2bee148962d86c9dbdba75d759f3fd35b5a567e0d43de3f0fdbfcdf87
+ size 12487872
tensorrt/fenris_v0.1.0_fp16.engine ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:380de29737c4d1415c71c9f5331a8d4d119d277e956ce4523c604bd427f9f0c0
+ size 8712836
tensorrt/fenris_v0.1.0_fp16.engine.json ADDED
@@ -0,0 +1,9 @@
+ {
+ "mocked": false,
+ "format": "tensorrt_engine",
+ "precision": "fp16",
+ "max_batch_size": 8,
+ "onnx_path": "/mnt/artifacts-datai/exports/project_fenris/onnx/fenris_v0.1.0_allinone.onnx",
+ "source_checkpoint": "/mnt/artifacts-datai/checkpoints/project_fenris/best.pt",
+ "source_config": "/mnt/forge-data/modules/02_defense/project_fenris/configs/base.yaml"
+ }
tensorrt/fenris_v0.1.0_fp32.engine ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:77a25de86912f0f5c0621388254d1bf980c3940e8036afba12dafd1071402e0e
+ size 17576068
tensorrt/fenris_v0.1.0_fp32.engine.json ADDED
@@ -0,0 +1,9 @@
+ {
+ "mocked": false,
+ "format": "tensorrt_engine",
+ "precision": "fp32",
+ "max_batch_size": 8,
+ "onnx_path": "/mnt/artifacts-datai/exports/project_fenris/onnx/fenris_v0.1.0_allinone.onnx",
+ "source_checkpoint": "/mnt/artifacts-datai/checkpoints/project_fenris/best.pt",
+ "source_config": "/mnt/forge-data/modules/02_defense/project_fenris/configs/base.yaml"
+ }
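The binary artifacts in this commit (`.onnx.data`, `.pth`, `.safetensors`, `.engine`) are checked in as Git LFS pointer files with the three-line `version` / `oid` / `size` format shown above. As a minimal sketch (not a full LFS client), such a pointer can be parsed with the standard library to recover the SHA-256 digest and byte size before fetching the real blob:

```python
# Minimal sketch: parse a Git LFS pointer file of the form shown above,
# where each line is "<key> <value>" (version, oid sha256:<hex>, size).
def parse_lfs_pointer(text):
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# Pointer contents copied from tensorrt/fenris_v0.1.0_fp16.engine above.
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:380de29737c4d1415c71c9f5331a8d4d119d277e956ce4523c604bd427f9f0c0
size 8712836
"""
meta = parse_lfs_pointer(pointer)
print(meta["size"])  # -> 8712836
```

The parsed `size` can be checked against the downloaded file's byte count, and the hex digest after `sha256:` against `hashlib.sha256` of its contents, to verify the blob matches the pointer.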