Correlation Loss: Enforcing Correlation between Classification and Localization

Fehmi Kahraman*,1, Kemal Oksuz*,1, Sinan Kalkan†,1,2, Emre Akbas†,1,2
1Dept. of Computer Engineering, 2METU Center for Robotics and Artificial Intelligence (ROMER)
Middle East Technical University (METU), Ankara, Turkey
{fehmi.kahraman_01, kemal.oksuz, skalkan, eakbas}@metu.edu.tr

Abstract

Object detectors are conventionally trained by a weighted sum of classification and localization losses. Recent studies (e.g., predicting IoU with an auxiliary head, Generalized Focal Loss, Rank & Sort Loss) have shown that forcing these two loss terms to interact with each other in non-conventional ways creates a useful inductive bias and improves performance. Inspired by these works, we focus on the correlation between classification and localization and make two main contributions: (i) We provide an analysis of the effects of correlation between the classification and localization tasks in object detectors. We identify why correlation affects the performance of various NMS-based and NMS-free detectors, and we devise measures to evaluate the effect of correlation and use them to analyze common detectors. (ii) Motivated by our observations, e.g., that NMS-free detectors can also benefit from correlation, we propose Correlation Loss, a novel plug-in loss function that improves the performance of various object detectors by directly optimizing correlation coefficients: e.g., Correlation Loss on Sparse R-CNN, an NMS-free method, yields a 1.6 AP gain on COCO and a 1.8 AP gain on the Cityscapes dataset. Our best model on Sparse R-CNN reaches 51.0 AP without test-time augmentation on COCO test-dev, reaching the state of the art. Code is available at: https://github.com/fehmikahraman/CorrLoss

1 Introduction

Most object detectors optimize a weighted sum of classification and localization losses during training. Results from recent work suggest that performance improves when these two loss functions are forced to interact with each other in non-conventional ways, as illustrated in Fig. 1.
For example, training an auxiliary (aux.) head to regress the localization qualities of the positive examples, e.g. centerness, IoU or mask-IoU, has proven useful (Jiang et al. 2018; Kim and Lee 2020; Tian et al. 2019; Zhang et al. 2020) (Fig. 1(b)).

*These authors contributed equally. †Equal contribution for senior authorship.
Copyright © 2023, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
¹In the rest of the paper, "correlation" will refer to the correlation between classification scores and IoUs.
Other methods remove such auxiliary heads and aim to directly enforce correlation¹ in the classification or localization task during training; e.g., Average LRP Loss (Oksuz et al. 2020) weighs the examples in the localization task by ranking them with respect to (wrt.) their classification scores (Fig. 1(c)). Using localization quality as an additional supervision signal for classification has been more commonly adopted (Fig. 1(d)) (Li et al. 2020; Liu et al. 2021; Oksuz et al. 2021a; Zhang et al. 2021) in two main ways: (i) score-based approaches aim to regress the localization qualities (Li et al. 2019, 2020; Zhang et al. 2021) in the classification score, and (ii) ranking-based approaches enforce the classifier to rank the confidence scores wrt. the localization qualities (Liu et al. 2021; Oksuz et al. 2021a). Improving correlation seems to have a positive effect on the performance of a variety of object detectors, as shown in Fig. 2.
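The ranking-based objective above can be probed with a rank correlation such as Spearman's rho, which equals 1 exactly when the confidence scores order the detections the same way their localization qualities do. A minimal pure-Python sketch (the helper names are ours, not from the paper):

```python
def rank(values):
    """Average 1-based ranks; tied values share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0  # mean of the 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(scores, ious):
    """Spearman's rho: Pearson correlation of the two rank vectors."""
    rs, ri = rank(scores), rank(ious)
    n = len(rs)
    ms, mi = sum(rs) / n, sum(ri) / n
    cov = sum((a - ms) * (b - mi) for a, b in zip(rs, ri))
    ss = sum((a - ms) ** 2 for a in rs) ** 0.5
    si = sum((b - mi) ** 2 for b in ri) ** 0.5
    return cov / (ss * si)

# Concordant orderings give rho close to 1; reversed orderings close to -1.
rho = spearman([0.9, 0.5, 0.2], [0.8, 0.6, 0.1])
```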
However, the effect of correlation on object detectors has not been thoroughly studied. We fill this gap in this paper and first identify that correlation affects the performance of object detectors at two levels: (i) image-level correlation, the correlation between the classification scores and localization qualities (i.e., IoU for the rest of the paper) of the detections in a single image before post-processing, which is important for NMS performance; and (ii) class-level correlation, the correlation over the entire dataset for each class after post-processing, which is related to COCO-style Average Precision (AP). Moreover, we quantitatively define correlation at each level to enable analyses of how well an object detector captures correlation (e.g., βcls in Fig. 2(a)).

Then, we provide an analysis of both levels of correlation and draw important observations using common models. Finally, to better exploit correlation, we introduce a more direct mechanism to enforce correlation: Correlation Loss, a simple plug-in and detector-independent loss term (Fig. 1(e)) that improves performance for a wide range of object detectors, including NMS-free detectors, in line with our analysis (Fig. 2(b)).
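For intuition only (the paper's precise definitions of the image-level and class-level measures appear in Section 3.1), the two levels can be sketched as Pearson correlations computed over different pools of detections; the data layout and function names below are our assumptions:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def image_level_corr(per_image_dets):
    """Average per-image correlation over (score, IoU) pairs before NMS."""
    vals = [pearson([s for s, _ in dets], [iou for _, iou in dets])
            for dets in per_image_dets if len(dets) > 1]
    return sum(vals) / len(vals)

def class_level_corr(scores, ious):
    """Correlation over all post-NMS detections of one class in the dataset."""
    return pearson(scores, ious)
```

Image-level correlation matters before NMS, since NMS keeps the top-scoring detection of each overlapping cluster; class-level correlation matters after NMS, since AP ranks the surviving detections by score.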
Similar to the novel loss functions (Li et al. 2020; Oksuz et al. 2021a; Zhang et al. 2021), our Correlation Loss boosts performance without an auxiliary head; but, different from them, it is a simple plug-in technique that can easily be incorporated into any object detector, whether NMS-based or NMS-free.

Our main contributions are: (1) We identify how correlation affects NMS-based and NMS-free detectors, and design quantitative measures to analyze a detector wrt. correlation. (2) We analyze the effects of correlation at different levels on various object detectors. (3) We propose Correlation Loss as a plug-in loss function to optimize correlation explicitly. Thanks to its simplicity, our loss function can easily be incorporated into a diverse set of object detectors and improves the performance of, e.g., Sparse R-CNN by up to 1.6 AP and 2.0 AP75, suggesting, for the first time, that NMS-free detectors can also benefit from correlation. Our best model yields 51.0 AP, reaching the state of the art.

arXiv:2301.01019v1 [cs.CV] 3 Jan 2023

[Figure 1 diagram: five pipeline variants (a)-(e), each with classification (Cls) and localization (Loc) heads, their losses, and the outputs ŝ (classification scores), B̂ (box coordinates) and ℓ̂ (localization quality, e.g. centerness).]

Figure 1: Different ways of handling the classification and localization tasks from the perspective of correlation. (a) Conventional case of optimizing the two tasks independently (e.g., Chen et al. 2020; Sun et al. 2021b). (b) An additional auxiliary head predicts centerness (Zhang et al. 2020) or IoU (Jiang et al. 2018; Kim and Lee 2020), which introduces additional learnable parameters. (c) Novel loss functions replace the standard localization loss (Oksuz et al. 2020) or (d) the standard classification loss (Li et al. 2020; Oksuz et al. 2021a) by more complicated ones to leverage correlation. (e) Our Correlation Loss explicitly optimizes a correlation coefficient. It is a simple, plug-in loss function which does not introduce additional parameters and has the flexibility to supervise the classification or localization head, as well as both. Black and colored arrows respectively denote the loss functions (i.e., during training) and the network outputs (i.e., during inference).

[Figure 2 plots: (a) APC vs. class-level correlation βcls for different ATSS settings; (b) APC of the baselines ATSS, Sparse R-CNN and YOLACT with and without our Correlation Loss.]

Figure 2: (a) Detection performance, measured by COCO-style AP (APC), vs. correlation quality, measured by class-level correlation (βcls; see Section 3.1 for details). The methods proposed to improve the correlation between the classification and localization tasks also improve APC. Compare using an aux. head, QFL or RS Loss with the baseline ATSS using only Focal Loss (FL) (all red dots) to see the positive correlation between APC and βcls. Our Correlation Loss, as a plug-in loss function, explicitly optimizes a correlation coefficient and consistently improves the detection performance (APC) over different settings of ATSS (i.e., using FL, aux. head, QFL, RS Loss) owing to increasing βcls, validating our hypothesis (compare green stars with red dots). (b) Our Correlation Loss is simple to use and improves various methods: (i) NMS-based ATSS (w/o aux. head) by 1.1 APC (+2.8%), (ii) NMS-free Sparse R-CNN by 1.6 APC (+4.2%), and (iii) YOLACT, an instance segmentation method, by 0.7 APC (+2.4%).
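A direct way to optimize a correlation coefficient, in the spirit of the proposed loss, is to penalize 1 - rho between the classification scores and IoUs of the positive detections; inside an autodiff framework, the same arithmetic stays differentiable. The sketch below only illustrates the scalar value such a term takes; the weight lam and the way it is combined with the task losses are our assumptions, not the paper's exact formulation:

```python
def pearson(xs, ys):
    """Pearson correlation; an epsilon guards against zero variance."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy + 1e-12)

def correlation_loss(scores, ious):
    """1 - rho: zero when scores and IoUs are perfectly correlated."""
    return 1.0 - pearson(scores, ious)

def total_loss(l_cls, l_loc, scores, ious, lam=1.0):
    """Plug-in combination: the usual task losses plus the correlation term."""
    return l_cls + l_loc + lam * correlation_loss(scores, ious)
```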
their usage of NMS (Fig. 3 presents overview & notation):

1. NMS-based Detectors. To detect all objects with different scales, locations and aspect ratios, most methods (He et al. 2017; Kong et al. 2020; Law and Deng 2018; Lin et al. 2020; Ren et al. 2017; Tian et al. 2019; Zhang et al. 2020) employ a large number of object hypotheses (e.g., anchors, points), which are labeled as positive (a.k.a.
foreground) or negative (a.k.a. background) during training, based on whether/how they match GT boxes (Zhang et al. 2020, 2019). In this setting, there is no restriction for an object to be predicted by multiple object hypotheses, causing duplicates. Accordingly, during inference, NMS picks the detection with the largest confidence score among the detections that overlap more than a predetermined IoU threshold, thereby avoiding duplicate detections.

2. NMS-free Detectors. An emerging research direction is to remove the need for NMS, simplifying the detection pipeline (Carion et al. 2020; Dai et al. 2021; Roh et al. 2022; Sun et al. 2021b,a; Zhu et al. 2021). This is achieved by ensuring a one-to-one matching between the GTs and detections, which supervises the detector to avoid duplicates in the first place.
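To make concrete how NMS uses confidence scores to resolve duplicates, here is a minimal greedy NMS sketch; the NumPy implementation, the (x1, y1, x2, y2) box format and the 0.5 IoU threshold are illustrative choices, not details from the paper:

```python
import numpy as np

def iou_one_to_many(box, boxes):
    """IoU between one box and an array of boxes, all in (x1, y1, x2, y2)."""
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_a = (box[2] - box[0]) * (box[3] - box[1])
    area_b = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_thr=0.5):
    """Greedy NMS: repeatedly keep the highest-scoring remaining box and
    suppress every other box overlapping it by more than iou_thr."""
    order = np.argsort(scores)[::-1]          # indices, best score first
    keep = []
    while order.size > 0:
        best, rest = order[0], order[1:]
        keep.append(int(best))
        order = rest[iou_one_to_many(boxes[best], boxes[rest]) <= iou_thr]
    return keep
```

Among overlapping boxes, only the score decides which one survives; the survivor is also the best-localized box only when scores and IoUs are positively correlated.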
Methods Enforcing Correlation. One common way to ensure correlation is to use an additional auxiliary head, supervised by the localization quality of a detection, such as centerness (Tian et al. 2019; Zhang et al. 2020), IoU (Jiang et al. 2018), mask IoU (Huang et al. 2019) or uncertainty (He et al. 2019), during training. During inference, the predictions of the auxiliary head are then combined with those of the classifier to improve detection performance. Recent methods show that the auxiliary head can be removed, and either (i) the regressor can prioritize the positive examples (Oksuz et al. 2020) or (ii) the classifier can be supervised such that its confidence scores prioritize well-localized detections. The latter is ensured either by regressing the IoUs with the classifier (Li et al. 2020; Zhang et al. 2021) or by training the classifier to rank confidence scores (Liu et al. 2021; Oksuz et al. 2021a) wrt. IoUs. Unlike these methods, TOOD (Feng et al. 2021) takes correlation into account mainly while designing the model, particularly the detection head, i.e., not the loss function.
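As an illustration of the first strategy (regressing IoUs with the classifier), here is a sketch in the spirit of Quality Focal Loss (Li et al. 2020); the formulation in that paper differs in details (e.g., it operates on logits over dense predictions), so treat this single-detection scalar version as schematic:

```python
import math

def quality_focal_term(pred_score, iou_target, beta=2.0):
    """Cross-entropy against a *soft* IoU target, down-weighted by
    |target - prediction|^beta, so the loss vanishes exactly when the
    classifier's score equals the localization quality of the box."""
    s, y = pred_score, iou_target
    bce = -(y * math.log(s) + (1.0 - y) * math.log(1.0 - s))
    return abs(y - s) ** beta * bce
```

The minimum is attained when the predicted score equals the box's IoU, so minimizing this term pushes confidence scores toward localization quality, which is precisely the correlation being enforced.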
Figure 3: Object detection pipeline and notation. Given an input image, I, NMS-based detectors yield raw detections before post-processing, each of which has a predicted bounding box (BB) and an array of confidence scores over GT classes. We denote the confidence scores and the predicted BBs pertaining to the positive detections, i.e., the detections matching with GT objects during training, by ˆs^I_pre and ˆB^I_pre, respectively. To obtain final detections, raw detections are post-processed in three steps: (i) detections with low confidence scores, i.e., background, are removed; (ii) duplicates are eliminated by NMS; and (iii) the top-k scoring detections are kept. As for these final detections, we denote the confidence scores and BBs of true positive detections for class c in a single image I by ˆs^{I,c}_post and ˆB^{I,c}_post respectively, and over the entire dataset by ˆs^c_post and ˆB^c_post. As for NMS-free detectors, NMS (the dashed gray box in post-processing) is excluded, hence post-processing is lighter.

Correlation Coefficients. Correlation coefficients measure the strength and direction of the “relation” between two sets, X = {x_1, ..., x_N} and Y = {y_1, ..., y_N}. Different relations are evaluated by different correlation coefficients: (i) the Pearson correlation coefficient, denoted by α(·, ·), measures the linear relationship between the sets; (ii) the Spearman correlation coefficient, β(·, ·), corresponds to the ranking relationship; and (iii) the Concordance correlation coefficient, γ(·, ·), is more strict, measuring the similarity of the values themselves and maximized when x_i = y_i for all i ∈ {1, ..., N}. All correlation coefficients have a range of [−1, +1], where positive/negative correlation corresponds to an increasing/decreasing relation, while 0 implies no correlation between X and Y.
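Assuming NumPy, the three coefficients can be sketched as follows (Spearman computed as Pearson on ranks, with ties ignored for brevity; γ is Lin's concordance correlation coefficient):

```python
import numpy as np

def pearson(x, y):
    """alpha(., .): strength of the linear relationship."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc * yc).sum() / np.sqrt((xc ** 2).sum() * (yc ** 2).sum())

def spearman(x, y):
    """beta(., .): Pearson correlation of the ranks (ties ignored)."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return pearson(rank(x), rank(y))

def concordance(x, y):
    """gamma(., .): Lin's concordance correlation coefficient; equals 1
    only when x_i == y_i for all i."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    cov = ((x - x.mean()) * (y - y.mean())).mean()   # population covariance
    return 2 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)
```

For example, with x = (1, 2, 3, 4) and y = 2x the relation is increasing and linear, so α = β = 1, yet γ = 0.4 < 1 because the values themselves disagree, illustrating how γ is the strictest of the three.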
Comparative Summary. In this paper, we comprehensively identify and analyze the effect of explicitly correlating classification and localization in object detectors. Unlike other methods that also enforce correlation, some of which are tested only on a single architecture (Huang et al. 2019; Jiang et al. 2018; Tian et al. 2019), we propose a simple solution by directly optimizing the correlation coefficient, which is auxiliary-head free and easily applicable to all object detectors, whether NMS-based or NMS-free. Also, ours is the first to work on NMS-free detectors in this context.

3 Effects of Correlation on Object Detectors

This section presents why maximizing correlation is important for object detectors, introduces measures to evaluate object detectors wrt. correlation, and provides an analysis of methods designed for improving correlation.
3.1 How Correlation Affects Object Detectors

Detectors are affected by correlation at two levels (Fig. 4):

Image-level Correlation. This level of correlation corresponds to the correlation between the classification scores and IoUs of the detections in a single image before post-processing; accordingly, we measure it with the Spearman correlation coefficient², β(·, ·), averaged over images. Denoting the set of images to be evaluated by I, and the IoUs between the BBs of the positive detections (ˆB^I_pre, Fig. 3) and their associated GTs by IoU^I_pre, image-level correlation is measured as follows:

\beta_{img} = \frac{1}{|\mathcal{I}|} \sum_{I \in \mathcal{I}} \beta(\mathrm{IoU}^{I}_{pre}, \hat{s}^{I}_{pre}).   (1)

Maximizing image-level correlation is important for NMS-based detectors since NMS aims to suppress duplicates, i.e., to keep only a single detection for each GT when there is more than one. More particularly, among overlapping detections (e.g., the dark and light green detections in the detector output image in Fig. 4(a)), NMS picks the one with the larger score; hence, if there is positive correlation between the confidence scores and IoUs of those overlapping detections, then the one with the best IoU (e.g., the dark green detection in Fig. 4(a)) will survive and detection performance will increase.

²While analyzing object detectors in terms of correlation, we employ the Spearman correlation coefficient, β(·, ·), to measure the relation between the ranks of the values (i.e., scores and IoUs) instead of the values themselves, and aim to isolate the correlation quality from the localization and classification performances.
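Eq. 1 is a direct average of per-image Spearman coefficients; a sketch, assuming the IoUs and scores of positive detections have already been collected per image (Spearman again implemented via Pearson on ranks, ties ignored, to keep the sketch dependency-free):

```python
import numpy as np

def spearman(x, y):
    """Spearman's beta: Pearson correlation computed on ranks (ties ignored)."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    rx = rank(x) - (len(x) - 1) / 2.0   # centered ranks
    ry = rank(y) - (len(y) - 1) / 2.0
    return (rx * ry).sum() / np.sqrt((rx ** 2).sum() * (ry ** 2).sum())

def beta_img(ious_per_image, scores_per_image):
    """Eq. 1: Spearman correlation between IoUs and confidence scores of
    the positive detections, averaged over the image set."""
    vals = [spearman(i, s) for i, s in zip(ious_per_image, scores_per_image)]
    return float(np.mean(vals))
```

The class-level measure of Eq. 2 below is computed analogously, averaging over the per-class sets of true positives instead of per-image sets.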
Class-level Correlation. This level of correlation indicates the correlation between the classification scores and IoUs of the detections obtained after post-processing for each class. Since class-level correlation is related to COCO-style AP, APC, we average β(·, ·) over classes to be consistent with the computation of APC:

\beta_{cls} = \frac{1}{|\mathcal{C}|} \sum_{c \in \mathcal{C}} \beta(\mathrm{IoU}^{c}_{post}, \hat{s}^{c}_{post}),   (2)

where C is the set of classes in the dataset and IoU^c_post is the set of IoUs of the BBs of true positives for class c (ˆB^c_post, Fig. 3). Class-level correlation affects the performance of all detectors since it is directly related to APC, the performance measure itself. To be more specific, APC for a single class is defined as the average of APs computed over 10 different IoU thresholds, IoU ∈ {0.
50, 0.55, ..., 0.95}, validating the true positives. For a specific IoU threshold, the detections are first sorted with respect to the classification scores, and then precision and recall pairs are calculated at each detection. Using these pairs, a precision-recall (PR) curve is obtained, and the area under the PR curve corresponds to the single AP value, AP_IoU. When the correlation between classification and localization is maximized among the true positives, larger precision values are obtained on the same detections at larger IoU thresholds (e.g., P75 of the orange detection is 1.00 with positive and 0.20 with negative correlation in Fig. 4(b)).

Figure 4: How correlation affects detection performance. (a) Image-level correlation: Given detections before post-processing, NMS benefits from image-level correlation, thereby yielding detections with better IoU. Compare the IoUs of detections in the “positively correlated” (i.e., when the dark-colored ones have the larger score) and “negatively correlated” (i.e., when the light-colored ones have the larger score) outputs after NMS. (b) Class-level correlation: Given detections after post-processing, APs at larger IoUs, and hence COCO-style AP, benefit from positive class-level correlation (compare the AP_IoU columns of the “positively correlated” and “negatively correlated” outputs after AP calculation to see the lower AP75 for the “negatively correlated” output, in the red cell). P_IoU: precision computed on a detection using the threshold IoU. True positives are color-coded in the tables and the input; white cells are false positives, hence their IoU is not available (N/A).
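The AP_IoU computation just described can be sketched as follows. This uses plain per-true-positive precision averaging, which reproduces the AP values shown in Fig. 4(b); the official COCO evaluator instead uses 101-point interpolation, so treat this as an illustration:

```python
import numpy as np

def ap_at_iou(scores, ious, iou_thr, num_gt):
    """AP at one IoU threshold: sort detections by confidence score, label
    each as TP (IoU >= iou_thr) or FP, then average the precision observed
    at each TP rank over the number of ground truths."""
    order = np.argsort(scores)[::-1]                       # score-descending
    tp = (np.asarray(ious, float)[order] >= iou_thr).astype(float)
    precision = np.cumsum(tp) / (np.arange(len(tp)) + 1.0)
    # undetected ground truths contribute zero precision
    return float((precision * tp).sum() / num_gt)
```

With the detections of Fig. 4(b) (scores 0.80, 0.70, 0.60, 0.55, 0.50; TP IoUs 0.80, 0.60, 0.50 plus two false positives, and three ground truths), positive correlation gives AP75 ≈ 0.33 while negative correlation gives AP75 ≈ 0.07, whereas AP50 ≈ 0.76 in both cases: reordering scores among true positives only hurts the stricter thresholds.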
3.2 Analyses of Object Detectors wrt. Correlation

Dataset and Implementation Details. Unless otherwise specified, we (i) employ the widely-used COCO dataset (Lin et al. 2014) by training the models on trainval35K (115K images), testing on minival (5K images) and comparing with the SOTA on test-dev (20K images); (ii) build upon the mmdetection framework (Chen et al. 2019); (iii) rely on AP-based measures and also use Optimal LRP (oLRP) (Oksuz et al. 2021b), β_img (Eq. 1) and β_cls (Eq. 2) to provide more insights; (iv) keep the standard configuration of the models; (v) use a ResNet-50 backbone with FPN (Lin et al. 2017); and (vi) train the models on 4 GPUs (A100 or V100) with 4 images on each GPU (batch size 16).

Analysis Setup. We conduct experiments to analyze the effects of the image-level (β_img; Table 1) and class-level (β_cls; Table 2) correlations. For both analyses, we compare three sets of methods, all of which are incorporated into the common ATSS baseline (Zhang et al. 2020) (see Sec.
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2 for a discussion of these methods): (i) AP Loss and Focal Loss as methods not enforcing correlation, (ii) using an auxiliary head to enforce correlation, and (iii) Quality Fo- cal Loss (QFL), aLRP Loss and Rank & Sort Loss as recent loss functions enforcing correlation.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In our class-level anal- ysis, we also employ NMS-free methods to demonstrate the effects of correlation on that approach.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' We compare the methods based on (i) their AP-based per- formance, (ii) our proposed measures for correlation (Eqs.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 1 and 2), and finally (iii) lower/upper bounds, AP+1 C /AP−1 C , obtained by modifying the ranking of the confidence scores pertaining to the GT classes of the positive detections to minimize/maximize Eq.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 1 in Table 1 and Eq.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2 in Table 2.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' More particularly, in Table 1, given ˆsI pre and ˆBI pre (Fig.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 3), we collect the GT class probabilities of positive detections and change their ranking in ˆsI pre within an image follow- ing the ranking order of IoUs (computed using ˆBI pre), and in Table 2, we do the same operation class-wise for true posi- tives given ˆsc post and ˆBc post (Fig.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 3).' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' To decouple other types of errors as much as possible;' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' in Table 1, we do not modify the scores of the negative detections, the predicted BBs and the scores of non-GT classes of the positive detections, and in Table 2, we do not modify the scores of the false positives and the predicted BBs of the true positives.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Note that achiev- ing the upper bound in (iii) for image-level correlation also corresponds to perfectly minimizing RS Loss.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Observations.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' We observe in Tables 1 and 2 that: (1) Our proposed measures in Eqs.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 1 and 2 can measure the improvements in correlation consistently.' 
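The definitions of βimg and βcls (Eqs. 1 and 2) are not reproduced in this excerpt. Purely as an illustrative sketch, an image-level measure of this kind can be computed by averaging, over images, a correlation coefficient between the GT-class scores and the IoUs of the positive detections; the Pearson coefficient and the function names below are our assumptions, not necessarily the paper's definition:

```python
import numpy as np

def pearson_corr(x, y):
    """Pearson correlation coefficient between two 1-D arrays."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    denom = np.sqrt((xm ** 2).sum() * (ym ** 2).sum())
    return float((xm * ym).sum() / denom)

def image_level_correlation(scores_per_image, ious_per_image):
    """Average, over images, of the correlation between the GT-class
    confidence scores and the IoUs of the positive detections.
    Illustrative only; not the paper's Eq. 1."""
    corrs = [pearson_corr(s, i)
             for s, i in zip(scores_per_image, ious_per_image)
             if len(s) > 1]  # correlation is undefined for a single detection
    return float(np.mean(corrs))
```

A rank (Spearman) coefficient would serve equally well here; the point is only that such a measure increases as high scores align with high IoUs, which is what the tables report.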
In Tables 1 and 2, (i) aLRP Loss and RS Loss were proposed to improve AP Loss, and (ii) the aux. head and QFL were proposed to improve Focal Loss. In both tables, the proposed methods improve their baselines in terms of βimg and βcls, suggesting that our measures can consistently evaluate image-level and class-level correlations respectively.

(2) NMS-free detectors can also potentially benefit from correlation. All detectors, including NMS-free ones, can exploit class-level correlation (compare AP_C and AP^+1_C to see the ~10-point gap in Table 2). Still, existing methods do not enforce this correlation on NMS-free detectors.

(3) Existing methods enforcing correlation still leave large room for improvement. Considering that βimg ∈ [27.2%, 33.8%] (Table 1) and βcls ∈ [37.5%, 47.0%] (Table 2), there is still room for improvement wrt. correlation.

                                         Performance              Modify ranking of scores
Method                                   AP_C  AP50  AP75  βimg   AP^-1_C AP^-1_50 AP^-1_75  AP^+1_C AP^+1_50 AP^+1_75
Not Enforcing Correlation
  ATSS w. AP Loss (Chen et al. 2020)     38.1  58.2  41.0  27.2   24.9    53.2     19.2      57.0    72.4     62.2
  ATSS w. Focal Loss (Lin et al. 2020)   38.7  57.6  41.5  27.3   25.6    51.8     21.1      55.8    70.6     60.5
Using Aux. Head
  ATSS w. ctr. head (Zhang et al. 2020)  39.3  57.5  42.6  28.7   16.8    32.4     15.3      49.8    64.8     54.2
Using Novel Loss
  ATSS w. aLRP Loss (Oksuz et al. 2020)  37.7  57.4  39.9  33.8   22.7    48.8     17.5      54.2    70.4     58.7
  ATSS w. QFL (Li et al. 2020)           39.7  58.1  42.7  33.2   25.7    51.1     21.9      55.8    70.9     60.6
  ATSS w. RS Loss (Oksuz et al. 2021a)   39.9  58.9  42.6  30.9   26.2    53.9     21.3      57.1    71.8     62.1

Table 1: Evaluation of NMS-based detectors in terms of image-level correlation. See Eq. 1 for βimg. AP^+1_IoU and AP^-1_IoU refer to the upper and lower bound APs (see the analysis setup for details). Values are in %. Our βimg captures correlation consistently; e.g., (i) Focal Loss is improved by the ctr. head and QFL, and (ii) AP Loss is improved by aLRP Loss and RS Loss wrt. βimg. Also, there is still room for improvement for object detectors wrt. βimg, whose range here is between 27.2% and 33.8%.
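The upper/lower-bound columns follow from the score re-ranking described in the analysis setup: the set of GT-class scores of the positives is kept fixed, but its values are reassigned so that their ranking follows (upper bound) or inverts (lower bound) the ranking of the IoUs. A minimal sketch of this reassignment, under our own function and parameter names:

```python
import numpy as np

def rerank_scores(scores, ious, maximize=True):
    """Reassign the same set of confidence scores to the positive
    detections so that their ranking follows (maximize=True) or
    inverts (maximize=False) the ranking of the IoUs.

    Only the assignment of the existing score values changes; the
    score set and the boxes (hence the IoUs) stay fixed, mirroring
    the AP^+1/AP^-1 bound construction described in the text."""
    scores = np.asarray(scores, dtype=float)
    ious = np.asarray(ious, dtype=float)
    sorted_scores = np.sort(scores)[::-1]   # score values, best first
    iou_order = np.argsort(ious)[::-1]      # detection indices by IoU, best first
    if not maximize:
        iou_order = iou_order[::-1]         # invert for the lower bound
    out = np.empty_like(scores)
    out[iou_order] = sorted_scores          # best score -> best (or worst) IoU
    return out
```

Feeding the re-ranked scores through the unchanged AP computation then yields the AP^+1 and AP^-1 columns of the tables.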
                                         Performance              Modify ranking of scores
Method                                   AP_C  AP50  AP75  βcls   AP^-1_C AP^-1_50 AP^-1_75  AP^+1_C AP^+1_50 AP^+1_75
Not Enforcing Correlation (NMS-free Detectors)
  Sparse R-CNN (Sun et al. 2021b)        37.7  55.8  40.5  37.5   30.1    55.8     28.9      48.6    55.8     52.7
  DETR (Carion et al. 2020)              40.1  60.6  42.0  47.0   32.9    60.6     30.6      51.9    60.6     55.8
Not Enforcing Correlation (NMS-based Detectors)
  ATSS w. AP Loss (Chen et al. 2020)     38.1  58.2  41.0  39.4   30.0    58.2     26.6      48.5    58.2     54.0
  ATSS w. Focal Loss (Lin et al. 2020)   38.7  57.6  41.5  40.3   30.2    57.6     27.3      48.7    57.6     53.6
Using Aux. Head
  ATSS w. ctr. head (Zhang et al. 2020)  39.3  57.4  42.5  42.5   30.2    57.4     27.6      48.7    57.4     53.5
Using Novel Loss
  ATSS w. aLRP Loss (Oksuz et al. 2020)  37.7  57.4  39.9  42.0   29.1    57.4     25.0      47.8    57.4     52.7
  ATSS w. QFL (Li et al. 2020)           39.7  58.1  42.7  45.7   30.6    58.1     27.7      49.1    58.1     53.9
  ATSS w. RS Loss (Oksuz et al. 2021a)   39.9  58.9  42.6  43.2   31.1    58.9     28.1      49.8    58.9     54.8

Table 2: Evaluation of detectors wrt. class-level correlation. See Eq. 2 for βcls. AP^+1_IoU and AP^-1_IoU denote the upper and lower bound APs (see the analysis setup for details). Values are in %. NMS-free detectors can also benefit from class-level correlation (compare AP^+1_C with AP_C for Sparse R-CNN), and, as with βimg (cf. Table 1 and its caption), βcls measures the correlation consistently. AP^+1_50 = AP^-1_50 = AP50 since only modifying the TPs validated at IoU=0.50 does not affect AP50 (see Fig.
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 4(b) for an example).' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' (4) While significantly important, improving correlation may not always imply performance improvement.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' For exam- ple, aLRP Loss in Table 1 has the largest correlation but the lowest APC.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Such a situation can arise, for example, when a method does not have good localization performance.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In the extreme case, assume a detector yields perfect βimg, image- level ranking correlation, but the IoUs of all positive exam- ples are less than 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='50 implying no TP at all.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Hence, boost- ing the correlation, while simultaneously preserving a good performance in each branch, is critical.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 4 Correlation Loss: A Novel Loss Function for Object Detection Correlation (Corr.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=') Loss is a simple plug-in loss function to improve correlation of classification and localization tasks.' 
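The caveat in Observation (4) can be reproduced with toy numbers. The snippet below computes the Spearman rank correlation in plain Python (no-ties case) on illustrative, hypothetical score/IoU values: the ranking correlation is perfect, yet every IoU falls below the 0.50 threshold, so AP at IoU=0.50 would be zero.

```python
def spearman(x, y):
    """Spearman rank correlation (no-ties case): Pearson on the rank vectors."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rk, i in enumerate(order, start=1):
            r[i] = rk
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    m = (n + 1) / 2  # mean of the ranks 1..n
    num = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    den = sum((a - m) ** 2 for a in rx)  # equals the rank variance sum for both vectors
    return num / den

# Illustrative values only: perfect score/IoU ranking correlation ...
ious   = [0.10, 0.25, 0.40, 0.45]   # ... but all IoUs below the 0.50 TP threshold
scores = [0.20, 0.35, 0.70, 0.90]   # ordered exactly like the IoUs
assert spearman(ious, scores) == 1.0
assert sum(iou >= 0.50 for iou in ious) == 0  # no TPs, so AP at IoU=0.50 is zero
```

This is exactly why correlation must be boosted while the localization quality itself is preserved.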
Correlation Loss is unique in that it can be easily incorporated into any object detector, whether NMS-based or NMS-free (see Observation (2), Sec. 3.2), and it improves performance without affecting the model size or inference time, with only a negligible effect on training time (Sec. 5.4). Furthermore, from a fundamental perspective, Corr. Loss can supervise both the classification and localization heads for a better correlation, while existing methods generally focus on a single head, such as classification (Fig. 1).

Definition.
Given an object detector with loss function LOD, our Correlation Loss (Lcorr) is simply added using a weighting hyper-parameter λcorr:

    LOD + λcorr Lcorr.  (3)

Lcorr is the Correlation Loss, defined as

    Lcorr = 1 − ρ(ÎoU, ŝ),  (4)

where ρ(·, ·) is a correlation coefficient, and ŝ and ÎoU are the confidence scores of the GT class and the IoUs of the predicted BBs pertaining to the positive examples in the batch.

Practical Usage. To avoid promoting trivial cases with high correlation but low performance (Observation (4), Sec. 3.2), similar to QFL (Li et al. 2020) and RS Loss (Oksuz et al. 2021a), we only use the gradients of Lcorr wrt. the classification score, i.e., we backpropagate the gradients through only the classifier. We mainly adopt two different correlation coefficients for ρ(·, ·) and obtain two versions of Correlation Loss: (i) Concordance Loss, defined as the Correlation Loss when the concordance correlation coefficient is optimized (ρ(·, ·) = γ(·, ·)), which aims to match the confidence scores with the IoUs; and (ii) Spearman Loss, the Correlation Loss when the Spearman correlation coefficient is optimized (ρ(·, ·) = β(·, ·)), thereby enforcing the ranking of the classification scores to follow the IoUs. To tackle the non-differentiability of the ranking operation while computing Spearman Loss, we leverage the differentiable sorting operation of Blondel et al. (Blondel et al. 2020). When applying our Correlation Loss to NMS-free methods, which use an iterative multi-stage loss function, we incorporate Lcorr into every stage.

Method                             APC↑  AP50↑ AP75↑ oLRP↓
NMS-based
RetinaNet (Lin et al. 2020)        36.5  55.4  39.1  70.7
 w. Conc.Corr (Ours)               37.0  55.7  39.7  70.2
 w. Spear.Corr (Ours)              37.5  55.4  40.5  69.7
FoveaBox (Kong et al. 2020)        36.4  56.5  38.6  70.2
 w. Conc.Corr (Ours)               37.1  56.4  39.6  69.7
 w. Spear.Corr (Ours)              37.0  55.6  39.3  70.0
ATSS (Zhang et al. 2020)           38.7  57.6  41.5  69.0
 w. Conc.Corr (Ours)               39.8  57.9  43.2  68.2
 w. Spear.Corr (Ours)              39.3  56.6  42.5  68.7
PAA (Kim and Lee 2020)             39.9  57.3  43.4  68.6
 w. Conc.Corr (Ours)               40.7  58.8  44.3  67.7
 w. Spear.Corr (Ours)              40.4  58.0  43.7  67.8
NMS-free
Sparse R-CNN (Sun et al. 2021b)    37.7  55.8  40.5  69.5
 w. Conc.Corr (Ours)               38.9  57.2  41.8  68.1
 w. Spear.Corr (Ours)              39.3  56.7  42.5  68.3

Table 3: Comparison on detectors not considering correlation. Accordingly, we remove the aux. heads from ATSS (Zhang et al. 2020) and PAA (Kim and Lee 2020) for a fair comparison (see Table 6 for a comparison with aux. heads and novel loss functions). We use ResNet-50 and train the models for 12 epochs. Simply incorporating our Corr. Loss provides (i) a consistent ∼1 APC improvement for NMS-based detectors and (ii) ∼1.5 APC on the NMS-free Sparse R-CNN.
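Returning to Eq. 4, the two loss variants can be sketched in plain Python as below. This is an illustrative implementation, not the paper's training code: it uses hard ranks for the Spearman variant (during training these would be replaced by the differentiable soft ranks of Blondel et al. 2020, with gradients flowing only into the scores), and it assumes tie-free inputs.

```python
def _ranks(v):
    """Rank transform, 1-based; assumes no ties for simplicity."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def _pearson(x, y):
    """Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    return cov / (vx ** 0.5 * vy ** 0.5)

def concordance_loss(ious, scores):
    """Eq. 4 with rho = concordance correlation coefficient:
    gamma = 2*cov / (var_x + var_y + (mean_x - mean_y)^2),
    which penalizes both decorrelation and mean/scale mismatch."""
    n = len(ious)
    mx, my = sum(ious) / n, sum(scores) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(ious, scores)) / n
    vx = sum((a - mx) ** 2 for a in ious) / n
    vy = sum((b - my) ** 2 for b in scores) / n
    return 1.0 - 2.0 * cov / (vx + vy + (mx - my) ** 2)

def spearman_loss(ious, scores):
    """Eq. 4 with rho = Spearman coefficient (Pearson on the rank vectors).
    Hard ranks are non-differentiable; training uses soft ranks instead."""
    return 1.0 - _pearson(_ranks(ious), _ranks(scores))
```

Note the design difference the definitions imply: the concordance variant is zero only when scores match the IoUs in value, while the Spearman variant is zero whenever the scores merely order the positives like their IoUs.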
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 5 Experimental Evaluation We evaluate Corr.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Loss on (i) the COCO dataset with five different object detectors of various types (Sparse R-CNN as NMS-free, FoveaBox as anchor-free, RetinaNet as anchor- based, ATSS and PAA using auxiliary head), and one in- stance segmentation method, YOLACT;' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and (ii) an addi- tional dataset (Cityscapes) for the method with the largest gain, i.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='e.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=', Sparse R-CNN.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 5.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='1 Comparison with Methods Not Considering Correlation We train these five object detectors and the instance segmen- tation method (Tables 3 and 5) with and without our Corr.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Loss (as Concordance Loss or Spearman Loss).' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' NMS-based Detectors.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Table 3 suggests ∼ 1.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='0APC gain on NMS-based detectors: (i) Spearman Loss (λcorr = 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='1) improves RetinaNet by 1.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='0APC and oLRP, (ii) Concor- dance Loss (λcorr = 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='2) enhances anchor-free FoveaBox by 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='7APC, and (iii) Concordance Loss (λcorr = 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='3) im- proves ATSS and PAA by ∼ 1APC and ∼ 1oLRP.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' NMS-free Detectors.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Our results in Table 3 suggest that Sparse R-CNN, an NMS-free method, can also benefit from Method AP AP50 AP75 Sparse R-CNN 39.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='0 63.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='1 37.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='6 w.' 
Spear. Corr (Ours)            40.8   64.4   40.8

Table 4: Results on Cityscapes dataset.

Method                        APmask_C   APmask_50   APmask_75
YOLACT (Bolya et al. 2019)      28.3       47.8        28.8
w. Conc. Corr (Ours)            28.8       48.3        29.6
w. Spear. Corr (Ours)           29.0       48.3        30.0

Table 5: Comparison with YOLACT.

Our Corr. Loss: (i) Both Concordance (λcorr = 0.3) and Spearman (λcorr = 0.2) Losses improve the baseline; (ii) Spearman Loss improves APC significantly, by up to 1.6; (iii) as hypothesized, the gains are owing to APs with larger IoUs, e.g., AP75 improves by up to 2.0; and (iv) the gains persist in the stronger setting of Sparse R-CNN (Appendix).

Cityscapes dataset. To see the effect of Corr. Loss over different scenarios, we train Sparse R-CNN with Spearman Loss (the model with the best gain over the baseline in Table 3) on the Cityscapes dataset (Cordts et al. 2016) (λcorr = 0.6), a dataset for autonomous-driving object detection.
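In all of these experiments, Corr. Loss is used as a plug-in term weighted by λcorr on top of the standard detection losses. The Concordance variant can be sketched as below; this is a minimal NumPy illustration of the general idea (one minus Lin's concordance correlation coefficient between classification scores and IoUs), with function names of our own choosing, not the authors' implementation, which operates on the detector's tensors so that gradients flow through both heads:

```python
import numpy as np

def concordance_corr(scores, ious):
    """Lin's concordance correlation coefficient between the
    classification scores and the IoUs of positive predictions."""
    s = np.asarray(scores, dtype=float)
    u = np.asarray(ious, dtype=float)
    cov = ((s - s.mean()) * (u - u.mean())).mean()
    return 2.0 * cov / (s.var() + u.var() + (s.mean() - u.mean()) ** 2)

def correlation_loss(scores, ious, lam=0.6):
    # 1 - CCC lies in [0, 2] and is 0 only when scores and IoUs agree
    # perfectly; `lam` plays the role of lambda_corr in the text, so the
    # total objective would be L_cls + L_box + correlation_loss(...).
    return lam * (1.0 - concordance_corr(scores, ious))
```

The coefficient is a smooth function of the inputs, which is what makes it usable as a direct training objective, unlike rank-based quantities that need a differentiable surrogate.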
Table 4 shows that (i) Spearman Loss also improves the baseline Sparse R-CNN on Cityscapes, by 1.8 AP, and (ii) our gain mainly originates from APs with larger IoUs, i.e., AP75 improves by more than 3 points, from 37.6 to 40.8.

Instance Segmentation. We train YOLACT (Bolya et al. 2019), an instance segmentation method, with Corr. Loss and observe a 0.7 mask AP gain using Spearman Loss (λcorr = 0.5; Table 5), implying a 1.7% relative gain.

5.2 Comparison with Methods Enforcing Correlation
Table 6 compares Corr. Loss with using an aux. head (Zhang et al. 2020), QFL (Li et al. 2020) and RS Loss (Oksuz et al. 2021a) on the common ATSS baseline (Zhang et al. 2020) wrt. detection and correlation performance:

Detection Performance.
Reaching 39.8 APC without an aux. head, Concordance Loss (Table 6) outperforms using an aux. head, which introduces additional learnable parameters (39.8 vs. 39.3 APC), and reaches on-par performance with the recently proposed, relatively complicated loss functions QFL (Li et al. 2020) and RS Loss (Oksuz et al. 2021a).

Aux.  QFL  RS  Ours    APC   AP50  AP75  oLRP↓  βimg↑  βcls↑
                       38.7  57.6  41.5  68.9   27.3   40.3
 ✓                     39.3  57.5  42.6  68.6   28.7   42.5
      ✓                39.7  58.1  42.7  68.0   33.2   45.7
           ✓           39.9  58.9  42.6  67.9   30.9   43.2
                ✓      39.8  57.6  43.1  68.2   31.6   45.2
 ✓              ✓      40.0  58.0  43.3  68.0   31.1   44.8
      ✓         ✓      39.9  58.2  43.2  67.7   34.6   45.6
           ✓    ✓      40.2  58.6  43.5  67.9   33.6   46.1

Table 6: Comparison with methods enforcing correlation. Corr. Loss (i) reaches similar results with existing methods on ATSS, (ii) is complementary to those methods thanks to its simple design, and (iii) once combined with RS Loss, outperforms the compared methods.

Method                               Backbone         Epochs  APC   AP50  AP75  APS   APM   APL   Venue
NMS-based
ATSS (Zhang et al. 2020)             ResNet-101-DCN     24    46.3  64.7  50.4  27.7  49.8  58.4  CVPR 2020
GFLv2 (Li et al. 2019)               ResNet-101-DCN     24    48.3  66.5  52.8  28.8  51.9  60.7  CVPR 2021
aLRP Loss (Oksuz et al. 2020)        ResNeXt-101-DCN   100    48.9  69.3  52.5  30.8  51.5  62.1  NeurIPS 2020
VFNet (Zhang et al. 2021)            ResNet-101-DCN     24    49.2  67.5  53.7  29.7  52.6  62.4  CVPR 2021
DW (Li et al. 2022)                  ResNet-101-DCN     24    49.3  67.6  53.3  29.2  52.2  63.5  CVPR 2022
TOOD (Feng et al. 2021)              ResNet-101-DCN     24    49.6  67.4  54.1  30.5  52.7  62.4  ICCV 2021
RS-Mask R-CNN+ (Oksuz et al. 2021a)  ResNeXt-101-DCN    36    50.2  70.3  54.8  31.5  53.5  63.9  ICCV 2021
NMS-free
TSP R-CNN (Sun et al. 2021c)         ResNet-101-DCN     96    47.4  66.7  51.9  29.0  49.7  59.1  ICCV 2021
Sparse R-CNN (Sun et al. 2021b)      ResNeXt-101-DCN    36    48.9  68.3  53.4  29.9  50.9  62.4  CVPR 2021
Dynamic DETR (Dai et al. 2021)       ResNeXt-101-DCN    36    49.3  68.4  53.6  30.3  51.6  62.5  ICCV 2021
Deformable DETR (Zhu et al. 2021)    ResNeXt-101-DCN    50    50.1  69.7  54.6  30.6  52.8  64.7  ICLR 2021
Ours
Corr-Sparse R-CNN                    ResNet-101-DCN     36    49.6  67.8  54.1  29.2  52.3  64.9
Corr-Sparse R-CNN                    ResNeXt-101-DCN    36    51.0  69.2  55.7  31.1  53.7  66.3

Table 7: SOTA comparison on COCO test-dev. Our Corr-Sparse R-CNN (i) performs on par with or better than recent NMS-based methods, all of which also enforce correlation, and (ii) outperforms NMS-free methods by a notable margin. Results are obtained from the papers.

Besides, owing to its simple usage, Concordance Loss is complementary to existing methods: It yields 40.0 APC with an aux.
head (+0.7 APC) and 40.2 APC with RS Loss (+0.3 APC), without introducing additional learnable parameters.

Correlation Analysis. To provide insight, we report βimg (Eq. 1) and βcls (Eq. 2) in Table 6: Our Concordance Loss (i) improves the baseline correlation significantly, enhancing both βimg (from 27.3% to 31.6%) and βcls (from 40.3% to 45.2%) by ∼5%, and (ii) results in better correlation than all other methods wrt. βimg and βcls once combined with QFL and RS Loss, respectively. This set of results confirms that Concordance Loss improves the correlation between the classification and localization tasks at both the image level and the class level.

5.3 Comparison with SOTA
Here, we prefer Sparse R-CNN owing to its competitive detection performance and our large gains on it. We train our "Corr-Sparse R-CNN" for 36 epochs with DCNv2 (Zhu et al. 2019) and multiscale training by randomly resizing the shorter side within [480, 960], similar to common practice (Oksuz et al. 2021a; Zhang et al. 2021; Sun et al. 2021b). Table 7 presents the results on COCO test-dev (Lin et al. 2014):

NMS-based Methods. On the common ResNet-101-DCN backbone and with similar data augmentation, our Corr-Sparse R-CNN yields 49.6 APC at 13.7 fps (on a V100 GPU), outperforming recent NMS-based methods, all of which also enforce correlation, e.g., (i) RS-R-CNN (Oksuz et al. 2021a) by 1.8 APC, (ii) GFLv2 (Li et al. 2019) by more than 1 APC, and (iii) VFNet (Zhang et al. 2021) in terms of not only APC but also efficiency (12.6 fps on a V100 GPU). On ResNeXt-101-DCN, our Corr-Sparse R-CNN provides 51.0 APC at 6.8 fps, surpassing all methods including RS-Mask R-CNN+ (50.2 APC at 6.4 fps), which additionally uses masks and Carafe FPN (Wang et al. 2019).

NMS-free Methods.
NMS-free Methods. Our Corr-Sparse R-CNN outperforms (i) TSP R-CNN (Sun et al. 2021c) by more than 2 APC on ResNet-101-DCN with significantly less training, and (ii) Sparse R-CNN (Sun et al. 2021b) by ∼2 APC and Deformable DETR (Zhu et al. 2021), a recent strong NMS-free method, by ∼1 APC on ResNeXt-101-DCN.

5.4 Ablation & Hyper-parameter Analyses

Optimizing Different Correlation Coefficients. Spearman Loss yields better localization performance, i.e., the lowest localization error wrt. oLRPLoc, in all experiments, while it rarely yields the best oLRPFP or oLRPFN, implying that its contribution to classification is weaker than that of Concordance Loss (see Appendix for the components of oLRP). We also tried the Pearson Correlation Coefficient on ATSS and Sparse R-CNN, but it performed worse than either Spearman or Concordance (Appendix).

Backpropagating Through Different Heads. On Sparse R-CNN, we observed that performance degrades when we backpropagate either only through the localization head (37.5 AP) or through both heads (38.9 AP). Hence, we preferred backpropagating the gradients only through the classification head (39.3 AP).
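As a concrete illustration of the coefficients compared in this ablation, the sketch below computes Pearson, Spearman, and Concordance correlation between a detector's classification scores and the IoUs of its boxes, and turns a coefficient into a loss as 1 − ρ. The function names and the 1 − ρ form are our illustrative assumptions, not the paper's exact implementation; in particular, the hard ranks used here for Spearman are non-differentiable, and a training-time version would need a soft/differentiable ranking (cf. Blondel et al. 2020 in the references). Per the ablation above, gradients would be propagated only through the classification scores, with the IoUs detached.

```python
import math

def pearson(xs, ys):
    # Pearson correlation between two equal-length sequences.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)

def concordance(xs, ys):
    # Lin's concordance correlation coefficient (CCC): penalizes
    # both decorrelation and mean/scale disagreement.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

def spearman(xs, ys):
    # Spearman = Pearson on the ranks (hard ranks shown here; a
    # differentiable sorting scheme is needed inside a loss).
    rank = lambda vs: [sorted(vs).index(v) for v in vs]
    return pearson(rank(xs), rank(ys))

def correlation_loss(scores, ious, coeff=concordance):
    # Illustrative auxiliary loss: 1 - correlation, minimized when
    # classification scores and IoUs agree.
    return 1.0 - coeff(scores, ious)
```

For example, scores [0.9, 0.5, 0.2] with IoUs [0.8, 0.6, 0.1] are perfectly rank-correlated (Spearman = 1) but not perfectly concordant, so the concordance-based loss stays slightly above zero, pushing the scores toward the IoU values themselves.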
Effect on Training Time. Using Spearman or Concordance Loss to train Sparse R-CNN, which computes the loss six times in each iteration, increases the iteration time from 0.50 sec to 0.51 sec on V100 GPUs, suggesting a negligible overhead.

Sensitivity to λcorr. We found it sufficient to search over {0.1, 0.2, 0.3, 0.4, 0.5, 0.6} to tune λcorr.
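The role of λcorr is simply that of an auxiliary-term weight in the training objective; a minimal sketch, with illustrative argument names, is:

```python
def total_loss(cls_loss, loc_loss, corr_loss, lambda_corr=0.3):
    # Correlation Loss enters as a plug-in auxiliary term weighted by
    # lambda_corr (tuned over {0.1, ..., 0.6}); the classification and
    # localization losses keep their usual weighting. Illustrative sketch.
    return cls_loss + loc_loss + lambda_corr * corr_loss
```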
The Appendix presents the empirical results of this grid search.

5.5 Additional Material

This paper is accompanied by an Appendix containing (i) the effect of Corr. Loss on Sparse R-CNN using its stronger setting, (ii) the components of oLRP for the detectors in Table 3, (iii) results when the Pearson Correlation Coefficient is optimized, and (iv) our grid search to tune λcorr.

6 Conclusion

In this paper, we defined measures to evaluate object detectors wrt. correlation, provided analyses of several methods, and proposed Correlation Loss as an auxiliary loss function to enforce correlation for object detectors. Our extensive experiments on six detectors show that Correlation Loss consistently improves the detection and correlation performances, and reaches SOTA results.
Acknowledgments

This work was supported by the Scientific and Technological Research Council of Turkey (TÜBİTAK) (under grant 120E494). We also gratefully acknowledge the computational resources kindly provided by TÜBİTAK ULAKBIM High Performance and Grid Computing Center (TRUBA) and METU Robotics and Artificial Intelligence Center (ROMER). Dr. Akbas is supported by the "Young Scientist Awards Program (BAGEP)" of Science Academy, Turkey.

References

Blondel, M.; Teboul, O.; Berthet, Q.; and Djolonga, J. 2020. Fast differentiable sorting and ranking. In International Conference on Machine Learning (ICML).
Bolya, D.; Zhou, C.; Xiao, F.; and Lee, Y. J. 2019. YOLACT: Real-time Instance Segmentation. In IEEE/CVF International Conference on Computer Vision (ICCV).
Carion, N.; Massa, F.; Synnaeve, G.; Usunier, N.; Kirillov, A.; and Zagoruyko, S. 2020. End-to-End Object Detection with Transformers. In European Conference on Computer Vision (ECCV).
Chen, K.; Lin, W.; Li, J.; See, J.; Wang, J.; and Zou, J. 2020. AP-Loss for Accurate One-Stage Object Detection. IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 1–1.
Chen, K.; Wang, J.; Pang, J.; Cao, Y.; Xiong, Y.; Li, X.; Sun, S.; Feng, W.; Liu, Z.; Xu, J.; Zhang, Z.; Cheng, D.; Zhu, C.; Cheng, T.; Zhao, Q.; Li, B.; Lu, X.; Zhu, R.; Wu, Y.; Dai, J.; Wang, J.; Shi, J.; Ouyang, W.; Loy, C. C.; and Lin, D. 2019. MMDetection: Open MMLab Detection Toolbox and Benchmark. arXiv, 1906.07155.
Cordts, M.; Omran, M.; Ramos, S.; Rehfeld, T.; Enzweiler, M.; Benenson, R.; Franke, U.; Roth, S.; and Schiele, B. 2016. The Cityscapes Dataset for Semantic Urban Scene Understanding. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
Dai, X.; Chen, Y.; Yang, J.; Zhang, P.; Yuan, L.; and Zhang, L. 2021. Dynamic DETR: End-to-End Object Detection With Dynamic Attention. In IEEE/CVF International Conference on Computer Vision (ICCV).
Feng, C.; Zhong, Y.; Gao, Y.; Scott, M. R.; and Huang, W. 2021. TOOD: Task-aligned One-stage Object Detection. In IEEE/CVF International Conference on Computer Vision (ICCV).
He, K.; Gkioxari, G.; Dollar, P.; and Girshick, R. 2017. Mask R-CNN. In IEEE/CVF International Conference on Computer Vision (ICCV).
He, Y.; Zhu, C.; Wang, J.; Savvides, M.; and Zhang, X. 2019. Bounding Box Regression With Uncertainty for Accurate Object Detection. In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
Huang, Z.; Huang, L.
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Gong, Y.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Huang, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Wang, X.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2019.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Mask Scoring R-CNN.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Jiang, B.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Luo, R.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Mao, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Xiao, T.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Jiang, Y.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2018.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Acquisition of Localization Confidence for Accurate Object Detection.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In The European Conference on Computer Vision (ECCV).' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Kim, K.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Lee, H.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' S.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2020.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Probabilistic Anchor As- signment with IoU Prediction for Object Detection.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In The European Conference on Computer Vision (ECCV).' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Kong, T.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Sun, F.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Liu, H.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Jiang, Y.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Li, L.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Shi, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2020.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' FoveaBox: Beyound Anchor-Based Object Detection.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' IEEE Transactions on Image Processing, 29: 7389–7398.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Law, H.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Deng, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2018.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' CornerNet: Detecting Objects as Paired Keypoints.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In The European Conference on Com- puter Vision (ECCV).' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Li, S.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' He, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Li, R.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Zhang, L.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2022.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' A Dual Weight- ing Label Assignment Scheme for Object Detection.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Li, X.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Wang, W.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Hu, X.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Li, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Tang, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Yang, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2019.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Generalized Focal Loss V2: Learning Reliable Lo- calization Quality Estimation for Dense Object Detection.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Li, X.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Wang, W.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Wu, L.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Chen, S.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Hu, X.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Li, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Tang, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Yang, J.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2020.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Generalized Focal Loss: Learning Qualified and Distributed Bounding Boxes for Dense Object Detection.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In Advances in Neural Information Processing Systems (NeurIPS).' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Lin, T.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Doll´ar, P.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Girshick, R.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' B.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' He, K.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Hariharan, B.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Belongie, S.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2017.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Feature Pyramid Networks for Object Detection.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Lin, T.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='-Y.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' ;' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Goyal, P.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Girshick, R.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' He, K.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Doll´ar, P.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2020.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Focal Loss for Dense Object Detection.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 42(2): 318–327.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Lin, T.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content='-Y.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' ;' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Maire, M.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Belongie, S.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Hays, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Perona, P.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Ra- manan, D.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Doll´ar, P.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Zitnick, C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' L.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2014.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Microsoft COCO: Common Objects in Context.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In The European Con- ference on Computer Vision (ECCV).' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Liu, J.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Li, D.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Zheng, R.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Tian, L.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Shan, Y.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2021.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' RankDetNet: Delving Into Ranking Constraints for Object Detection.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 264–273.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Oksuz, K.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Cam, B.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Akbas, E.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Kalkan, S.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2020.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' A Ranking-based, Balanced Loss Function Unifying Classifi- cation and Localisation in Object Detection.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In Advances in Neural Information Processing Systems (NeurIPS).' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Oksuz, K.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Cam, B.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' C.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Akbas, E.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=';' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' and Kalkan, S.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' 2021a.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Rank & Sort Loss for Object Detection and Instance Seg- mentation.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' In The International Conference on Computer Vision (ICCV).' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/ANAzT4oBgHgl3EQfF_sv/content/2301.01019v1.pdf'} +page_content=' Oksuz, K.' 
; Cam, B. C.; Kalkan, S.; and Akbas, E. 2021b. One Metric to Measure them All: Localisation Recall Precision (LRP) for Evaluating Visual Detection Tasks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1–1.
Ren, S.; He, K.; Girshick, R.; and Sun, J. 2017. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 39(6): 1137–1149.
Roh, B.; Shin, J.; Shin, W.; and Kim, S. 2022. Sparse DETR: Efficient End-to-End Object Detection with Learnable Sparsity. In The International Conference on Learning Representations (ICLR).
Sun, P.; Jiang, Y.; Xie, E.; Shao, W.; Yuan, Z.; Wang, C.; and Luo, P. 2021a. What Makes for End-to-End Object Detection? In International Conference on Machine Learning (ICML).
Sun, P.; Zhang, R.; Jiang, Y.; Kong, T.; Xu, C.; Zhan, W.; Tomizuka, M.; Li, L.; Yuan, Z.; Wang, C.; and Luo, P. 2021b. Sparse R-CNN: End-to-End Object Detection with Learnable Proposals. In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
Sun, Z.; Cao, S.; Yang, Y.; and Kitani, K. M. 2021c. Rethinking Transformer-Based Set Prediction for Object Detection. In IEEE/CVF International Conference on Computer Vision (ICCV).
Tian, Z.; Shen, C.; Chen, H.; and He, T. 2019. FCOS: Fully Convolutional One-Stage Object Detection. In IEEE/CVF International Conference on Computer Vision (ICCV).
Wang, J.; Chen, K.; Xu, R.; Liu, Z.; Loy, C. C.; and Lin, D. 2019. CARAFE: Content-Aware ReAssembly of FEatures. In IEEE/CVF International Conference on Computer Vision (ICCV).
Zhang, H.; Wang, Y.; Dayoub, F.; and Sünderhauf, N. 2021. VarifocalNet: An IoU-aware Dense Object Detector. In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
Zhang, S.; Chi, C.; Yao, Y.; Lei, Z.; and Li, S. Z. 2020. Bridging the Gap Between Anchor-Based and Anchor-Free Detection via Adaptive Training Sample Selection. In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
Zhang, X.; Wan, F.; Liu, C.; Ji, R.; and Ye, Q. 2019. FreeAnchor: Learning to Match Anchors for Visual Object Detection. In Advances in Neural Information Processing Systems (NeurIPS).
Zhu, X.; Hu, H.; Lin, S.; and Dai, J. 2019. Deformable ConvNets V2: More Deformable, Better Results. In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
Zhu, X.; Su, W.; Lu, L.; Li, B.; Wang, X.; and Dai, J. 2021. Deformable DETR: Deformable Transformers for End-to-End Object Detection. In International Conference on Learning Representations (ICLR).

APPENDIX

Sensitivity to λcorr. In Table A.8, we see that (i) λcorr = 0.2 provides the best performance overall, (ii) the performance is not very sensitive to λcorr, and (iii) a grid search over {0.1, 0.2, 0.3, 0.4, 0.5, 0.6} is sufficient (outside of this range, performance drops).
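As a point of reference for how λcorr enters training, the overall objective is a weighted sum of the standard detection losses and the correlation term; below is a minimal sketch of that combination (the function and argument names are ours for illustration, not the authors' implementation):

```python
def detection_loss(cls_loss, loc_loss, corr_loss, lambda_corr=0.2):
    """Weighted sum of classification, localization, and correlation losses.

    lambda_corr is the weight tuned by the grid search above; 0.2 is the
    value reported to work best overall.
    """
    return cls_loss + loc_loss + lambda_corr * corr_loss
```

Setting lambda_corr = 0.0 recovers the baseline without Correlation Loss, matching the 0.0 column of Table A.8.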
Method         Dataset     0.0   0.1   0.2   0.3   0.4   0.5   0.6
ATSS           COCO        38.7  38.8  39.3  39.8  39.7  39.7  39.6
YOLACT         COCO        28.3  28.6  28.8  28.8  29.0  28.8  28.6
Sparse R-CNN   COCO        37.7  38.7  39.3  39.1  39.0  38.1  38.0
Sparse R-CNN   Cityscapes  39.0  39.0  38.3  39.9  40.0  40.0  40.8

Table A.8: Grid search to tune λcorr on different models. We present the results with the concordance correlation coefficient for ATSS and YOLACT, and with the Spearman correlation coefficient for the Sparse R-CNN models. 0.0 corresponds to not including our Correlation Loss.

The effect of Corr. Loss on Sparse R-CNN using its stronger setting. Following Sun et al. (Sun et al. 2021b), we train Sparse R-CNN with 36 epochs of training, 300 proposals, multi-scale training, and random cropping (Table A.9). Table A.9 shows that the improvement of our Spearman Loss on this strong baseline is ∼1 AP point.

Method                            AP    AP50  AP75
Sparse R-CNN (Sun et al. 2021b)   45.0  64.1  48.9
w. Conc. Corr (Ours)              45.5  64.4  49.7
w. Spear. Corr (Ours)             46.1  64.0  50.4

Table A.9: Comparison with the stronger Sparse R-CNN.
Using Pearson Correlation Coefficient. We also tried optimizing the Pearson correlation coefficient and observed that, while it performs similarly to the concordance correlation coefficient on ATSS and to the Spearman correlation coefficient on Sparse R-CNN, it does not outperform the other two in either case (Table A.10). Considering the similarities of the Pearson and concordance correlation coefficients in terms of how they score the relation between the values, we preferred the concordance correlation coefficient because it enforces the scores to be equal to the IoUs, imposing a tighter constraint than the Pearson correlation coefficient.

The components of oLRP. Table A.11 shows the components of oLRP for the different detectors, corresponding to Table 3 in the paper.
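To make the distinction between the three coefficients concrete, the following self-contained Python sketch (not the training code; during training the coefficients are optimized in differentiable form) compares them on a toy set of classification scores and IoUs. When the scores track the IoUs perfectly but with a constant offset, Pearson and Spearman still report a perfect correlation, while Lin's concordance coefficient drops; this is the "tighter constraint" referred to above. The variable names and toy numbers are illustrative, not from the paper.

```python
def _mean(v):
    return sum(v) / len(v)

def pearson(x, y):
    # Pearson: invariant to any affine rescaling of x or y.
    mx, my = _mean(x), _mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def concordance(x, y):
    # Lin's concordance correlation coefficient: additionally
    # penalizes any shift or scale mismatch between x and y.
    mx, my = _mean(x), _mean(y)
    n = len(x)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

def _ranks(v):
    # Plain ranking; no tie handling in this toy sketch.
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(x, y):
    # Spearman: Pearson correlation of the ranks, so it only
    # rewards a consistent ordering.
    return pearson(_ranks(x), _ranks(y))

ious   = [0.55, 0.70, 0.80, 0.90]  # localization qualities
scores = [0.35, 0.50, 0.60, 0.70]  # classification scores, offset by -0.2

print(pearson(scores, ious))      # ~1: relation is perfectly linear
print(spearman(scores, ious))     # ~1: ordering is identical
print(concordance(scores, ious))  # well below 1: the offset is penalized
```

Because concordance is maximized only when the scores equal the IoUs themselves, optimizing it pushes the classifier to output calibrated localization quality, whereas Spearman only enforces the ranking.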
As discussed in the paper, Spearman Loss yields better localization performance, i.e., the lowest localization error wrt. oLRPLoc in all experiments, while it rarely yields the best oLRPFP or oLRPFN, implying that its contribution to classification is weaker than that of Concordance Loss.

Method             APC   AP50  AP75
ATSS w/o aux head  38.7  57.6  41.5
w. Pearson Corr    39.4  56.6  42.7
w. Conc. Corr      39.8  57.9  43.2
w. Spear. Corr     39.3  56.6  42.5
Sparse R-CNN       37.7  55.9  40.5
w. Pearson Corr    39.3  56.6  42.2
w. Conc. Corr      38.9  57.2  41.8
w. Spear. Corr     39.3  56.7  42.5
Table A.10: Effect of using Pearson correlation coefficient.

Method                           APC ↑  AP50 ↑  AP75 ↑  oLRP ↓  oLRPLoc ↓  oLRPFP ↓  oLRPFN ↓
RetinaNet (Lin et al. 2020)      36.5   55.4    39.1    70.7    16.8       32.0      48.1
w. Conc. Corr (Ours)             37.0   55.7    39.7    70.2    16.3       30.8      49.3
w. Spear. Corr (Ours)            37.5   55.4    40.5    69.7    16.0       31.3      48.4
FoveaBox (Kong et al. 2020)      36.4   56.5    38.6    70.2    17.0       30.2      47.2
w. Conc. Corr (Ours)             37.1   56.4    39.6    69.7    16.6       28.6      48.1
w. Spear. Corr (Ours)            37.0   55.6    39.3    70.0    16.3       31.0      47.9
ATSS (Zhang et al. 2020)         38.7   57.6    41.5    69.0    16.0       29.1      47.0
w. Conc. Corr (Ours)             39.8   57.9    43.2    68.2    15.4       29.1      46.9
w. Spear. Corr (Ours)            39.3   56.6    42.5    68.7    15.2       31.2      46.7
PAA (Kim and Lee 2020)           39.9   57.3    43.4    68.6    15.0       30.4      47.0
w. Conc. Corr (Ours)             40.7   58.8    44.3    67.7    15.2       28.5      46.3
w. Spear. Corr (Ours)            40.4   58.0    43.7    67.8    14.9       29.5      46.6
Sparse R-CNN (Sun et al. 2021b)  37.7   55.8    40.5    69.5    16.0       28.7      48.6
w. Conc. Corr (Ours)             38.9   57.2    41.8    68.1    15.7       27.7      47.2
w. Spear. Corr (Ours)            39.3   56.7    42.5    68.3    15.3       27.1      48.4

Table A.11: Components of oLRP for Table 3 in the paper.
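To make the component columns of Table A.11 concrete, here is a schematic reading of the LRP error components (Oksuz et al.): oLRPLoc is the mean 1 − IoU over true positives, oLRPFP is 1 − precision, and oLRPFN is 1 − recall, all reported as percentages. The sketch below is our interpretation for illustration only, not the official evaluation code; the function name and toy numbers are ours.

```python
def lrp_components(tp_ious, num_fp, num_fn):
    """Schematic LRP components, as percentages (cf. Table A.11).

    tp_ious: IoUs of true-positive detections (matched above the
    IoU threshold); num_fp / num_fn: false positive / negative counts.
    """
    num_tp = len(tp_ious)
    loc = sum(1.0 - iou for iou in tp_ious) / num_tp  # mean 1 - IoU over TPs
    fp = num_fp / (num_tp + num_fp)                   # 1 - precision
    fn = num_fn / (num_tp + num_fn)                   # 1 - recall
    return 100.0 * loc, 100.0 * fp, 100.0 * fn

# Toy example: 3 true positives with fairly tight boxes,
# 1 false positive, 2 missed ground-truth objects.
loc, fp, fn = lrp_components([0.85, 0.80, 0.90], num_fp=1, num_fn=2)
print(loc, fp, fn)  # localization, FP and FN errors in percent
```

Under this reading, Spearman Loss lowering oLRPLoc while rarely winning on oLRPFP/oLRPFN means it tightens the boxes of true positives more than it improves the precision/recall trade-off of the classifier.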