Partitioned Saliency Ranking with Dense Pyramid Transformers
- URL: http://arxiv.org/abs/2308.00236v1
- Date: Tue, 1 Aug 2023 02:33:10 GMT
- Title: Partitioned Saliency Ranking with Dense Pyramid Transformers
- Authors: Chengxiao Sun, Yan Xu, Jialun Pei, Haopeng Fang and He Tang
- Abstract summary: Saliency ranking has emerged as a challenging task focused on assessing the degree of saliency at the instance level.
Previous approaches undertake saliency ranking by directly sorting the rank scores of salient instances, which does not explicitly resolve the inherent ambiguities.
We propose the ranking by partition paradigm, which segments unordered salient instances into partitions and then ranks them based on the correlations among these partitions.
- Score: 4.449304130658638
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, saliency ranking has emerged as a challenging task focusing
on assessing the degree of saliency at instance-level. Being subjective, even
humans struggle to identify the precise order of all salient instances.
Previous approaches undertake the saliency ranking by directly sorting the rank
scores of salient instances, which have not explicitly resolved the inherent
ambiguities. To overcome this limitation, we propose the ranking by partition
paradigm, which segments unordered salient instances into partitions and then
ranks them based on the correlations among these partitions. The ranking by
partition paradigm alleviates ranking ambiguities in a general sense, as it
consistently improves the performance of other saliency ranking models.
Additionally, we introduce the Dense Pyramid Transformer (DPT) to enable global
cross-scale interactions, which significantly enhances feature interactions
with reduced computational burden. Extensive experiments demonstrate that our
approach outperforms all existing methods. The code for our method is available
at https://github.com/ssecv/PSR.
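As an illustration only, the sketch below shows the ranking-by-partition idea described in the abstract: instances are grouped into unordered partitions, the partitions are ordered, and each instance inherits its partition's rank. The function name, the partition-score input, and the toy data are assumptions for exposition, not the authors' PSR implementation.

```python
# Hypothetical sketch of "ranking by partition": instances are first assigned
# to unordered partitions, partitions are ordered by a saliency score, and each
# instance inherits the rank of its partition. Names and the partition-scoring
# input are illustrative assumptions, not the authors' PSR code.
import numpy as np

def rank_by_partition(partition_ids, partition_scores):
    """partition_ids:    (N,) partition index of each salient instance
    partition_scores: (P,) one saliency score per partition (higher = more salient)
    returns:          (N,) rank per instance (1 = most salient partition)"""
    order = np.argsort(-np.asarray(partition_scores, dtype=float))  # best partition first
    partition_rank = np.empty(len(order), dtype=int)
    partition_rank[order] = np.arange(1, len(order) + 1)            # rank of each partition
    return partition_rank[np.asarray(partition_ids)]                # instances inherit it

# Toy example: 5 instances grouped into 3 partitions.
print(rank_by_partition([0, 0, 1, 2, 1], [0.2, 0.9, 0.5]))          # -> [3 3 1 2 1]
```

Note that instances in the same partition share a rank, which is how the partition view sidesteps the instance-level ambiguity mentioned above.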
Related papers
- AGRaME: Any-Granularity Ranking with Multi-Vector Embeddings [53.78802457488845]
We introduce the idea of any-granularity ranking, which leverages multi-vector embeddings to rank at varying levels of granularity.
We demonstrate the application of proposition-level ranking to post-hoc citation addition in retrieval-augmented generation.
arXiv Detail & Related papers (2024-05-23T20:04:54Z)
- Rethinking Object Saliency Ranking: A Novel Whole-flow Processing Paradigm [22.038715439842044]
This paper proposes a new paradigm for saliency ranking that focuses entirely on ranking salient objects by their "importance order".
The proposed approach outperforms existing state-of-the-art methods on the widely-used SALICON set.
arXiv Detail & Related papers (2023-12-06T01:51:03Z)
- Found in the Middle: Permutation Self-Consistency Improves Listwise Ranking in Large Language Models [63.714662435555674]
Large language models (LLMs) exhibit positional bias in how they use context.
We propose permutation self-consistency, a form of self-consistency over ranking list outputs of black-box LLMs.
Our approach improves scores from conventional inference by up to 7-18% for GPT-3.5 and 8-16% for LLaMA v2 (70B).
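For context, a simplified sketch of the permutation self-consistency idea follows: rank the same candidates under several random shuffles of the input order, then aggregate the orderings. The Borda-style mean-position aggregation and the `rank_with_llm` callback are placeholders for exposition, not the paper's exact method or API.

```python
# Simplified illustration of permutation self-consistency: query a (black-box)
# ranker on several shuffles of the same candidates, then aggregate the
# returned orderings. Mean-position (Borda-style) aggregation is used here as
# a stand-in; `rank_with_llm` is a placeholder callback.
import random

def permutation_self_consistency(candidates, rank_with_llm, n_shuffles=8, seed=0):
    rng = random.Random(seed)
    total_pos = {c: 0.0 for c in candidates}
    for _ in range(n_shuffles):
        shuffled = list(candidates)
        rng.shuffle(shuffled)                   # break positional bias in the prompt
        ordering = rank_with_llm(shuffled)      # returns the candidates, best first
        for pos, c in enumerate(ordering):
            total_pos[c] += pos
    # lower average position = more consistently ranked near the top
    return sorted(candidates, key=lambda c: total_pos[c])
```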
arXiv Detail & Related papers (2023-10-11T17:59:02Z)
- Bipartite Ranking Fairness through a Model Agnostic Ordering Adjustment [54.179859639868646]
We propose a model agnostic post-processing framework xOrder for achieving fairness in bipartite ranking.
xOrder is compatible with various classification models and ranking fairness metrics, including supervised and unsupervised fairness metrics.
We evaluate our proposed algorithm on four benchmark data sets and two real-world patient electronic health record repositories.
arXiv Detail & Related papers (2023-07-27T07:42:44Z)
- Heuristic Search for Rank Aggregation with Application to Label Ranking [16.275063634853584]
We propose an effective hybrid evolutionary ranking algorithm to solve the rank aggregation problem.
The algorithm features a semantic crossover based on concordant pairs and a late acceptance local search reinforced by an efficient incremental evaluation technique.
Experiments are conducted to assess the algorithm, indicating a highly competitive performance on benchmark instances.
arXiv Detail & Related papers (2022-01-11T11:43:17Z)
- Instance-Level Relative Saliency Ranking with Graph Reasoning [126.09138829920627]
We present a novel unified model to segment salient instances and infer relative saliency rank order.
A novel loss function is also proposed to effectively train the saliency ranking branch.
Experimental results demonstrate that our proposed model is more effective than previous methods.
arXiv Detail & Related papers (2021-07-08T13:10:42Z)
- PiRank: Learning To Rank via Differentiable Sorting [85.28916333414145]
We propose PiRank, a new class of differentiable surrogates for ranking.
We show that PiRank exactly recovers the desired metrics in the limit of zero temperature.
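As a loose illustration of a temperature-controlled differentiable ranking surrogate (PiRank's actual construction builds on a NeuralSort-style relaxation; the pairwise-sigmoid soft rank below is only an assumed stand-in):

```python
# Generic temperature-controlled soft rank (illustration only; not PiRank's
# construction). As temperature -> 0, each soft rank approaches the hard rank,
# so ranking metrics defined on it become exact in that limit.
import numpy as np

def soft_rank(scores, temperature=1.0):
    s = np.asarray(scores, dtype=float)
    diff = s[None, :] - s[:, None]                   # diff[i, j] = s_j - s_i
    wins = 1.0 / (1.0 + np.exp(-diff / temperature)) # ~1 where s_j > s_i
    np.fill_diagonal(wins, 0.0)                      # an item never outranks itself
    return 1.0 + wins.sum(axis=1)                    # ~ 1 + count of higher-scored items

print(soft_rank([0.1, 2.0, 0.5], temperature=0.01))  # ~ [3. 1. 2.]
```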
arXiv Detail & Related papers (2020-12-12T05:07:36Z)
- Pseudo-Convolutional Policy Gradient for Sequence-to-Sequence Lip-Reading [96.48553941812366]
Lip-reading aims to infer the speech content from the lip movement sequence.
The traditional learning process of seq2seq models suffers from two problems.
We propose a novel pseudo-convolutional policy gradient (PCPG) based method to address these two problems.
arXiv Detail & Related papers (2020-03-09T09:12:26Z)
- StochasticRank: Global Optimization of Scale-Free Discrete Functions [28.224889996383396]
In this paper, we introduce a powerful and efficient framework for direct optimization of ranking metrics.
We show that classic smoothing approaches may introduce bias and present a universal solution for a proper debiasing.
Our framework applies to any scale-free discrete loss function.
arXiv Detail & Related papers (2020-03-04T15:27:11Z)