Universal Multi-Domain Translation via Diffusion Routers
- URL: http://arxiv.org/abs/2510.03252v1
- Date: Fri, 26 Sep 2025 07:32:43 GMT
- Title: Universal Multi-Domain Translation via Diffusion Routers
- Authors: Duc Kieu, Kien Do, Tuan Hoang, Thao Minh Le, Tung Kieu, Dang Nguyen, Thin Nguyen
- Abstract summary: Multi-domain translation (MDT) aims to learn translations between multiple domains, yet existing approaches either require fully aligned tuples or can only handle domain pairs seen in training. We introduce universal MDT (UMDT), a generalization of MDT that seeks to translate between any pair of $K$ domains using only $K-1$ datasets paired with a central domain. We propose Diffusion Router (DR), a unified diffusion-based framework that models all central$\leftrightarrow$non-central translations with a single noise predictor.
- Score: 23.85537575452933
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multi-domain translation (MDT) aims to learn translations between multiple domains, yet existing approaches either require fully aligned tuples or can only handle domain pairs seen in training, limiting their practicality and excluding many cross-domain mappings. We introduce universal MDT (UMDT), a generalization of MDT that seeks to translate between any pair of $K$ domains using only $K-1$ paired datasets with a central domain. To tackle this problem, we propose Diffusion Router (DR), a unified diffusion-based framework that models all central$\leftrightarrow$non-central translations with a single noise predictor conditioned on the source and target domain labels. DR enables indirect non-central translations by routing through the central domain. We further introduce a novel scalable learning strategy with a variational-bound objective and an efficient Tweedie refinement procedure to support direct non-central mappings. Through evaluation on three large-scale UMDT benchmarks, DR achieves state-of-the-art results for both indirect and direct translations, while lowering sampling cost and unlocking novel tasks such as sketch$\leftrightarrow$segmentation. These results establish DR as a scalable and versatile framework for universal translation across multiple domains.
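The routing idea in the abstract can be illustrated with a minimal toy sketch: one noise predictor conditioned on (source, target) domain labels serves every central$\leftrightarrow$non-central hop, a Tweedie estimate recovers the clean sample at each step, and a non-central pair is translated indirectly via the central domain. Everything below (the linear stand-in predictor, the schedule, the label conditioning) is a hypothetical illustration, not the authors' model or code.

```python
import numpy as np

T = 10
alpha_bar = np.linspace(0.99, 0.01, T)  # toy noise schedule (hypothetical)

def eps_theta(x_t, t, src, tgt):
    """Stand-in for the shared conditional noise predictor eps(x_t, t, src, tgt).
    A real DR model is a neural net; here a fixed linear map that depends on the
    (src, tgt) label pair stands in, purely for illustration."""
    w = 0.1 * ((len(src) + 2 * len(tgt)) % 5)  # deterministic toy conditioning
    return w * x_t

def tweedie_x0(x_t, t, src, tgt):
    """Tweedie estimate of the clean sample:
    x0 = (x_t - sqrt(1 - abar_t) * eps) / sqrt(abar_t)."""
    ab = alpha_bar[t]
    return (x_t - np.sqrt(1 - ab) * eps_theta(x_t, t, src, tgt)) / np.sqrt(ab)

def ddim_translate(x_src, src, tgt, rng):
    """One central<->non-central hop: noise the source, then denoise with the
    predictor conditioned on the target label (deterministic DDIM-style steps)."""
    x = (np.sqrt(alpha_bar[-1]) * x_src
         + np.sqrt(1 - alpha_bar[-1]) * rng.standard_normal(x_src.shape))
    for t in range(T - 1, 0, -1):
        x0 = tweedie_x0(x, t, src, tgt)
        x = (np.sqrt(alpha_bar[t - 1]) * x0
             + np.sqrt(1 - alpha_bar[t - 1]) * eps_theta(x, t, src, tgt))
    return tweedie_x0(x, 0, src, tgt)

def route(x, src, tgt, central="central", rng=None):
    """Indirect non-central translation: src -> central -> tgt."""
    rng = rng or np.random.default_rng(0)
    if src == central or tgt == central:
        return ddim_translate(x, src, tgt, rng)
    mid = ddim_translate(x, src, central, rng)
    return ddim_translate(mid, central, tgt, rng)

x = np.ones(4)
y = route(x, "sketch", "segmentation")
print(y.shape)  # -> (4,)
```

The paper's direct non-central mappings would replace the two-hop `route` with a single learned pass plus Tweedie refinement; the sketch only shows the indirect routing baseline.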
Related papers
- LADB: Latent Aligned Diffusion Bridges for Semi-Supervised Domain Translation [54.690154688667086]
Diffusion models excel at generating high-quality outputs but face challenges in data-scarce domains. We propose Latent Aligned Diffusion Bridges (LADB), a semi-supervised framework for sample-to-sample translation.
arXiv Detail & Related papers (2025-09-10T14:23:07Z) - Large Language Model for Multi-Domain Translation: Benchmarking and Domain CoT Fine-tuning [55.107329995417786]
Large language models (LLMs) have demonstrated impressive general understanding and generation abilities.
We establish a benchmark for multi-domain translation, featuring 25 German$\Leftrightarrow$English and 22 Chinese$\Leftrightarrow$English test sets.
We propose a domain Chain of Thought (CoT) fine-tuning technique that utilizes the intrinsic multi-domain intelligence of LLMs to improve translation performance.
arXiv Detail & Related papers (2024-10-03T16:15:04Z) - Towards Identifiable Unsupervised Domain Translation: A Diversified Distribution Matching Approach [17.561012410096833]
Unsupervised domain translation (UDT) aims to find functions that convert samples from one domain to another without changing the high-level semantic meaning. This study delves into the core identifiability inquiry and introduces an MPA elimination theory. Our theory leads to a UDT learner using distribution matching over auxiliary variable-induced subsets of the domains.
arXiv Detail & Related papers (2024-01-18T01:07:00Z) - Multiple Noises in Diffusion Model for Semi-Supervised Multi-Domain Translation [1.9510388605988505]
We introduce Multi-Domain Diffusion (MDD) to solve the challenge of multi-domain translation. MDD reconstructs missing views for new data objects, and enables learning in semi-supervised contexts. We evaluate our approach through domain translation experiments on BL3NDT, a multi-domain synthetic dataset.
arXiv Detail & Related papers (2023-09-25T15:31:16Z) - UOD: Universal One-shot Detection of Anatomical Landmarks [16.360644135635333]
We develop a domain-adaptive one-shot landmark detection framework for handling multi-domain medical images, named Universal One-shot Detection (UOD)
UOD consists of two stages and two corresponding universal models which are designed as combinations of domain-specific modules and domain-shared modules.
We investigate both qualitatively and quantitatively the proposed UOD on three widely-used public X-ray datasets in different anatomical domains.
arXiv Detail & Related papers (2023-06-13T08:19:14Z) - Domain Translation via Latent Space Mapping [1.1470070927586016]
We introduce a new unified framework called Latent Space Mapping.
Unlike existing approaches, we propose to further regularize each latent space using available domains by learning each dependency between pairs of domains.
arXiv Detail & Related papers (2022-12-06T23:09:40Z) - Non-Parametric Unsupervised Domain Adaptation for Neural Machine Translation [61.27321597981737]
$k$NN-MT has shown the promising capability of directly incorporating the pre-trained neural machine translation (NMT) model with domain-specific token-level $k$-nearest-neighbor retrieval.
We propose a novel framework that directly uses in-domain monolingual sentences in the target language to construct an effective datastore for $k$-nearest-neighbor retrieval.
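The token-level retrieval underlying $k$NN-MT-style systems can be sketched briefly: a datastore maps decoder hidden states (keys) to the next target token (values); at decoding time, the current hidden state queries the store, and the neighbor distribution is interpolated with the NMT softmax. The keys, values, and dimensions below are random toy data, not from a real NMT model.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_entries, vocab = 8, 100, 5
keys = rng.standard_normal((n_entries, d))    # toy decoder hidden states
values = rng.integers(0, vocab, n_entries)    # toy next-token ids

def knn_distribution(query, k=4, temperature=1.0):
    """Softmax over negative distances to the k nearest datastore entries,
    accumulated into a distribution over the vocabulary."""
    dists = np.linalg.norm(keys - query, axis=1)
    idx = np.argsort(dists)[:k]
    logits = -dists[idx] / temperature
    w = np.exp(logits - logits.max())
    w /= w.sum()
    p = np.zeros(vocab)
    for weight, tok in zip(w, values[idx]):
        p[tok] += weight
    return p

def interpolate(p_nmt, p_knn, lam=0.5):
    """Mix the NMT softmax with the retrieval distribution."""
    return (1 - lam) * p_nmt + lam * p_knn

q = rng.standard_normal(d)
p_knn = knn_distribution(q)
p = interpolate(np.full(vocab, 1 / vocab), p_knn)
print(round(p.sum(), 6))  # -> 1.0
```

The cited paper's contribution is how the datastore is built (from in-domain target-language monolingual text rather than parallel data); the retrieval and interpolation step itself is as sketched.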
arXiv Detail & Related papers (2021-09-14T11:50:01Z) - Iterative Domain-Repaired Back-Translation [50.32925322697343]
In this paper, we focus on the domain-specific translation with low resources, where in-domain parallel corpora are scarce or nonexistent.
We propose a novel iterative domain-repaired back-translation framework, which introduces the Domain-Repair model to refine translations in synthetic bilingual data.
Experiments on adapting NMT models between specific domains and from the general domain to specific domains demonstrate the effectiveness of our proposed approach.
arXiv Detail & Related papers (2020-10-06T04:38:09Z) - Mutual Learning Network for Multi-Source Domain Adaptation [73.25974539191553]
We propose a novel multi-source domain adaptation method, Mutual Learning Network for Multiple Source Domain Adaptation (ML-MSDA)
Under the framework of mutual learning, the proposed method pairs the target domain with each single source domain to train a conditional adversarial domain adaptation network as a branch network.
The proposed method outperforms the comparison methods and achieves the state-of-the-art performance.
arXiv Detail & Related papers (2020-03-29T04:31:43Z) - Universal-RCNN: Universal Object Detector via Transferable Graph R-CNN [117.80737222754306]
We present a novel universal object detector called Universal-RCNN.
We first generate a global semantic pool by integrating the high-level semantic representations of all categories.
An Intra-Domain Reasoning Module learns and propagates the sparse graph representation within one dataset guided by a spatial-aware GCN.
arXiv Detail & Related papers (2020-02-18T07:57:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.