End-to-End Deep Learning for Predicting Metric Space-Valued Outputs
- URL: http://arxiv.org/abs/2509.23544v1
- Date: Sun, 28 Sep 2025 00:46:12 GMT
- Title: End-to-End Deep Learning for Predicting Metric Space-Valued Outputs
- Authors: Yidong Zhou, Su I Iao, Hans-Georg Müller
- Abstract summary: We introduce E2M, a deep learning framework for predicting metric space-valued outputs. E2M performs prediction via a weighted Fréchet mean over training outputs, where the weights are learned by a neural network conditioned on the input. We show that E2M consistently achieves state-of-the-art performance, with its advantages becoming more pronounced at larger sample sizes.
- Score: 4.855663359344747
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many modern applications involve predicting structured, non-Euclidean outputs such as probability distributions, networks, and symmetric positive-definite matrices. These outputs are naturally modeled as elements of general metric spaces, where classical regression techniques that rely on vector space structure no longer apply. We introduce E2M (End-to-End Metric regression), a deep learning framework for predicting metric space-valued outputs. E2M performs prediction via a weighted Fréchet mean over training outputs, where the weights are learned by a neural network conditioned on the input. This construction provides a principled mechanism for geometry-aware prediction that avoids surrogate embeddings and restrictive parametric assumptions, while fully preserving the intrinsic geometry of the output space. We establish theoretical guarantees, including a universal approximation theorem that characterizes the expressive capacity of the model and a convergence analysis of the entropy-regularized training objective. Through extensive simulations involving probability distributions, networks, and symmetric positive-definite matrices, we show that E2M consistently achieves state-of-the-art performance, with its advantages becoming more pronounced at larger sample sizes. Applications to human mortality distributions and New York City taxi networks further demonstrate the flexibility and practical utility of the framework.
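The weighted-Fréchet-mean construction described in the abstract is concrete enough to sketch. The minimal Python illustration below picks one output space where the minimizer has a closed form: 1-D distributions under the Wasserstein-2 metric, where the weighted Fréchet mean is simply the weighted average of quantile functions. The softmax scores stand in for the output of E2M's weight network; the actual architecture and entropy-regularized training objective are not reproduced here.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax; stands in for the last layer of the weight network."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def weighted_frechet_mean_w2(quantile_fns, weights):
    """Weighted Frechet mean of 1-D distributions under the Wasserstein-2 metric.

    On the real line the W2 Frechet mean has a closed form: its quantile
    function is the weighted average of the training quantile functions,
    so the argmin over the metric space reduces to a single np.average call.
    """
    return np.average(quantile_fns, axis=0, weights=weights)

# Toy training outputs: quantile functions of Uniform(0, 1) and Uniform(2, 3)
# evaluated on a shared probability grid.
grid = np.linspace(0.0, 1.0, 101)
train_quantiles = np.stack([0.0 + grid, 2.0 + grid])

# Hypothetical network scores for a new input x; equal scores give equal weights.
weights = softmax(np.array([0.0, 0.0]))
prediction = weighted_frechet_mean_w2(train_quantiles, weights)
# With equal weights the predicted distribution is Uniform(1, 2).
```

In general metric spaces the Fréchet mean has no closed form and must be found by minimizing the weighted sum of squared distances numerically; the closed-form W2 case above is chosen purely so the mechanics are visible.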
Related papers
- Equivariant Evidential Deep Learning for Interatomic Potentials [55.6997213490859]
Uncertainty quantification (UQ) is critical for assessing the reliability of machine learning interatomic potentials (MLIPs) in molecular dynamics simulations. Existing UQ approaches for MLIPs are often limited by high computational cost or suboptimal performance. We propose Equivariant Evidential Deep Learning for Interatomic Potentials (e2IP), a backbone-agnostic framework that models atomic forces and their uncertainty jointly.
arXiv Detail & Related papers (2026-02-11T02:00:25Z) - SIGMA: Scalable Spectral Insights for LLM Collapse [51.863164847253366]
We introduce SIGMA (Spectral Inequalities for Gram Matrix Analysis), a unified framework for model collapse. By deriving deterministic bounds on the Gram matrix's spectrum, SIGMA provides a mathematically grounded metric to track the contraction of the representation space. We demonstrate that SIGMA effectively captures the transition towards collapsed states, offering theoretical insights into the mechanics of collapse.
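The summary is light on specifics, but the general idea of tracking representation-space contraction through the Gram matrix spectrum can be sketched with a standard proxy. The entropy-based effective rank below is an illustrative choice of spectral metric, not SIGMA's actual bounds:

```python
import numpy as np

def effective_rank(reps):
    """Entropy-based effective rank of the Gram matrix of representations.

    reps: (n, d) array of model representations. Values near n indicate a
    well-spread spectrum; values near 1 indicate collapse of the
    representation space onto (almost) a single direction.
    """
    gram = reps @ reps.T
    eig = np.clip(np.linalg.eigvalsh(gram), 0.0, None)
    p = eig / eig.sum()          # normalize the spectrum to a probability vector
    p = p[p > 0]
    return float(np.exp(-(p * np.log(p)).sum()))

healthy = effective_rank(np.eye(4))          # orthogonal reps: full spread, 4.0
collapsed = effective_rank(np.ones((4, 3)))  # identical reps: rank one, 1.0
```

Tracking such a quantity over generations of model-on-model training is the kind of diagnostic a spectral collapse analysis would monitor.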
arXiv Detail & Related papers (2026-01-06T19:47:11Z) - DFNN: A Deep Fréchet Neural Network Framework for Learning Metric-Space-Valued Responses [0.25778694761493826]
Deep Fréchet neural networks (DFNNs) form an end-to-end deep learning framework for predicting non-Euclidean responses from Euclidean predictors. We establish a universal approximation theorem for DFNNs, advancing the state of the art in neural network approximation theory. Empirical studies on synthetic distributional and network-valued responses, as well as a real-world application to predicting occupational employment compositions, demonstrate that DFNNs consistently outperform existing methods.
arXiv Detail & Related papers (2025-10-20T00:57:30Z) - Neural Optimal Transport Meets Multivariate Conformal Prediction [58.43397908730771]
We propose a framework for conditional vector quantile regression (CVQR). CVQR combines neural optimal transport with vector quantile regression and applies it to multivariate conformal prediction.
arXiv Detail & Related papers (2025-09-29T19:50:19Z) - Equivariant Representation Learning for Symmetry-Aware Inference with Guarantees [20.285132886770146]
We introduce an equivariant representation learning framework that simultaneously addresses regression, conditional probability estimation, and uncertainty quantification. Grounded in operator and group representation theory, our framework approximates the spectral decomposition of the conditional expectation operator. Empirical evaluations on synthetic datasets and real-world robotics applications confirm the potential of our approach.
arXiv Detail & Related papers (2025-05-26T10:47:23Z) - Wrapped Gaussian on the manifold of Symmetric Positive Definite Matrices [4.678796432640703]
Circular and non-flat data distributions are prevalent across diverse domains of data science. A principled approach to accounting for the underlying geometry of such data is pivotal. This work lays the groundwork for extending classical machine learning and statistical methods to more complex and structured data.
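A wrapped Gaussian on a manifold is obtained by pushing a tangent-space Gaussian through the exponential map at a base point. The sketch below uses the Log-Euclidean exponential map on SPD matrices because it is simple and keeps every sample SPD by construction; the paper's construction may use a different Riemannian metric:

```python
import numpy as np

def sym_expm(S):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    lam, Q = np.linalg.eigh(S)
    return (Q * np.exp(lam)) @ Q.T

def sym_logm(P):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    lam, Q = np.linalg.eigh(P)
    return (Q * np.log(lam)) @ Q.T

def sample_wrapped_gaussian_spd(base, sigma, rng):
    """One draw from a wrapped Gaussian on SPD matrices.

    Sample a symmetric tangent vector with N(0, sigma^2) entries, then push
    it through the Log-Euclidean exponential map at `base`. The result is
    symmetric positive-definite regardless of the tangent draw.
    """
    d = base.shape[0]
    noise = rng.normal(scale=sigma, size=(d, d))
    tangent = (noise + noise.T) / 2.0
    return sym_expm(sym_logm(base) + tangent)

rng = np.random.default_rng(0)
sample = sample_wrapped_gaussian_spd(np.eye(3), sigma=0.1, rng=rng)
```

Because the exponential map is applied to a symmetric matrix, the sample's eigenvalues are exponentials of real numbers and hence strictly positive, which is the whole point of wrapping rather than perturbing the matrix entries directly.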
arXiv Detail & Related papers (2025-02-03T16:46:46Z) - Deep Fréchet Regression [4.915744683251151]
We propose a flexible regression model capable of handling high-dimensional predictors without imposing parametric assumptions.<n>The proposed model outperforms existing methods for non-Euclidean responses.
arXiv Detail & Related papers (2024-07-31T07:54:14Z) - Adaptive Log-Euclidean Metrics for SPD Matrix Learning [73.12655932115881]
We propose Adaptive Log-Euclidean Metrics (ALEMs), which extend the widely used Log-Euclidean Metric (LEM).
The experimental and theoretical results demonstrate the merit of the proposed metrics in improving the performance of SPD neural networks.
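For reference, the fixed Log-Euclidean Metric that ALEMs generalize has a simple closed form, d(A, B) = ‖log(A) − log(B)‖_F, computable with plain eigendecompositions (ALEMs themselves add learnable parameters, which are not sketched here):

```python
import numpy as np

def spd_logm(P):
    """Matrix logarithm of an SPD matrix via its eigendecomposition."""
    lam, Q = np.linalg.eigh(P)
    return (Q * np.log(lam)) @ Q.T

def log_euclidean_distance(A, B):
    """Log-Euclidean distance d(A, B) = ||log(A) - log(B)||_F between SPD matrices."""
    return float(np.linalg.norm(spd_logm(A) - spd_logm(B)))  # Frobenius norm

# log(I) = 0 and log(e * I) = I, so this distance is ||I||_F = sqrt(2).
d = log_euclidean_distance(np.eye(2), np.e * np.eye(2))
```

The appeal of the LEM is that it turns the curved SPD manifold into a flat vector space under the matrix logarithm, so Euclidean tooling applies after one transform.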
arXiv Detail & Related papers (2023-03-26T18:31:52Z) - Machine Learning and Polymer Self-Consistent Field Theory in Two Spatial Dimensions [0.491574468325115]
A computational framework that leverages data from self-consistent field theory simulations with deep learning is presented.
A generative adversarial network (GAN) is introduced to efficiently and accurately predict saddle-point local average monomer density fields.
This GAN approach yields important savings of both memory and computational cost.
arXiv Detail & Related papers (2022-12-16T04:30:16Z) - Random Forest Weighted Local Fréchet Regression with Random Objects [18.128663071848923]
We propose a novel random forest weighted local Fréchet regression paradigm. Our first method uses these weights as the local average to solve the conditional Fréchet mean. Our second method performs local linear Fréchet regression; both significantly improve on existing Fréchet regression methods.
arXiv Detail & Related papers (2022-02-10T09:10:59Z) - A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived, unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z) - Machine learning for causal inference: on the use of cross-fit estimators [77.34726150561087]
Doubly-robust cross-fit estimators have been proposed to yield better statistical properties.
We conducted a simulation study to assess the performance of several estimators for the average causal effect (ACE).
When used with machine learning, the doubly-robust cross-fit estimators substantially outperformed all of the other estimators in terms of bias, variance, and confidence interval coverage.
arXiv Detail & Related papers (2020-04-21T23:09:55Z)
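The doubly-robust cross-fit recipe evaluated in that paper can be sketched in a few lines: split the data into folds, fit the outcome and propensity nuisance models out-of-fold, and average the augmented inverse-probability-weighted (AIPW) scores in-fold. The learner interfaces and toy learners below are illustrative placeholders, not the machine learning estimators from the study:

```python
import numpy as np

def aipw_scores(y, a, mu1, mu0, e):
    """Doubly-robust (AIPW) influence scores for the average causal effect."""
    return mu1 - mu0 + a * (y - mu1) / e - (1 - a) * (y - mu0) / (1 - e)

def cross_fit_ace(y, a, x, fit_outcome, fit_propensity, k=2, seed=0):
    """K-fold cross-fit AIPW estimate of the ACE.

    fit_outcome(x, a, y) -> callable (x_new, a_value) -> predicted E[Y | x, a]
    fit_propensity(x, a) -> callable x_new -> predicted P(A = 1 | x)
    Nuisance models are fit out-of-fold and evaluated in-fold.
    """
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    scores = np.empty(len(y))
    for j, test in enumerate(folds):
        train = np.concatenate([f for i, f in enumerate(folds) if i != j])
        mu = fit_outcome(x[train], a[train], y[train])
        e_hat = fit_propensity(x[train], a[train])
        scores[test] = aipw_scores(
            y[test], a[test], mu(x[test], 1), mu(x[test], 0), e_hat(x[test])
        )
    return float(scores.mean())

# Placeholder learners: group means for the outcome, a constant propensity.
def fit_outcome(x, a, y):
    m1, m0 = y[a == 1].mean(), y[a == 0].mean()
    return lambda x_new, a_value: np.full(len(x_new), m1 if a_value == 1 else m0)

def fit_propensity(x, a):
    p = a.mean()
    return lambda x_new: np.full(len(x_new), p)

# Toy randomized data where the true ACE is exactly 2.
a = np.tile([0.0, 1.0], 20)
y = 2.0 * a
x = np.linspace(-1.0, 1.0, 40).reshape(-1, 1)
estimate = cross_fit_ace(y, a, x, fit_outcome, fit_propensity)
```

Swapping in flexible learners (random forests, gradient boosting) for `fit_outcome` and `fit_propensity` gives the machine-learning variant whose bias, variance, and coverage advantages the simulation study reports.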
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.