Advancing Supervised Learning with the Wave Loss Function: A Robust and Smooth Approach
- URL: http://arxiv.org/abs/2404.18101v2
- Date: Sun, 13 Oct 2024 16:57:39 GMT
- Title: Advancing Supervised Learning with the Wave Loss Function: A Robust and Smooth Approach
- Authors: Mushir Akhtar, M. Tanveer, Mohd. Arshad
- Abstract summary: We present a novel contribution to the realm of supervised machine learning: an asymmetric loss function named wave loss.
We incorporate the proposed wave loss function into the least squares setting of support vector machines (SVM) and twin support vector machines (TSVM).
To empirically showcase the effectiveness of the proposed Wave-SVM and Wave-TSVM, we evaluate them on benchmark UCI and KEEL datasets.
- Abstract: The loss function plays a vital role in supervised learning frameworks. The selection of an appropriate loss function can have a substantial impact on the proficiency of the learned model. The training of supervised learning algorithms inherently adheres to predetermined loss functions during the optimization process. In this paper, we present a novel contribution to the realm of supervised machine learning: an asymmetric loss function named wave loss. It exhibits robustness against outliers, insensitivity to noise, boundedness, and a crucial smoothness property. Theoretically, we establish that the proposed wave loss function manifests the essential characteristic of being classification-calibrated. Leveraging this result, we incorporate the proposed wave loss function into the least squares setting of support vector machines (SVM) and twin support vector machines (TSVM), resulting in two robust and smooth models termed Wave-SVM and Wave-TSVM, respectively. To address the optimization problem inherent in Wave-SVM, we utilize the adaptive moment estimation (Adam) algorithm. It is noteworthy that this paper marks the first application of the Adam algorithm to solve an SVM model. Further, we devise an iterative algorithm to solve the optimization problems of Wave-TSVM. To empirically showcase the effectiveness of the proposed Wave-SVM and Wave-TSVM, we evaluate them on benchmark UCI and KEEL datasets (with and without feature noise) from diverse domains. Moreover, to exemplify the applicability of Wave-SVM in the biomedical domain, we evaluate it on the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset. The experimental outcomes unequivocally reveal the prowess of Wave-SVM and Wave-TSVM in achieving superior prediction accuracy against the baseline models.
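The abstract describes a bounded, smooth, asymmetric loss minimized with Adam in a least-squares SVM setting. The sketch below illustrates that combination with a *hypothetical* loss of the form (1/λ)(1 − 1/(1 + λu²e^{au})), chosen only because it matches the stated properties (smooth everywhere, bounded by 1/λ for large positive margin residuals, asymmetric, zero at u = 0); the exact wave-loss definition, its hyperparameters, and the Wave-SVM objective are in the paper, and all names here (`wave_loss`, `train_linear_wave_svm`, `C`, `lam`, `a`) are illustrative assumptions, not the authors' code.

```python
import numpy as np

def wave_loss(u, lam=1.0, a=1.0):
    # Assumed bounded, smooth, asymmetric loss in the spirit of the abstract;
    # u is the margin residual 1 - y * f(x). The exact form in the paper may differ.
    return (1.0 / lam) * (1.0 - 1.0 / (1.0 + lam * u**2 * np.exp(a * u)))

def wave_loss_grad(u, lam=1.0, a=1.0):
    # Derivative of the sketch loss w.r.t. u (chain rule on the form above).
    g = lam * u**2 * np.exp(a * u)
    return (u * (2.0 + a * u) * np.exp(a * u)) / (1.0 + g) ** 2

def train_linear_wave_svm(X, y, C=1.0, lr=0.01, epochs=300):
    # Adam on an assumed objective: 0.5*||w||^2 + C * sum_i loss(1 - y_i*(x_i@w + b)).
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    mw, vw, mb, vb = np.zeros(d), np.zeros(d), 0.0, 0.0
    b1, b2, eps = 0.9, 0.999, 1e-8
    for t in range(1, epochs + 1):
        u = 1.0 - y * (X @ w + b)
        dLdu = wave_loss_grad(u)
        gw = w + C * X.T @ (-y * dLdu)   # du/dw = -y * x
        gb = C * np.sum(-y * dLdu)       # du/db = -y
        # Adam moment updates with bias correction
        mw = b1 * mw + (1 - b1) * gw; vw = b2 * vw + (1 - b2) * gw**2
        mb = b1 * mb + (1 - b1) * gb; vb = b2 * vb + (1 - b2) * gb**2
        w -= lr * (mw / (1 - b1**t)) / (np.sqrt(vw / (1 - b2**t)) + eps)
        b -= lr * (mb / (1 - b1**t)) / (np.sqrt(vb / (1 - b2**t)) + eps)
    return w, b
```

Because the loss saturates at 1/λ, a far-away outlier contributes at most a constant penalty rather than an unbounded one, which is the mechanism behind the robustness claims in the abstract.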
Related papers
- A Stochastic Approach to Bi-Level Optimization for Hyperparameter Optimization and Meta Learning [74.80956524812714]
We tackle the general differentiable meta learning problem that is ubiquitous in modern deep learning.
These problems are often formalized as Bi-Level optimizations (BLO)
We introduce a novel perspective by turning a given BLO problem into a stochastic optimization, where the inner loss function becomes a smooth distribution, and the outer loss becomes an expected loss over the inner distribution.
arXiv Detail & Related papers (2024-10-14T12:10:06Z) - GL-TSVM: A robust and smooth twin support vector machine with guardian loss function [0.0]
We introduce the guardian loss (G-loss), a novel loss function distinguished by its asymmetric, bounded, and smooth characteristics.
To adhere to the structural risk minimization (SRM) principle, we incorporate a regularization term into the objective function of GL-TSVM.
The experimental analysis on UCI and KEEL datasets substantiates the effectiveness of the proposed GL-TSVM.
arXiv Detail & Related papers (2024-08-29T08:14:20Z) - Enhancing Multiview Synergy: Robust Learning by Exploiting the Wave Loss Function with Consensus and Complementarity Principles [0.0]
This paper introduces Wave-MvSVM, a novel multiview support vector machine framework leveraging the wave loss (W-loss) function.
Wave-MvSVM ensures a more comprehensive and resilient learning process by integrating both consensus and complementarity principles.
Extensive empirical evaluations across diverse datasets demonstrate the superior performance of Wave-MvSVM.
arXiv Detail & Related papers (2024-08-13T11:25:22Z) - Wave-RVFL: A Randomized Neural Network Based on Wave Loss Function [0.0]
We propose the Wave-RVFL, an RVFL model incorporating the wave loss function.
The Wave-RVFL exhibits robustness against noise and outliers by preventing over-penalization of deviations.
Empirical results affirm the superior performance and robustness of the Wave-RVFL compared to baseline models.
arXiv Detail & Related papers (2024-08-05T20:46:54Z) - Equation Discovery with Bayesian Spike-and-Slab Priors and Efficient Kernels [57.46832672991433]
We propose a novel equation discovery method based on Kernel learning and BAyesian Spike-and-Slab priors (KBASS)
We use kernel regression to estimate the target function, which is flexible, expressive, and more robust to data sparsity and noises.
We develop an expectation-propagation expectation-maximization algorithm for efficient posterior inference and function estimation.
arXiv Detail & Related papers (2023-10-09T03:55:09Z) - RoBoSS: A Robust, Bounded, Sparse, and Smooth Loss Function for Supervised Learning [0.0]
We propose a novel robust, bounded, sparse, and smooth (RoBoSS) loss function for supervised learning.
We introduce a new robust algorithm named $\mathcal{L}_{rbss}$-SVM to generalize well to unseen data.
We evaluate the proposed $\mathcal{L}_{rbss}$-SVM on $88$ real-world UCI and KEEL datasets from diverse domains.
arXiv Detail & Related papers (2023-09-05T13:59:50Z) - PINQI: An End-to-End Physics-Informed Approach to Learned Quantitative MRI Reconstruction [0.7199733380797579]
Quantitative Magnetic Resonance Imaging (qMRI) enables the reproducible measurement of biophysical parameters in tissue.
The challenge lies in solving a nonlinear, ill-posed inverse problem to obtain desired tissue parameter maps from acquired raw data.
We propose PINQI, a novel qMRI reconstruction method that integrates the knowledge about the signal, acquisition model, and learned regularization into a single end-to-end trainable neural network.
arXiv Detail & Related papers (2023-06-19T15:37:53Z) - Optimization of a Hydrodynamic Computational Reservoir through Evolution [58.720142291102135]
We interface with a model of a hydrodynamic system, under development by a startup, as a computational reservoir.
We optimized the readout times and how inputs are mapped to the wave amplitude or frequency using an evolutionary search algorithm.
Applying evolutionary methods to this reservoir system substantially improved separability on an XNOR task, in comparison to implementations with hand-selected parameters.
arXiv Detail & Related papers (2023-04-20T19:15:02Z) - MotionHint: Self-Supervised Monocular Visual Odometry with Motion Constraints [70.76761166614511]
We present a novel self-supervised algorithm named MotionHint for monocular visual odometry (VO).
Our MotionHint algorithm can be easily applied to existing open-sourced state-of-the-art SSM-VO systems.
arXiv Detail & Related papers (2021-09-14T15:35:08Z) - Fast Distributionally Robust Learning with Variance Reduced Min-Max Optimization [85.84019017587477]
Distributionally robust supervised learning is emerging as a key paradigm for building reliable machine learning systems for real-world applications.
Existing algorithms for solving Wasserstein DRSL involve solving complex subproblems or fail to make use of gradients.
We revisit Wasserstein DRSL through the lens of min-max optimization and derive scalable and efficiently implementable extra-gradient algorithms.
arXiv Detail & Related papers (2021-04-27T16:56:09Z) - Estimating Average Treatment Effects with Support Vector Machines [77.34726150561087]
Support vector machine (SVM) is one of the most popular classification algorithms in the machine learning literature.
We adapt SVM as a kernel-based weighting procedure that minimizes the maximum mean discrepancy between the treatment and control groups.
We characterize the bias of causal effect estimation arising from this trade-off, connecting the proposed SVM procedure to the existing kernel balancing methods.
arXiv Detail & Related papers (2021-02-23T20:22:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.