Efficient Neural Neighborhood Search for Pickup and Delivery Problems
- URL: http://arxiv.org/abs/2204.11399v1
- Date: Mon, 25 Apr 2022 02:27:59 GMT
- Title: Efficient Neural Neighborhood Search for Pickup and Delivery Problems
- Authors: Yining Ma, Jingwen Li, Zhiguang Cao, Wen Song, Hongliang Guo, Yuejiao
Gong, Yeow Meng Chee
- Abstract summary: We present an efficient Neural Neighborhood Search (N2S) approach for pickup and delivery problems (PDPs).
Specifically, we design a powerful Synthesis Attention that allows the vanilla self-attention to synthesize various types of features regarding a route solution.
We also exploit two customized decoders that automatically learn to perform removal and reinsertion of a pickup-delivery node pair to tackle the precedence constraint.
- Score: 33.26295166939733
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present an efficient Neural Neighborhood Search (N2S) approach for pickup
and delivery problems (PDPs). Specifically, we design a powerful Synthesis
Attention that allows the vanilla self-attention to synthesize various types of
features regarding a route solution. We also exploit two customized decoders
that automatically learn to perform removal and reinsertion of a
pickup-delivery node pair to tackle the precedence constraint. Additionally, a
diversity enhancement scheme is leveraged to further ameliorate the
performance. Our N2S is generic, and extensive experiments on two canonical PDP
variants show that it can produce state-of-the-art results among existing
neural methods. Moreover, it even outstrips the well-known LKH3 solver on the
more constrained PDP variant. Our implementation for N2S is available online.
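The core neighborhood move in N2S is to remove one pickup-delivery node pair from the current route and reinsert it at new positions while keeping the pickup ahead of its delivery. The Python sketch below illustrates only that move under simple assumptions (a single route with the depot at index 0, positions drawn at random); in N2S itself the removal and reinsertion decisions are produced by the two learned decoders.

```python
import random

def remove_and_reinsert(route, pickup, delivery, rng=random):
    """One neighborhood move: take a pickup-delivery pair out of the route
    and put it back at new positions, keeping the pickup before its delivery.

    `route` is a list of node ids with the depot at index 0. Positions are
    chosen at random here purely for illustration; N2S learns them.
    """
    partial = [n for n in route if n not in (pickup, delivery)]
    i = rng.randint(1, len(partial))   # pickup slot (never before the depot)
    j = rng.randint(i, len(partial))   # delivery slot, at or after the pickup's
    # Building the list this way enforces the precedence constraint by construction.
    return partial[:i] + [pickup] + partial[i:j] + [delivery] + partial[j:]

# e.g. remove_and_reinsert([0, 1, 3, 2, 4], 3, 4) moves request (3, 4) elsewhere
```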
Related papers
- Joint Transmit and Pinching Beamforming for PASS: Optimization-Based or Learning-Based? [89.05848771674773]
A novel pinching antenna system (PASS)-enabled downlink multi-user multiple-input single-output (MISO) framework is proposed.
It consists of multiple waveguides equipped with numerous low-cost pinching antennas (PAs).
The positions of the PAs can be reconfigured to adjust both the large-scale path loss and the spatial placement of the antennas.
arXiv Detail & Related papers (2025-02-12T18:54:10Z) - An Efficient Diffusion-based Non-Autoregressive Solver for Traveling Salesman Problem [21.948190231334088]
We propose DEITSP, a diffusion model with efficient iterations tailored for Traveling Salesman Problems.
We introduce a one-step diffusion model that integrates the controlled discrete noise addition process with self-consistency enhancement.
We also design a dual-modality graph transformer to bolster the extraction and fusion of features from node and edge modalities.
arXiv Detail & Related papers (2025-01-23T15:47:04Z) - PDHG-Unrolled Learning-to-Optimize Method for Large-Scale Linear Programming [36.13745722329505]
We propose an FOM-unrolled neural network (NN) called PDHG-Net to solve large-scale linear programming problems.
Experiments show that our approach can accelerate solving, achieving up to a 3$\times$ speedup compared to FOMs for large-scale LP problems.
arXiv Detail & Related papers (2024-06-04T02:39:42Z) - Learning to Search Feasible and Infeasible Regions of Routing Problems
with Flexible Neural k-Opt [30.510841880901655]
We present Neural k-Opt (NeuOpt), a novel learning-to-search (L2S) solver for routing problems.
It learns to perform flexible k-opt exchanges based on a tailored action factorization method and a customized recurrent dual-stream decoder (a plain 2-opt exchange, the simplest such move, is sketched after this list).
arXiv Detail & Related papers (2023-10-27T16:51:41Z) - Reinforcement Learning for Node Selection in Branch-and-Bound [52.2648997215667]
Current state-of-the-art selectors utilize either hand-crafted ensembles that automatically switch between naive sub-node selectors, or learned node selectors that rely on individual node data.
We propose a novel simulation technique that uses reinforcement learning (RL) while considering the entire tree state, rather than just isolated nodes.
arXiv Detail & Related papers (2023-09-29T19:55:56Z) - Versatile Neural Processes for Learning Implicit Neural Representations [57.090658265140384]
We propose Versatile Neural Processes (VNP), which greatly increases the capability of approximating functions.
Specifically, we introduce a bottleneck encoder that produces fewer but more informative context tokens, relieving the high computational cost.
We demonstrate the effectiveness of the proposed VNP on a variety of tasks involving 1D, 2D and 3D signals.
arXiv Detail & Related papers (2023-01-21T04:08:46Z) - MultiAuto-DeepONet: A Multi-resolution Autoencoder DeepONet for
Nonlinear Dimension Reduction, Uncertainty Quantification and Operator
Learning of Forward and Inverse Stochastic Problems [12.826754199680474]
A new data-driven method for operator learning of stochastic differential equations (SDEs) is proposed in this paper.
The central goal is to solve forward and inverse problems more effectively using limited data.
arXiv Detail & Related papers (2022-04-07T03:53:49Z) - Learning Physics-Informed Neural Networks without Stacked
Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution with a Gaussian-smoothed model and show that, via Stein's Identity, its second-order derivatives can be efficiently calculated without back-propagation (a Monte-Carlo sketch of this estimator appears after this list).
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
arXiv Detail & Related papers (2022-02-18T18:07:54Z) - Enhanced Exploration in Neural Feature Selection for Deep Click-Through
Rate Prediction Models via Ensemble of Gating Layers [7.381829794276824]
The goal of neural feature selection (NFS) is to choose a relatively small subset of features with the best explanatory power.
The gating approach inserts a set of differentiable binary gates to drop less informative features (a minimal single-gate sketch appears after this list).
To improve the exploration capacity of such gradient-based solutions, we propose a simple but effective ensemble learning approach.
arXiv Detail & Related papers (2021-12-07T04:37:05Z) - dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual Neural Networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
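For the Flexible Neural k-Opt (NeuOpt) entry above, the sketch below shows the underlying move in its simplest form: a 2-opt exchange that removes two edges of a tour and reconnects it by reversing the segment in between. NeuOpt's contribution is learning which (possibly higher-order) exchanges to apply; the function here is only the hand-coded move itself.

```python
def two_opt_move(tour, i, j):
    """Apply a 2-opt exchange, the simplest k-opt move: drop edges
    (tour[i], tour[i+1]) and (tour[j], tour[j+1]) and reconnect the tour
    by reversing the segment between them.

    `tour` is a list of node ids; requires 0 <= i < j < len(tour) - 1.
    """
    return tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]

# e.g. two_opt_move([0, 1, 2, 3, 4, 5], 1, 4) -> [0, 1, 4, 3, 2, 5]
```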
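For the entry on learning Physics-Informed Neural Networks without stacked back-propagation, the following is a minimal Monte-Carlo illustration of the Stein's-Identity idea: once a function is smoothed with Gaussian noise, its first- and second-order derivatives become expectations of plain function values, so no back-propagation through the function is needed. This is a generic sketch, not the paper's implementation, which applies the estimator to the network parameterizing the PDE solution and adds variance-reduction techniques not shown here.

```python
import numpy as np

def stein_derivatives(f, x, sigma=0.1, n_samples=20_000, seed=0):
    """Estimate the gradient and Hessian of the Gaussian-smoothed function
    u(x) = E[f(x + delta)], delta ~ N(0, sigma^2 I), via Stein's Identity:
        grad u(x) = E[f(x + delta) * delta] / sigma^2
        hess u(x) = E[f(x + delta) * (delta delta^T - sigma^2 I)] / sigma^4
    Only forward evaluations of f are required.
    """
    rng = np.random.default_rng(seed)
    d = x.shape[0]
    delta = rng.normal(0.0, sigma, size=(n_samples, d))    # noise samples
    fx = np.array([f(x + dlt) for dlt in delta])           # f at perturbed points
    grad = (fx[:, None] * delta).mean(axis=0) / sigma**2
    outer = np.einsum('ni,nj->nij', delta, delta) - sigma**2 * np.eye(d)
    hess = (fx[:, None, None] * outer).mean(axis=0) / sigma**4
    return grad, hess

# e.g. for f(z) = np.sin(z).sum(), the diagonal of hess approximates -sin(x)
```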
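For the neural feature selection entry, the sketch below shows a single differentiable gating layer of the kind the summary describes: each feature field is scaled by a learned gate in (0, 1), so fields whose gates are driven toward zero are effectively dropped. The paper's contribution, an ensemble of such gating layers to improve exploration, is not reproduced here; the class and parameter names are illustrative.

```python
import torch
import torch.nn as nn

class FeatureGate(nn.Module):
    """One differentiable gating layer over feature-field embeddings."""

    def __init__(self, num_fields: int):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(num_fields))  # one gate per field

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_fields, embed_dim); gate each field independently
        gates = torch.sigmoid(self.logits)                   # values in (0, 1)
        return x * gates.view(1, -1, 1)

# Fields whose gates converge near 0 contribute almost nothing downstream and
# can be pruned after training, which yields the feature-selection effect.
```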