Deep learning four decades of human migration
- URL: http://arxiv.org/abs/2506.22821v2
- Date: Thu, 03 Jul 2025 10:46:56 GMT
- Title: Deep learning four decades of human migration
- Authors: Thomas Gaskin, Guy J. Abel
- Abstract summary: We present a novel and detailed dataset on origin-destination annual migration flows and stocks between 230 countries and regions. Our flow estimates are further disaggregated by country of birth, providing a comprehensive picture of migration over the last 35 years.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a novel and detailed dataset on origin-destination annual migration flows and stocks between 230 countries and regions, spanning the period from 1990 to the present. Our flow estimates are further disaggregated by country of birth, providing a comprehensive picture of migration over the last 35 years. The estimates are obtained by training a deep recurrent neural network to learn flow patterns from 18 covariates for all countries, including geographic, economic, cultural, societal, and political information. The recurrent architecture of the neural network means that the entire past can influence current migration patterns, allowing us to learn long-range temporal correlations. By training an ensemble of neural networks and additionally pushing uncertainty on the covariates through the trained network, we obtain confidence bounds for all our estimates, allowing researchers to pinpoint the geographic regions most in need of additional data collection. We validate our approach on various test sets of unseen data, demonstrating that it significantly outperforms traditional methods estimating five-year flows while delivering a significant increase in temporal resolution. The model is fully open source: all training data, neural network weights, and training code are made public alongside the migration estimates, providing a valuable resource for future studies of human migration.
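The abstract's uncertainty procedure, training an ensemble of networks and additionally pushing covariate uncertainty through each trained member, can be sketched in a few lines. The "models" below are random linear maps standing in for the paper's trained recurrent networks, and all sizes (10 ensemble members, 200 Monte-Carlo draws, a flat 0.1 covariate standard deviation) are illustrative assumptions, not the authors' settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for a trained ensemble: each "model" maps an 18-dimensional
# covariate vector to a scalar flow estimate. In the paper these are deep
# recurrent networks; here they are random linear maps for illustration only.
n_models, n_cov = 10, 18
ensemble = [rng.normal(size=n_cov) for _ in range(n_models)]

def predict(weights, x):
    return float(weights @ x)

# A covariate vector with uncertainty (mean and standard deviation).
x_mean = rng.normal(size=n_cov)
x_std = 0.1 * np.ones(n_cov)

# Push covariate uncertainty through every ensemble member by Monte-Carlo
# sampling, then pool all draws to obtain confidence bounds on the estimate.
draws = []
for w in ensemble:
    for _ in range(200):
        x = rng.normal(x_mean, x_std)
        draws.append(predict(w, x))
draws = np.asarray(draws)

lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"estimate: {draws.mean():.3f}, 95% interval: [{lo:.3f}, {hi:.3f}]")
```

Wide pooled intervals for a given corridor would flag it as a region where additional data collection is most valuable, which is the use the abstract describes.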
Related papers
- Diffusion-Based Neural Network Weights Generation [80.89706112736353]
D2NWG is a diffusion-based neural network weights generation technique that efficiently produces high-performing weights for transfer learning.
Our method extends generative hyper-representation learning to recast the latent diffusion paradigm for neural network weights generation.
Our approach is scalable to large architectures such as large language models (LLMs), overcoming the limitations of current parameter generation techniques.
arXiv Detail & Related papers (2024-02-28T08:34:23Z) - Mesh-Wise Prediction of Demographic Composition from Satellite Images Using Multi-Head Convolutional Neural Network [0.0]
This paper proposes a multi-head Convolutional Neural Network model with transfer learning from pre-trained ResNet50 for estimating mesh-wise demographics of Japan.
Satellite images from Landsat-8/OLI and Suomi NPP/VIIRS-DNS serve as inputs, with census demographics as labels.
The trained model was evaluated on a testing dataset, achieving a test score of at least 0.8914 in $R^2$ for all demographic composition groups, and the estimated demographic composition was generated and visualised for 2022 as a non-census year.
arXiv Detail & Related papers (2023-08-25T15:41:05Z) - FairMobi-Net: A Fairness-aware Deep Learning Model for Urban Mobility Flow Generation [2.30238915794052]
We present a novel, fairness-aware deep learning model, FairMobi-Net, for inter-region human flow prediction.
We validate the model using comprehensive human mobility datasets from four U.S. cities, predicting human flow at the census-tract level.
The model maintains a high degree of accuracy consistently across diverse regions, addressing the previous fairness concern.
arXiv Detail & Related papers (2023-07-20T19:56:30Z) - Spatial Implicit Neural Representations for Global-Scale Species Mapping [72.92028508757281]
Given a set of locations where a species has been observed, the goal is to build a model to predict whether the species is present or absent at any location.
Traditional methods struggle to take advantage of emerging large-scale crowdsourced datasets.
We use Spatial Implicit Neural Representations (SINRs) to estimate the geographical ranges of 47k species simultaneously.
arXiv Detail & Related papers (2023-06-05T03:36:01Z) - Predicting COVID-19 pandemic by spatio-temporal graph neural networks: A New Zealand's study [16.3773496061049]
We propose a novel deep learning architecture named Attention-based Multiresolution Graph Neural Networks (ATMGNN)
Our method can capture the multiscale structures of the spatial graph via a learning-to-cluster algorithm in a data-driven manner.
In future work, we plan to extend our approach to real-time prediction and global scale.
arXiv Detail & Related papers (2023-05-12T19:00:17Z) - A Double Machine Learning Trend Model for Citizen Science Data [0.0]
We describe a novel modeling approach designed to estimate species population trends while controlling for the interannual confounding common in citizen science data.
The approach is based on Double Machine Learning, a statistical framework that uses machine learning methods to estimate population change and the propensity scores used to adjust for confounding discovered in the data.
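The Double Machine Learning idea summarized above, using flexible models to partial out confounding from both the outcome and the quantity of interest and then regressing residual on residual, can be sketched on simulated data. The polynomial nuisance model, the simulated confounder, and all parameters below are illustrative assumptions, not the authors' setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: outcome Y depends on treatment T (true effect 2.0)
# and on a confounder X that also drives T.
n = 2000
X = rng.normal(size=n)
T = 0.8 * X + rng.normal(size=n)              # T is confounded by X
Y = 2.0 * T + 1.5 * np.sin(X) + rng.normal(size=n)

def fit_predict(x_train, y_train, x_test, degree=3):
    """Stand-in 'ML' nuisance model: polynomial least squares."""
    coefs = np.polyfit(x_train, y_train, degree)
    return np.polyval(coefs, x_test)

# Cross-fitting: fit nuisance models on one fold, residualize the other,
# then estimate the effect by residual-on-residual regression.
half = n // 2
theta_parts = []
for tr, te in [(slice(0, half), slice(half, n)), (slice(half, n), slice(0, half))]:
    y_res = Y[te] - fit_predict(X[tr], Y[tr], X[te])   # outcome residual
    t_res = T[te] - fit_predict(X[tr], T[tr], X[te])   # treatment residual
    theta_parts.append((t_res @ y_res) / (t_res @ t_res))

theta_hat = float(np.mean(theta_parts))
print(f"estimated effect: {theta_hat:.2f} (true 2.0)")
```

In the paper's setting the "treatment" residual plays the role of the propensity adjustment, and the nuisance models would be the machine-learned components fitted to the citizen science covariates.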
arXiv Detail & Related papers (2022-10-27T15:08:05Z) - The BUTTER Zone: An Empirical Study of Training Dynamics in Fully Connected Neural Networks [0.562479170374811]
We present an empirical dataset surveying the deep learning phenomenon on fully-connected feed-forward perceptron neural networks.
The dataset records the per-epoch training and generalization performance of 483 thousand distinct hyperparameter choices.
Repeating each experiment an average of 24 times resulted in 11 million total training runs and 40 billion epochs recorded.
arXiv Detail & Related papers (2022-07-25T21:45:32Z) - Strict baselines for Covid-19 forecasting and ML perspective for USA and Russia [105.54048699217668]
The Covid-19 pandemic has allowed researchers to gather datasets accumulated over two years and to use them in predictive analysis.
We present the results of a consistent comparative study of different types of methods for predicting the dynamics of the spread of Covid-19 based on regional data for two countries: the United States and Russia.
arXiv Detail & Related papers (2022-07-15T18:21:36Z) - On Generalizing Beyond Domains in Cross-Domain Continual Learning [91.56748415975683]
Deep neural networks often suffer from catastrophic forgetting of previously learned knowledge after learning a new task.
Our proposed approach learns new tasks under domain shift with accuracy boosts up to 10% on challenging datasets such as DomainNet and OfficeHome.
arXiv Detail & Related papers (2022-03-08T09:57:48Z) - Invariance Learning in Deep Neural Networks with Differentiable Laplace Approximations [76.82124752950148]
We develop a convenient gradient-based method for selecting the data augmentation.
We use a differentiable Kronecker-factored Laplace approximation to the marginal likelihood as our objective.
arXiv Detail & Related papers (2022-02-22T02:51:11Z) - Dynamic Neural Diversification: Path to Computationally Sustainable Neural Networks [68.8204255655161]
Small neural networks with a constrained number of trainable parameters can be suitable resource-efficient candidates for many simple tasks.
We explore the diversity of the neurons within the hidden layer during the learning process.
We analyze how the diversity of the neurons affects predictions of the model.
arXiv Detail & Related papers (2021-09-20T15:12:16Z) - Neural Networks and Value at Risk [59.85784504799224]
We perform Monte-Carlo simulations of asset returns for Value at Risk threshold estimation.
Using equity markets and long term bonds as test assets, we investigate neural networks.
We find that our networks, when fed substantially less data, perform significantly worse.
arXiv Detail & Related papers (2020-05-04T17:41:59Z)
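The Monte-Carlo step named in that last summary, simulating asset returns and reading a Value at Risk threshold off the simulated distribution, can be sketched minimally. The normal return assumption and the parameters below are illustrative simplifications, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Monte-Carlo simulation of daily asset returns. A normal distribution is
# assumed here purely for illustration; richer return models plug in the
# same way by replacing this sampling step.
mu, sigma, n_sims = 0.0005, 0.01, 100_000
returns = rng.normal(mu, sigma, n_sims)

# 99% one-day Value at Risk: the loss threshold that simulated returns
# fall below only 1% of the time (reported as a positive loss).
var_99 = -np.percentile(returns, 1)
print(f"99% one-day VaR: {var_99:.4f}")
```

A network-based approach would replace the fixed (mu, sigma) sampler with a learned model of the return distribution, then estimate the threshold from the same kind of simulated sample.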
This list is automatically generated from the titles and abstracts of the papers on this site. This site does not guarantee the quality of this information and is not responsible for any consequences of its use.