Comparative Evaluation of Neural Network Architectures for Generalizable Human Spatial Preference Prediction in Unseen Built Environments
- URL: http://arxiv.org/abs/2510.10954v1
- Date: Mon, 13 Oct 2025 03:04:48 GMT
- Title: Comparative Evaluation of Neural Network Architectures for Generalizable Human Spatial Preference Prediction in Unseen Built Environments
- Authors: Maral Doctorarastoo, Katherine A. Flanigan, Mario Bergés, Christopher McComb
- Abstract summary: The capacity to predict human spatial preferences within built environments is instrumental for developing Cyber-Physical-Social Infrastructure Systems. Deep learning models have shown promise in learning complex spatial and contextual dependencies. It remains unclear which neural network architectures are most effective at generalizing to unseen layouts.
- Score: 1.411614392022118
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The capacity to predict human spatial preferences within built environments is instrumental for developing Cyber-Physical-Social Infrastructure Systems (CPSIS). A significant challenge in this domain is the generalizability of preference models, particularly their efficacy in predicting preferences within environmental configurations not encountered during training. While deep learning models have shown promise in learning complex spatial and contextual dependencies, it remains unclear which neural network architectures are most effective at generalizing to unseen layouts. To address this, we conduct a comparative study of Graph Neural Networks, Convolutional Neural Networks, and standard feedforward Neural Networks using synthetic data generated from a simplified pocket park environment. This illustrative case study allows for controlled analysis of each model's ability to transfer learned preference patterns to unseen spatial scenarios. The models are evaluated on their capacity to predict preferences influenced by heterogeneous physical, environmental, and social features. A generalizability score is calculated using the area under the precision-recall curve (AUPRC) for the seen and unseen layouts. This score is appropriate for imbalanced data, providing insights into the suitability of each neural network architecture for preference-aware human behavior modeling in unseen built environments.
Related papers
- A Multi-Resolution Benchmark Framework for Spatial Reasoning Assessment in Neural Networks [40.69150418258213]
This paper presents preliminary results in the definition of a comprehensive benchmark framework designed to evaluate spatial reasoning capabilities in neural networks. The framework is currently being used to study the capabilities of nnU-Net.
arXiv Detail & Related papers (2025-08-18T09:04:13Z) - Rethinking Inductive Bias in Geographically Neural Network Weighted Regression [1.3597551064547502]
This work revisits the inductive biases in Geographically Neural Network Weighted Regression. We introduce local receptive fields, sequential context, and self-attention into spatial regression. We show that GNNWR outperforms classic methods in capturing nonlinear and complex spatial relationships.
arXiv Detail & Related papers (2025-07-14T06:13:18Z) - Generalized Factor Neural Network Model for High-dimensional Regression [50.554377879576066]
We tackle the challenges of modeling high-dimensional data sets with latent low-dimensional structures hidden within complex, non-linear, and noisy relationships. Our approach enables a seamless integration of concepts from non-parametric regression, factor models, and neural networks for high-dimensional regression.
arXiv Detail & Related papers (2025-02-16T23:13:55Z) - Set-based Neural Network Encoding Without Weight Tying [91.37161634310819]
We propose a neural network weight encoding method for network property prediction. Our approach is capable of encoding neural networks in a model zoo of mixed architecture. We introduce two new tasks for neural network property prediction: cross-dataset and cross-architecture.
arXiv Detail & Related papers (2023-05-26T04:34:28Z) - On the Generalization of PINNs outside the training domain and the Hyperparameters influencing it [1.3927943269211593]
PINNs are Neural Network architectures trained to emulate solutions of differential equations without the necessity of solution data.
We perform an empirical analysis of the behavior of PINN predictions outside their training domain.
We assess whether the algorithmic setup of PINNs can influence their potential for generalization and showcase the respective effect on the prediction.
arXiv Detail & Related papers (2023-02-15T09:51:56Z) - A Local Optima Network Analysis of the Feedforward Neural Architecture Space [0.0]
Local optima network (LON) analysis is a derivative of the fitness landscape of candidate solutions.
LONs may provide a viable paradigm by which to analyse and optimise neural architectures.
arXiv Detail & Related papers (2022-06-02T08:09:17Z) - Temporal Convolution Domain Adaptation Learning for Crops Growth Prediction [5.966652553573454]
We construct an innovative network architecture based on domain adaptation learning to predict crop growth curves with limited available crop data.
We are the first to use the temporal convolution filters as the backbone to construct a domain adaptation network architecture.
Results show that the proposed temporal convolution-based network architecture outperforms all benchmarks not only in accuracy but also in model size and convergence rate.
arXiv Detail & Related papers (2022-02-24T14:22:36Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z) - PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z) - Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective to represent a network into a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and owns adaptability to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z) - A Semi-Supervised Assessor of Neural Architectures [157.76189339451565]
We employ an auto-encoder to discover meaningful representations of neural architectures.
A graph convolutional neural network is introduced to predict the performance of architectures.
arXiv Detail & Related papers (2020-05-14T09:02:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.