Solving nonograms using Neural Networks
- URL: http://arxiv.org/abs/2501.05882v1
- Date: Fri, 10 Jan 2025 11:34:22 GMT
- Title: Solving nonograms using Neural Networks
- Authors: José María Buades Rubio, Antoni Jaume-i-Capó, David López González, Gabriel Moyà Alcover
- Abstract summary: Nonograms are logic puzzles in which cells in a grid must be colored or left blank according to the numbers in its row and column headers.
In this study, we analyze different techniques to solve this type of logical problem: a heuristic algorithm, a genetic algorithm, and a heuristic algorithm combined with a neural network.
- Score: 0.7499722271664147
- Abstract: Nonograms are logic puzzles in which cells in a grid must be colored or left blank according to the numbers located in its row and column headers. In this study, we analyze different techniques for solving this type of logical problem: a heuristic algorithm, a genetic algorithm, and a heuristic algorithm combined with a neural network. Furthermore, we generate a public dataset to train the neural networks, and we publish this dataset together with the code of the algorithms. The combination of the heuristic algorithm with a neural network obtained the best results. Our review of the state of the art found no previous work that used neural networks to solve nonograms, nor any that combined a network with other algorithms to accelerate the resolution process.
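The paper's code is published separately; as a rough illustration of the heuristic family involved, the sketch below implements classic line-by-line deduction: enumerate every filling of a row consistent with its clue and with the already-decided cells, then fix the cells on which all fillings agree. All names are ours; this is a minimal sketch, not the authors' algorithm.

```python
FILLED, BLANK, UNKNOWN = 1, 0, -1

def line_candidates(clues, line):
    """Enumerate every filling of `line` that satisfies `clues` and
    agrees with the cells already decided in `line`."""
    n = len(line)

    def rec(i, acc):
        if i == len(clues):
            cand = acc + [BLANK] * (n - len(acc))
            if all(c == UNKNOWN or c == v for c, v in zip(line, cand)):
                yield cand
            return
        block = clues[i]
        # Latest start that leaves room for the remaining blocks and gaps.
        rest = sum(clues[i + 1:]) + (len(clues) - i - 1)
        for start in range(len(acc), n - rest - block + 1):
            nxt = acc + [BLANK] * (start - len(acc)) + [FILLED] * block
            if i + 1 < len(clues):
                nxt += [BLANK]  # mandatory gap between consecutive blocks
            yield from rec(i + 1, nxt)

    yield from rec(0, [])

def solve_line(clues, line):
    """Fix every cell that takes the same value in all consistent fillings."""
    cands = list(line_candidates(clues, line))
    if not cands:
        return line  # contradiction: a full solver would backtrack here
    deduced = list(line)
    for j in range(len(line)):
        values = {cand[j] for cand in cands}
        if len(values) == 1:
            deduced[j] = values.pop()
    return deduced

# Row of length 5 with clue [3]: every placement covers the middle cell.
print(solve_line([3], [UNKNOWN] * 5))  # -> [-1, -1, 1, -1, -1]
```

Iterating this deduction over all rows and columns until nothing changes is the standard baseline heuristic; the paper's contribution is combining such a solver with a neural network to accelerate resolution.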
Related papers
- LinSATNet: The Positive Linear Satisfiability Neural Networks [116.65291739666303]
This paper studies how to introduce the popular positive linear satisfiability constraints into neural networks.
We propose the first differentiable satisfiability layer based on an extension of the classic Sinkhorn algorithm for jointly encoding multiple sets of marginal distributions.
arXiv Detail & Related papers (2024-07-18T22:05:21Z)
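LinSATNet's layer is described as an extension of the classic Sinkhorn algorithm to multiple sets of marginals; the sketch below shows only plain single-marginal Sinkhorn scaling, as baseline intuition rather than the paper's layer. Function names are ours.

```python
import numpy as np

def sinkhorn(scores, row_marginals, col_marginals, n_iters=50, tau=0.1):
    """Alternately rescale rows and columns of exp(scores / tau) so the
    matrix approaches the prescribed row and column sums."""
    P = np.exp(scores / tau)
    for _ in range(n_iters):
        P *= (row_marginals / P.sum(axis=1))[:, None]  # match row sums
        P *= (col_marginals / P.sum(axis=0))[None, :]  # match column sums
    return P

# Doubly stochastic projection of a random score matrix.
rng = np.random.default_rng(0)
P = sinkhorn(rng.normal(size=(4, 4)), np.ones(4), np.ones(4))
print(P.sum(axis=0), P.sum(axis=1))  # both ~[1, 1, 1, 1]
```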
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- Discrete Neural Algorithmic Reasoning [18.497863598167257]
We propose to force neural reasoners to maintain the execution trajectory as a combination of finite predefined states.
Trained with supervision on the algorithm's state transitions, such models are able to align perfectly with the original algorithm.
arXiv Detail & Related papers (2024-02-18T16:03:04Z)
- The Deep Equilibrium Algorithmic Reasoner [20.375241527453447]
We show that graph neural networks (GNNs) can learn to execute classical algorithms.
We conjecture and empirically validate that one can train a network to solve algorithmic problems by directly finding the equilibrium.
arXiv Detail & Related papers (2024-02-09T14:46:50Z)
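The deep-equilibrium idea is to iterate a learned update until the hidden state stops changing and read the answer off the fixed point, rather than running a fixed number of layers. A minimal sketch, with a toy contractive map standing in for the trained GNN update:

```python
import numpy as np

def solve_to_equilibrium(step, x, h0, tol=1e-6, max_iters=100):
    """Iterate h <- step(x, h) until an (approximate) fixed point is
    reached; the fixed point serves as the algorithm's output state."""
    h = h0
    for _ in range(max_iters):
        h_next = step(x, h)
        if np.linalg.norm(h_next - h) < tol:
            return h_next
        h = h_next
    return h

# Toy usage: a contractive affine map, so the iteration provably
# converges; a trained GNN update would take its place.
step = lambda x, h: 0.5 * h + x
print(solve_to_equilibrium(step, x=np.array([1.0]), h0=np.zeros(1)))  # -> ~[2.0]
```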
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
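The substituted operation compares local feature distributions rather than summing messages. A rough sketch, assuming scalar node features and a histogram intersection kernel; all helper names are ours, not the paper's API:

```python
import numpy as np

def histogram_intersection(h1, h2):
    """Similarity of two normalized histograms: sum of bin-wise minima."""
    return np.minimum(h1, h2).sum()

def local_histogram(node, neighbors, features, bins=8, value_range=(0.0, 1.0)):
    """Histogram of a scalar feature over a node's closed neighborhood."""
    values = features[[node] + list(neighbors)]
    hist, _ = np.histogram(values, bins=bins, range=value_range)
    return hist / max(hist.sum(), 1)

# Nodes with similar neighborhood distributions score high, dissimilar ones low.
feats = np.array([0.1, 0.15, 0.12, 0.9, 0.85])
h_a = local_histogram(0, [1, 2], feats)
h_b = local_histogram(3, [4], feats)
print(histogram_intersection(h_a, h_a), histogram_intersection(h_a, h_b))  # ~1.0, 0.0
```

In the paper's setup, each node's new representation is built from such intersections against learned reference histograms instead of aggregated neighbor messages.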
- The Clock and the Pizza: Two Stories in Mechanistic Explanation of Neural Networks [59.26515696183751]
We show that algorithm discovery in neural networks is sometimes more complex than finding a single clean mechanism: even simple learning problems can admit a surprising diversity of solutions.
arXiv Detail & Related papers (2023-06-30T17:59:13Z)
- Towards Better Out-of-Distribution Generalization of Neural Algorithmic Reasoning Tasks [51.8723187709964]
We study the OOD generalization of neural algorithmic reasoning tasks.
The goal is to learn an algorithm from input-output pairs using deep neural networks.
arXiv Detail & Related papers (2022-11-01T18:33:20Z)
- Graph Neural Networks are Dynamic Programmers [0.0]
Graph neural networks (GNNs) are claimed to align with dynamic programming (DP).
Here we show, using methods from category theory and abstract algebra, that there exists an intricate connection between GNNs and DP.
arXiv Detail & Related papers (2022-03-29T13:27:28Z)
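The canonical instance of the GNN-DP correspondence is Bellman-Ford shortest paths: each relaxation round has the shape of a message-passing layer with min-aggregation. A small self-contained illustration (ours, not the paper's formalism):

```python
import math

def bellman_ford(n, edges, source):
    """One DP relaxation per round; structurally a message-passing step
    where each node aggregates min(dist[u] + w) over incoming edges."""
    dist = [math.inf] * n
    dist[source] = 0.0
    for _ in range(n - 1):                # rounds play the role of GNN layers
        for u, v, w in edges:             # messages flow along edges
            if dist[u] + w < dist[v]:     # min-aggregation, then update
                dist[v] = dist[u] + w
    return dist

print(bellman_ford(4, [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 5.0), (2, 3, 1.0)], 0))
# -> [0.0, 1.0, 3.0, 4.0]
```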
- A biologically plausible neural network for local supervision in cortical microcircuits [17.00937011213428]
We derive an algorithm for training a neural network that avoids explicit error computation and backpropagation.
Our algorithm maps onto a neural network that bears a remarkable resemblance to the connectivity structure and learning rules of the cortex.
arXiv Detail & Related papers (2020-11-30T17:35:22Z)
- Neural Thompson Sampling [94.82847209157494]
We propose a new algorithm, called Neural Thompson Sampling, which adapts deep neural networks for both exploration and exploitation.
At the core of our algorithm is a novel posterior distribution of the reward, whose mean is the neural network approximator and whose variance is built upon the neural tangent features of the corresponding network.
arXiv Detail & Related papers (2020-10-02T07:44:09Z)
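Thompson sampling draws one plausible reward estimate per action from the posterior and acts greedily on the draw; NeuralTS takes the network's prediction as the posterior mean and builds the variance from neural tangent features. The sketch below keeps the sampling loop but replaces the NTK-based variance with simple running estimates, so it is a generic stand-in rather than the paper's method:

```python
import numpy as np

def thompson_step(mu, sigma, rng):
    """Draw one reward per arm from N(mu, sigma^2) and act greedily."""
    return int(np.argmax(rng.normal(mu, sigma)))

rng = np.random.default_rng(0)
true_rewards = np.array([0.2, 0.5, 0.8])
counts = np.ones(3)
means = np.zeros(3)
for _ in range(500):
    arm = thompson_step(means, 1.0 / np.sqrt(counts), rng)  # variance shrinks with data
    reward = rng.normal(true_rewards[arm], 0.1)
    counts[arm] += 1
    means[arm] += (reward - means[arm]) / counts[arm]  # running-mean update
print(int(np.argmax(counts)))  # the best arm (index 2) is pulled most often
```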
- Neural Bipartite Matching [19.600193617583955]
This report describes how neural execution is applied to a complex algorithm.
This is achieved via neural execution based only on features generated from a single GNN.
The evaluation shows strongly generalising results, with the network achieving optimal matching almost 100% of the time.
arXiv Detail & Related papers (2020-05-22T17:50:38Z)
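For context, the classical target here is augmenting-path maximum matching; the sketch below is a standard implementation of Kuhn's algorithm, i.e., the kind of procedure the GNN is trained to imitate, not the paper's network:

```python
def max_bipartite_matching(adj, n_left, n_right):
    """Classical augmenting-path (Kuhn's) algorithm for maximum matching
    in a bipartite graph given as adjacency lists of left-side nodes."""
    match_right = [-1] * n_right  # which left node each right node is matched to

    def try_augment(u, seen):
        for v in adj[u]:
            if not seen[v]:
                seen[v] = True
                # v is free, or its current partner can be rematched elsewhere.
                if match_right[v] == -1 or try_augment(match_right[v], seen):
                    match_right[v] = u
                    return True
        return False

    return sum(try_augment(u, [False] * n_right) for u in range(n_left))

print(max_bipartite_matching({0: [0, 1], 1: [0], 2: [1]}, 3, 2))  # -> 2
```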
This list is automatically generated from the titles and abstracts of the papers on this site.