FOON Creation and Traversal for Recipe Generation
- URL: http://arxiv.org/abs/2210.07335v1
- Date: Thu, 13 Oct 2022 20:11:22 GMT
- Title: FOON Creation and Traversal for Recipe Generation
- Authors: Raj Patel
- Abstract summary: FOON stands for functional object-oriented network.
The network first needs to be created by having a human create action nodes as well as input and output nodes in a .txt file.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Task completion by robots is still far from being completely
dependable and usable. One way a robot may decipher information given to it and
accomplish tasks is by utilizing FOON, which stands for functional
object-oriented network. The network first needs to be created by having a
human create action nodes, as well as input and output nodes, in a .txt file.
After the network is sizeable, it can be traversed in a variety of ways, such
as choosing steps via iterative deepening search, taking the first valid option
found. Another mechanism is heuristics, such as choosing steps based on the
highest success rate or the fewest input ingredients. Via any of these methods,
a program can traverse the network given an output product and derive the
series of steps that need to be taken to produce it.
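The creation-and-traversal workflow described in the abstract can be sketched in Python. The file encoding (`I`/`M`/`O` lines, `//` unit separators), the heuristic names, and the `FunctionalUnit` structure below are simplified assumptions for illustration, not the paper's exact .txt format:

```python
from collections import namedtuple

# One functional unit: an action (motion) with its input and output object
# nodes. A simplified, hypothetical structure; the paper's .txt encoding
# may differ in detail.
FunctionalUnit = namedtuple("FunctionalUnit",
                            ["inputs", "motion", "outputs", "success_rate"])

def parse_foon(text):
    """Parse functional units from a toy encoding: 'I <object>' input lines,
    'M <motion> [success_rate]' motion lines, 'O <object>' output lines,
    each unit terminated by '//'."""
    units, inputs, outputs, motion, rate = [], [], [], None, 1.0
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("I "):
            inputs.append(line[2:])
        elif line.startswith("M "):
            parts = line[2:].split()
            motion = parts[0]
            rate = float(parts[1]) if len(parts) > 1 else 1.0
        elif line.startswith("O "):
            outputs.append(line[2:])
        elif line == "//":  # end of one functional unit
            units.append(FunctionalUnit(tuple(inputs), motion,
                                        tuple(outputs), rate))
            inputs, outputs, motion, rate = [], [], None, 1.0
    return units

def traverse(units, goal, pantry, heuristic="fewest_inputs"):
    """Work backward from the goal product: among the functional units that
    produce a needed object, pick the one preferred by the heuristic, then
    recurse on that unit's inputs until everything is in the pantry."""
    if heuristic == "fewest_inputs":
        key = lambda u: len(u.inputs)
    else:  # "success_rate": prefer the unit with the highest success rate
        key = lambda u: -u.success_rate
    plan, needed, expanded = [], [goal], set()
    while needed:
        item = needed.pop()
        if item in pantry or item in expanded:
            continue
        expanded.add(item)  # guards against revisiting nodes in a cycle
        candidates = [u for u in units if item in u.outputs]
        if not candidates:
            raise ValueError(f"no functional unit produces {item!r}")
        best = min(candidates, key=key)
        plan.append(best)
        needed.extend(best.inputs)
    return list(reversed(plan))  # steps in execution order

toy = """\
I tomato
I knife
M slice 0.9
O sliced_tomato
//
I sliced_tomato
I bread
M assemble 0.8
O sandwich
//
"""
units = parse_foon(toy)
plan = traverse(units, "sandwich", pantry={"tomato", "knife", "bread"})
print([u.motion for u in plan])  # → ['slice', 'assemble']
```

The abstract's "first seen valid option" variant would correspond to taking `candidates[0]` instead of applying a heuristic key, with iterative deepening bounding how many functional units deep the backward search may go.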
Related papers
- Using evolutionary computation to optimize task performance of unclocked, recurrent Boolean circuits in FPGAs [0.0]
We show an alternative learning approach for unclocked, recurrent networks in FPGAs.
We use evolutionary computation to evolve the functions of network nodes.
We obtain an accuracy improvement of 30% on an image classification task.
arXiv Detail & Related papers (2024-03-19T19:11:00Z)
- OFA$^2$: A Multi-Objective Perspective for the Once-for-All Neural Architecture Search [79.36688444492405]
Once-for-All (OFA) is a Neural Architecture Search (NAS) framework designed to address the problem of searching efficient architectures for devices with different resource constraints.
We aim to give one step further in the search for efficiency by explicitly conceiving the search stage as a multi-objective optimization problem.
arXiv Detail & Related papers (2023-03-23T21:30:29Z)
- Improved Tree Search for Automatic Program Synthesis [91.3755431537592]
A key element is being able to perform an efficient search in the space of valid programs.
Here, we suggest a variant of MCTS that leads to state of the art results on two vastly different DSLs.
arXiv Detail & Related papers (2023-03-13T15:09:52Z)
- Knowledge Retrieval [0.0]
This paper mainly focuses on the Functional Object-Oriented Network, a structured knowledge representation built from input, output, and motion nodes.
Different algorithms to traverse the tree in order to get the best output are also discussed in this paper.
arXiv Detail & Related papers (2022-10-22T20:41:09Z)
- Community detection using low-dimensional network embedding algorithms [1.052782170493037]
We rigorously understand the performance of two major algorithms, DeepWalk and node2vec, in recovering communities for canonical network models.
We prove that, given some fixed co-occurrence window, node2vec using random walks with a low non-backtracking probability can succeed for much sparser networks.
arXiv Detail & Related papers (2021-11-04T14:57:43Z)
- Optimized Quantum Networks [68.8204255655161]
Quantum networks offer the possibility to generate different kinds of entanglement prior to network requests.
We utilize this to design entanglement-based quantum networks tailored to their desired functionality.
arXiv Detail & Related papers (2021-07-21T18:00:07Z)
- Artificial Neural Networks generated by Low Discrepancy Sequences [59.51653996175648]
We generate artificial neural networks as random walks on a dense network graph.
Such networks can be trained sparse from scratch, avoiding the expensive procedure of training a dense network and compressing it afterwards.
We demonstrate that the artificial neural networks generated by low discrepancy sequences can achieve an accuracy within reach of their dense counterparts at a much lower computational complexity.
arXiv Detail & Related papers (2021-03-05T08:45:43Z)
- Decoupled and Memory-Reinforced Networks: Towards Effective Feature Learning for One-Step Person Search [65.51181219410763]
One-step methods have been developed to handle pedestrian detection and identification sub-tasks using a single network.
There are two major challenges in the current one-step approaches.
We propose a decoupled and memory-reinforced network (DMRNet) to overcome these problems.
arXiv Detail & Related papers (2021-02-22T06:19:45Z)
- Learning to Branch for Multi-Task Learning [12.49373126819798]
We present an automated multi-task learning algorithm that learns where to share or branch within a network.
We propose a novel tree-structured design space that casts a tree branching operation as a gumbel-softmax sampling procedure.
arXiv Detail & Related papers (2020-06-02T19:23:21Z)
- A "Network Pruning Network" Approach to Deep Model Compression [62.68120664998911]
We present a filter pruning approach for deep model compression using a multitask network.
Our approach is based on learning a pruner network to prune a pre-trained target network.
The compressed model produced by our approach is generic and does not need any special hardware/software support.
arXiv Detail & Related papers (2020-01-15T20:38:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.