Disturbances in Influence of a Shepherding Agent is More Impactful than
Sensorial Noise During Swarm Guidance
- URL: http://arxiv.org/abs/2008.12708v2
- Date: Sat, 3 Oct 2020 05:33:03 GMT
- Title: Disturbances in Influence of a Shepherding Agent is More Impactful than
Sensorial Noise During Swarm Guidance
- Authors: Hung The Nguyen, Matthew Garratt, Lam Thu Bui, and Hussein Abbass
- Abstract summary: The impact of noise on shepherding is not a well-studied problem.
First, we evaluate noise in the sensorial information received by the shepherd about the location of the sheep.
Second, we evaluate noise in the sheepdog's ability to influence sheep due to disturbance forces occurring during actuation.
- Score: 0.2624902795082451
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The guidance of a large swarm is a challenging control problem. Shepherding
offers one approach to guide a large swarm using a few shepherding agents
(sheepdogs). While noise is an inherent characteristic in many real-world
problems, the impact of noise on shepherding is not a well-studied problem. We
study two forms of noise. First, we evaluate noise in the sensorial information
received by the shepherd about the location of sheep. Second, we evaluate noise
in the ability of the sheepdog to influence sheep due to disturbance forces
occurring during actuation. We investigate the performance of Strömbom's
approach under these actuation and perception noises. To ensure that the
parameterisation of the algorithm
creates a stable performance, we need to run a large number of simulations,
while increasing the number of random episodes until stability is achieved. We
then systematically study the impact of sensorial and actuation noise on
performance. Strömbom's approach is found to be more sensitive to actuation
noise than perception noise. This implies that it is more important for the
shepherding agent to influence the sheep more accurately by reducing actuation
noise than attempting to reduce noise in its sensors. Moreover, different
levels of noise required different parameterisation for the shepherding agent,
where the threshold needed by an agent to decide whether or not to collect
astray sheep is different for different noise levels.
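The two noise sources described above can be sketched in a minimal Strömbom-style decision loop: perception noise perturbs the sensed sheep positions, actuation noise perturbs the sheepdog's motion, and a collect threshold decides whether to fetch the furthest stray. The parameter names, the drive-point offset, and the threshold logic below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

def perceived_positions(sheep_xy, sigma_p):
    """Perception noise: Gaussian error on each sensed sheep location."""
    return sheep_xy + rng.normal(0.0, sigma_p, size=sheep_xy.shape)

def actuate(dog_xy, target_xy, speed, sigma_a):
    """Actuation noise: a disturbance force perturbs the dog's movement."""
    direction = target_xy - dog_xy
    direction = direction / (np.linalg.norm(direction) + 1e-9)
    disturbance = rng.normal(0.0, sigma_a, size=2)
    return dog_xy + speed * direction + disturbance

def shepherd_step(dog_xy, sheep_xy, goal_xy, r_collect, speed, sigma_p, sigma_a):
    """One decision step in a simplified Strömbom-style loop."""
    sensed = perceived_positions(sheep_xy, sigma_p)
    gcm = sensed.mean(axis=0)                      # perceived flock centre
    dists = np.linalg.norm(sensed - gcm, axis=1)
    astray = dists.max() > r_collect               # threshold decision
    if astray:
        target = sensed[dists.argmax()]            # collect the furthest sheep
    else:
        target = gcm + (gcm - goal_xy) * 0.3       # drive from behind the flock
    return actuate(dog_xy, target, speed, sigma_a), astray
```

With both sigmas at zero the loop is deterministic; raising `sigma_a` directly degrades where the dog actually ends up, which is one way to see why actuation noise can matter more than an equally sized perception error averaged over many sensed sheep.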
Related papers
- Understanding the Effect of Noise in LLM Training Data with Algorithmic
Chains of Thought [0.0]
We study how noise in chains of thought impacts task performance in a highly controlled setting.
We define two types of noise: static noise, a local form of noise applied after the CoT trace is computed, and dynamic noise, a global form of noise that propagates errors through the trace as it is computed.
We find fine-tuned models are extremely robust to high levels of static noise but struggle significantly more with lower levels of dynamic noise.
arXiv Detail & Related papers (2024-02-06T13:59:56Z)
- Label Noise: Correcting the Forward-Correction [0.0]
Training neural network classifiers on datasets with label noise risks overfitting them to the noisy labels.
We propose an approach to tackle this overfitting by imposing a lower bound on the training loss.
arXiv Detail & Related papers (2023-07-24T19:41:19Z)
- Parameter estimation from an Ornstein-Uhlenbeck process with measurement noise [0.0]
We present an algorithm that can effectively separate out thermal noise, with performance comparable to Hamiltonian Monte Carlo.
We show that, with additional knowledge of the ratio between thermal and multiplicative noise, we can accurately distinguish between the two types of noise.
arXiv Detail & Related papers (2023-05-22T21:28:57Z)
- Positive-incentive Noise [91.3755431537592]
Noise is conventionally viewed as a severe problem in diverse fields, e.g., engineering, learning systems.
This paper aims to investigate whether the conventional proposition always holds.
π-noise offers new explanations for some models and provides a new principle for some fields, such as multi-task learning and adversarial training.
arXiv Detail & Related papers (2022-12-19T15:33:34Z)
- Inference and Denoise: Causal Inference-based Neural Speech Enhancement [83.4641575757706]
This study addresses the speech enhancement (SE) task within the causal inference paradigm by modeling the noise presence as an intervention.
The proposed causal inference-based speech enhancement (CISE) separates clean and noisy frames in an intervened noisy speech using a noise detector and assigns both sets of frames to two mask-based enhancement modules (EMs) to perform noise-conditional SE.
arXiv Detail & Related papers (2022-11-02T15:03:50Z)
- Action Noise in Off-Policy Deep Reinforcement Learning: Impact on
Exploration and Performance [5.573543601558405]
We analyze how the learned policy is impacted by the noise type, noise scale, and impact scaling factor reduction schedule.
We consider the two most prominent types of action noise, Gaussian noise and Ornstein-Uhlenbeck noise, and perform a vast experimental campaign.
We conclude that the best noise type and scale are environment dependent, and based on our observations derive rules for guiding the choice of the action noise.
arXiv Detail & Related papers (2022-06-08T10:06:24Z)
- Robust Meta-learning with Sampling Noise and Label Noise via
Eigen-Reptile [78.1212767880785]
The meta-learner is prone to overfitting since only a few samples are available.
When handling the data with noisy labels, the meta-learner could be extremely sensitive to label noise.
We present Eigen-Reptile (ER), which updates the meta-parameters with the main direction of historical task-specific parameters.
arXiv Detail & Related papers (2022-06-04T08:48:02Z)
- Learning to Generate Realistic Noisy Images via Pixel-level Noise-aware
Adversarial Training [50.018580462619425]
We propose a novel framework, namely Pixel-level Noise-aware Generative Adversarial Network (PNGAN).
PNGAN employs a pre-trained real denoiser to map the fake and real noisy images into a nearly noise-free solution space.
For better noise fitting, we present an efficient architecture, Simple Multi-scale Network (SMNet), as the generator.
arXiv Detail & Related papers (2022-04-06T14:09:02Z)
- Removing Noise from Extracellular Neural Recordings Using Fully
Convolutional Denoising Autoencoders [62.997667081978825]
We propose a Fully Convolutional Denoising Autoencoder, which learns to produce a clean neuronal activity signal from a noisy multichannel input.
The experimental results on simulated data show that our proposed method can significantly improve the quality of noise-corrupted neural signals.
arXiv Detail & Related papers (2021-09-18T14:51:24Z)
- Towards Noise-resistant Object Detection with Noisy Annotations [119.63458519946691]
Training deep object detectors requires a significant number of human-annotated images with accurate object labels and bounding box coordinates.
Noisy annotations are much more easily accessible, but they could be detrimental for learning.
We address the challenging problem of training object detectors with noisy annotations, where the noise contains a mixture of label noise and bounding box noise.
arXiv Detail & Related papers (2020-03-03T01:32:16Z)
- NoiseBreaker: Gradual Image Denoising Guided by Noise Analysis [5.645552640953684]
This paper proposes a gradual denoising strategy that iteratively detects the dominating noise in an image, and removes it using a tailored denoiser.
The method provides insight into the nature of the encountered noise, and it makes it possible to extend an existing denoiser to newly encountered noise types.
arXiv Detail & Related papers (2020-02-18T11:09:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.