Reinforcement Learning for Data-Driven Workflows in Radio Interferometry. I. Principal Demonstration in Calibration
- URL: http://arxiv.org/abs/2410.17135v1
- Date: Tue, 22 Oct 2024 16:07:55 GMT
- Title: Reinforcement Learning for Data-Driven Workflows in Radio Interferometry. I. Principal Demonstration in Calibration
- Authors: Brian M. Kirk, Urvashi Rau, Ramyaa Ramyaa
- Abstract summary: Radio interferometry is an observational technique used to study astrophysical phenomena.
Data gathered by an interferometer requires substantial processing before astronomers can extract the scientific information from it.
This paper introduces a simplified description of the principles behind interferometry and the procedures required for data processing.
- Score: 1.167489362272148
- Abstract: Radio interferometry is an observational technique used to study astrophysical phenomena. Data gathered by an interferometer requires substantial processing before astronomers can extract the scientific information from it. Data processing consists of a sequence of calibration and analysis procedures where choices must be made about the sequence of procedures as well as the specific configuration of the procedure itself. These choices are typically based on a combination of measurable data characteristics, an understanding of the instrument itself, an appreciation of the trade-offs between compute cost and accuracy, and a learned understanding of what is considered "best practice". A metric of absolute correctness is not always available and validity is often subject to human judgment. The underlying principles and software configurations to discern a reasonable workflow for a given dataset are the subject of training workshops for students and scientists. Our goal is to use objective metrics that quantify best practice, and numerically map out the decision space with respect to our metrics. With these objective metrics we demonstrate an automated, data-driven, decision system that is capable of sequencing the optimal action(s) for processing interferometric data. This paper introduces a simplified description of the principles behind interferometry and the procedures required for data processing. We highlight the issues with current automation approaches and propose our ideas for solving these bottlenecks. A prototype is demonstrated and the results are discussed.
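The decision system described in the abstract maps naturally onto a reinforcement-learning loop: the state is a set of measurable data-quality characteristics, the actions are calibration/processing procedures, and the reward encodes the trade-off between accuracy and compute cost. The sketch below is a minimal, hypothetical illustration of that idea using tabular Q-learning; it is not the authors' implementation, and the action names, the single scalar "quality" metric, and the reward model are invented stand-ins.

```python
# Minimal sketch (assumptions, not the paper's system): a tabular Q-learning
# agent that sequences calibration actions based on a toy data-quality metric.
import random
from collections import defaultdict

# Hypothetical action set; a real pipeline's procedures and parameters differ.
ACTIONS = ["flag_rfi", "bandpass_cal", "gain_cal", "stop"]

def simulate_step(quality, action):
    """Toy stand-in for running one processing procedure on the data.
    Returns (new_quality, reward, done). In the paper's setting the reward
    would come from objective calibration/imaging metrics, not this model."""
    if action == "stop":
        # Terminal reward favours stopping only once quality is high.
        return quality, quality - 0.5, True
    gain = {"flag_rfi": 0.20, "bandpass_cal": 0.30, "gain_cal": 0.25}[action]
    new_quality = min(1.0, quality + gain * random.uniform(0.5, 1.0))
    return new_quality, -0.05, False  # small compute cost per procedure

def discretize(quality, bins=10):
    """Map the continuous quality metric to a discrete state index."""
    return min(int(quality * bins), bins - 1)

def train(episodes=2000, alpha=0.1, gamma=0.95, eps=0.1):
    """Tabular Q-learning over (quality state, calibration action) pairs."""
    q = defaultdict(float)
    for _ in range(episodes):
        quality, done = random.uniform(0.0, 0.4), False
        while not done:
            s = discretize(quality)
            a = (random.choice(ACTIONS) if random.random() < eps
                 else max(ACTIONS, key=lambda x: q[(s, x)]))
            quality, r, done = simulate_step(quality, a)
            s2 = discretize(quality)
            best_next = max(q[(s2, x)] for x in ACTIONS)
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
    return q

if __name__ == "__main__":
    q = train()
    # Print the learned greedy action for each discretized quality level,
    # i.e. a crude numeric map of the decision space under the toy reward.
    for s in range(10):
        print(f"quality bin {s}: {max(ACTIONS, key=lambda a: q[(s, a)])}")
```

Printing the greedy action per quality bin gives a rough numeric map of the decision space, analogous in spirit to the paper's goal of mapping decisions against objective metrics.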
Related papers
- Scalability of memorization-based machine unlearning [2.5782420501870296]
Machine unlearning (MUL) focuses on removing the influence of specific subsets of data from pretrained models.
Memorization-based unlearning methods have been developed, demonstrating exceptional performance with respect to unlearning quality.
We tackle these scalability challenges of state-of-the-art memorization-based MUL algorithms using a series of memorization-score proxies.
arXiv Detail & Related papers (2024-10-21T21:18:39Z) - Human-in-the-loop Reinforcement Learning for Data Quality Monitoring in Particle Physics Experiments [0.0]
We propose a proof-of-concept for applying human-in-the-loop Reinforcement Learning to automate the Data Quality Monitoring process.
We show that random, unbiased noise in human classification can be reduced, leading to an improved accuracy over the baseline.
arXiv Detail & Related papers (2024-05-24T12:52:46Z) - A Self-Commissioning Edge Computing Method for Data-Driven Anomaly
Detection in Power Electronic Systems [0.0]
Methods that work well in controlled lab environments to field applications presents significant challenges.
Online machine learning can be a powerful tool to overcome this problem, but it introduces additional challenges in ensuring the stability and predictability of the training processes.
This work presents an edge computing method that mitigates these shortcomings with minimal additional memory usage.
arXiv Detail & Related papers (2023-12-05T10:56:25Z) - Tailoring Machine Learning for Process Mining [5.237999056930947]
We argue that a deeper insight into the issues raised by training machine learning models with process data is crucial to ground a sound integration of process mining and machine learning.
Our analysis of such issues is aimed at laying the foundation for a methodology aimed at correctly aligning machine learning with process mining requirements.
arXiv Detail & Related papers (2023-06-17T12:59:51Z) - MARS: Meta-Learning as Score Matching in the Function Space [79.73213540203389]
We present a novel approach to extracting inductive biases from a set of related datasets.
We use functional Bayesian neural network inference, which views the prior as a process and performs inference in the function space.
Our approach can seamlessly acquire and represent complex prior knowledge by metalearning the score function of the data-generating process.
arXiv Detail & Related papers (2022-10-24T15:14:26Z) - Information-Theoretic Odometry Learning [83.36195426897768]
We propose a unified information theoretic framework for learning-motivated methods aimed at odometry estimation.
The proposed framework provides an elegant tool for performance evaluation and understanding in information-theoretic language.
arXiv Detail & Related papers (2022-03-11T02:37:35Z) - Human-in-the-Loop Disinformation Detection: Stance, Sentiment, or
Something Else? [93.91375268580806]
Both politics and pandemics have recently provided ample motivation for the development of machine learning-enabled disinformation (a.k.a. fake news) detection algorithms.
Existing literature has focused primarily on the fully-automated case, but the resulting techniques cannot reliably detect disinformation on the varied topics, sources, and time scales required for military applications.
By leveraging an already-available analyst as a human-in-the-loop, canonical machine learning techniques of sentiment analysis, aspect-based sentiment analysis, and stance detection become plausible methods to use for a partially-automated disinformation detection system.
arXiv Detail & Related papers (2021-11-09T13:30:34Z) - An Extensible Benchmark Suite for Learning to Simulate Physical Systems [60.249111272844374]
We introduce a set of benchmark problems to take a step towards unified benchmarks and evaluation protocols.
We propose four representative physical systems, as well as a collection of both widely used classical time-based and representative data-driven methods.
arXiv Detail & Related papers (2021-08-09T17:39:09Z) - Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor
Setups [68.8204255655161]
We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
arXiv Detail & Related papers (2021-01-12T12:02:26Z) - Causal Feature Selection for Algorithmic Fairness [61.767399505764736]
We consider fairness in the integration component of data management.
We propose an approach to identify a sub-collection of features that ensure the fairness of the dataset.
arXiv Detail & Related papers (2020-06-10T20:20:10Z) - Machine Learning to Tackle the Challenges of Transient and Soft Errors
in Complex Circuits [0.16311150636417257]
Machine learning models are used to predict accurate per-instance Functional De-Rating data for the full list of circuit instances.
The presented methodology is applied on a practical example and various machine learning models are evaluated and compared.
arXiv Detail & Related papers (2020-02-18T18:38:54Z)