The role of prior information and computational power in Machine
Learning
- URL: http://arxiv.org/abs/2211.01972v1
- Date: Mon, 31 Oct 2022 20:39:53 GMT
- Title: The role of prior information and computational power in Machine
Learning
- Authors: Diego Marcondes, Adilson Simonis and Junior Barrera
- Abstract summary: We discuss how prior information and computational power can be employed to solve a learning problem.
We argue that employing high computational power has the advantage of higher performance.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Science consists of conceiving hypotheses, confronting them with empirical
evidence, and keeping only the hypotheses which have not yet been falsified. Under
deductive reasoning, hypotheses are conceived in view of a theory and confronted
with empirical evidence in an attempt to falsify them; under inductive reasoning,
they are conceived based on observation, confronted with empirical evidence, and
a theory is established based on the hypotheses that were not falsified. When
hypothesis testing can be performed with quantitative data, the confrontation
can be achieved with Machine Learning methods, whose quality is highly
dependent on the hypotheses' complexity, hence on the proper insertion of prior
information into the set of hypotheses, seeking to decrease its complexity
without losing good hypotheses. However, Machine Learning tools have been
applied under the pragmatic view of instrumentalism, which is concerned only
with the performance of the methods and not with the understanding of their
behavior, leading to methods which are not fully understood. In this context,
we discuss how prior information and computational power can be employed to
solve a learning problem: while prior information and a careful design of
the hypothesis space have the advantage of interpretability of the results,
employing high computational power has the advantage of higher performance.
We discuss why learning methods which combine both should work better from an
understanding and performance perspective, arguing in favor of basic
theoretical research on Machine Learning, in particular about how properties of
classifiers may be identified in the parameters of modern learning models.
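The trade-off the abstract describes can be made concrete with a minimal sketch (not from the paper; all names and data here are illustrative assumptions): a hypothesis space restricted by prior information (here, an assumed linear relation) compared against a high-capacity interpolating model on the same small noisy sample.

```python
# Illustrative sketch: prior information (assumed linearity) restricts the
# hypothesis space, while an exact interpolant represents a high-capacity
# model fit without such a prior. All values below are made up for the demo.
import random

random.seed(0)

def true_f(x):
    return 2.0 * x + 1.0  # underlying linear law generating the data

# small noisy training sample
xs = [i / 5.0 for i in range(6)]
ys = [true_f(x) + random.gauss(0, 0.3) for x in xs]

# Hypothesis 1: least-squares line (prior: the relation is linear),
# fitted via the normal equations.
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

def linear_h(x):
    return slope * x + intercept

# Hypothesis 2: exact Lagrange interpolation (high capacity, no prior);
# it reproduces every noisy training point perfectly.
def interp_h(x):
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        w = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                w *= (x - xj) / (xi - xj)
        total += yi * w
    return total

# Held-out evaluation: the interpolant has zero training error but,
# having also fit the noise, it often generalizes worse than the
# constrained linear hypothesis.
test_xs = [0.05 + i / 5.0 for i in range(6)]

def mse(h):
    return sum((h(x) - true_f(x)) ** 2 for x in test_xs) / len(test_xs)

print("linear MSE:", mse(linear_h))
print("interp MSE:", mse(interp_h))
```

The constrained hypothesis is also interpretable (a slope and an intercept with a direct meaning), whereas the interpolant's coefficients say little about the underlying law, which mirrors the interpretability-versus-performance discussion above.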
Related papers
- A Theory of Machine Learning [0.0]
We show that this theory challenges common assumptions in the statistical and the computational learning theories.
We briefly discuss some case studies from natural language processing and macroeconomics from the perspective of the new theory.
arXiv Detail & Related papers (2024-07-07T23:57:10Z)
- Class-wise Activation Unravelling the Engima of Deep Double Descent [0.0]
Double descent presents a counter-intuitive aspect within the machine learning domain.
In this study, we revisited the phenomenon of double descent and discussed the conditions of its occurrence.
arXiv Detail & Related papers (2024-05-13T12:07:48Z)
- On Continuity of Robust and Accurate Classifiers [3.8673630752805437]
It has been shown that adversarial training can improve the robustness of the hypothesis.
It has been suggested that robustness and accuracy of a hypothesis are at odds with each other.
In this paper, we put forth the alternative proposal that it is the continuity of a hypothesis that is incompatible with its robustness and accuracy.
arXiv Detail & Related papers (2023-09-29T08:14:25Z)
- Large Language Models for Automated Open-domain Scientific Hypotheses Discovery [50.40483334131271]
This work proposes the first dataset for social science academic hypotheses discovery.
Unlike previous settings, the new dataset requires (1) using open-domain data (raw web corpus) as observations; and (2) proposing hypotheses even new to humanity.
A multi-module framework is developed for the task, including three different feedback mechanisms to boost performance.
arXiv Detail & Related papers (2023-09-06T05:19:41Z)
- A Double Machine Learning Approach to Combining Experimental and Observational Data [59.29868677652324]
We propose a double machine learning approach to combine experimental and observational studies.
Our framework tests for violations of external validity and ignorability under milder assumptions.
arXiv Detail & Related papers (2023-07-04T02:53:11Z)
- Advancing Counterfactual Inference through Nonlinear Quantile Regression [77.28323341329461]
We propose a framework for efficient and effective counterfactual inference implemented with neural networks.
The proposed approach enhances the capacity to generalize estimated counterfactual outcomes to unseen data.
Empirical results conducted on multiple datasets offer compelling support for our theoretical assertions.
arXiv Detail & Related papers (2023-06-09T08:30:51Z)
- On the Joint Interaction of Models, Data, and Features [82.60073661644435]
We introduce a new tool, the interaction tensor, for empirically analyzing the interaction between data and model through features.
Based on these observations, we propose a conceptual framework for feature learning.
Under this framework, the expected accuracy for a single hypothesis and agreement for a pair of hypotheses can both be derived in closed-form.
arXiv Detail & Related papers (2023-06-07T21:35:26Z)
- A Causal Research Pipeline and Tutorial for Psychologists and Social Scientists [7.106986689736828]
Causality is a fundamental part of the scientific endeavour to understand the world.
Unfortunately, causality is still taboo in much of psychology and social science.
Motivated by a growing number of recommendations for the importance of adopting causal approaches to research, we reformulate the typical approach to research in psychology to harmonize inevitably causal theories with the rest of the research pipeline.
arXiv Detail & Related papers (2022-06-10T15:11:57Z)
- Abduction and Argumentation for Explainable Machine Learning: A Position Survey [2.28438857884398]
This paper presents Abduction and Argumentation as two principled forms for reasoning.
It fleshes out the fundamental role that they can play within Machine Learning.
arXiv Detail & Related papers (2020-10-24T13:23:44Z)
- Double Robust Representation Learning for Counterfactual Prediction [68.78210173955001]
We propose a novel scalable method to learn double-robust representations for counterfactual predictions.
We make robust and efficient counterfactual predictions for both individual and average treatment effects.
The algorithm shows competitive performance with the state-of-the-art on real world and synthetic data.
arXiv Detail & Related papers (2020-10-15T16:39:26Z)
- Neuro-symbolic Architectures for Context Understanding [59.899606495602406]
We propose the use of hybrid AI methodology as a framework for combining the strengths of data-driven and knowledge-driven approaches.
Specifically, we inherit the concept of neuro-symbolism as a way of using knowledge-bases to guide the learning progress of deep neural networks.
arXiv Detail & Related papers (2020-03-09T15:04:07Z)