Generalized Householder transformations
- URL: http://arxiv.org/abs/2112.15206v2
- Date: Sat, 19 Mar 2022 17:12:06 GMT
- Title: Generalized Householder transformations
- Authors: Karl Svozil
- Abstract summary: We discuss the Householder transformation, allowing a rewrite of probabilities into expectations of dichotomic observables.
The dichotomy is modulated by allowing more than one negative eigenvalue or by abandoning binaries altogether.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The Householder transformation, allowing a rewrite of probabilities into
expectations of dichotomic observables, is generalized in terms of its spectral
decomposition. The dichotomy is modulated by allowing more than one negative
eigenvalue or by abandoning binaries altogether, yielding generalized
operator-valued arguments for contextuality. We also discuss a form of
contextuality by the variation of the functional relations of the operators, in
particular by additivity.
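The probability-to-expectation rewrite mentioned in the abstract can be illustrated numerically for the standard (non-generalized) case. The sketch below builds a Householder matrix H = I - 2|v><v| from a unit vector |v>, checks that it is a dichotomic observable (Hermitian, unitary, eigenvalues +1 and a single -1), and verifies that the projection probability |<v|psi>|^2 equals (1 - <psi|H|psi>)/2. This is an illustrative sketch of the textbook construction only, not the paper's generalized spectral decomposition.

```python
import numpy as np

# Householder transformation from a unit vector |v>:
#   H = I - 2 |v><v|
# H is Hermitian and unitary, with eigenvalue -1 on span{|v>} and
# +1 on its orthogonal complement: a dichotomic observable.
rng = np.random.default_rng(0)

n = 4
v = rng.normal(size=n) + 1j * rng.normal(size=n)
v /= np.linalg.norm(v)

H = np.eye(n) - 2.0 * np.outer(v, v.conj())

# Hermitian and an involution (H^2 = I):
assert np.allclose(H, H.conj().T)
assert np.allclose(H @ H, np.eye(n))

# Spectrum: one -1 eigenvalue, the rest +1 (the "dichotomy").
eigs = np.sort(np.linalg.eigvalsh(H))
assert np.allclose(eigs, [-1.0] + [1.0] * (n - 1))

# Rewrite of a probability as an expectation, for a random state |psi>:
#   |<v|psi>|^2 = (1 - <psi|H|psi>) / 2
psi = rng.normal(size=n) + 1j * rng.normal(size=n)
psi /= np.linalg.norm(psi)

prob = abs(np.vdot(v, psi)) ** 2
expectation = np.vdot(psi, H @ psi).real
assert np.isclose(prob, (1.0 - expectation) / 2.0)
```

The paper's generalization then modulates this dichotomy, e.g. by admitting more than one -1 eigenvalue in the spectral decomposition.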
Related papers
- Interpreting Affine Recurrence Learning in GPT-style Transformers [54.01174470722201]
In-context learning allows GPT-style transformers to generalize during inference without modifying their weights.
This paper focuses specifically on their ability to learn and predict affine recurrences as an ICL task.
We analyze the model's internal operations using both empirical and theoretical approaches.
arXiv Detail & Related papers (2024-10-22T21:30:01Z)
- Strengthening Structural Inductive Biases by Pre-training to Perform Syntactic Transformations [75.14793516745374]
We propose to strengthen the structural inductive bias of a Transformer by intermediate pre-training.
Our experiments confirm that this helps with few-shot learning of syntactic tasks such as chunking.
Our analysis shows that the intermediate pre-training leads to attention heads that keep track of which syntactic transformation needs to be applied to which token.
arXiv Detail & Related papers (2024-07-05T14:29:44Z)
- On reconstruction of states from evolution induced by quantum dynamical semigroups perturbed by covariant measures [50.24983453990065]
We show that states of quantum systems can be restored from evolution induced by quantum dynamical semigroups perturbed by covariant measures.
Our procedure describes reconstruction of quantum states transmitted via quantum channels and as a particular example can be applied to reconstruction of photonic states transmitted via optical fibers.
arXiv Detail & Related papers (2023-12-02T09:56:00Z)
- Source Condition Double Robust Inference on Functionals of Inverse Problems [71.42652863687117]
We consider estimation of parameters defined as linear functionals of solutions to linear inverse problems.
We provide the first source condition double robust inference method.
arXiv Detail & Related papers (2023-07-25T19:54:46Z)
- Remarks on the quasi-position representation in models of generalized uncertainty principle [0.0]
This note aims to elucidate certain aspects of the quasi-position representation frequently used in the investigation of one-dimensional models.
We focus on two key points: (i) Contrary to recent claims, the quasi-position operator can possess physical significance even though it is non-Hermitian, and (ii) in the quasi-position representation, operators associated with the position behave as a derivative operator on the quasi-position coordinate.
arXiv Detail & Related papers (2023-06-20T11:46:56Z)
- Marginalized Operators for Off-policy Reinforcement Learning [53.37381513736073]
Marginalized operators strictly generalize generic multi-step operators, such as Retrace, as special cases.
We show that the estimates for marginalized operators can be computed in a scalable way, which also generalizes prior results on marginalized importance sampling as special cases.
arXiv Detail & Related papers (2022-03-30T09:59:59Z)
- Learning Algebraic Recombination for Compositional Generalization [71.78771157219428]
We propose LeAR, an end-to-end neural model to learn algebraic recombination for compositional generalization.
The key insight is to model the semantic parsing task as a homomorphism between a latent syntactic algebra and a semantic algebra.
Experiments on two realistic and comprehensive compositional generalization benchmarks demonstrate the effectiveness of our model.
arXiv Detail & Related papers (2021-07-14T07:23:46Z)
- Negative Translations of Orthomodular Lattices and Their Logic [0.0]
We introduce residuated ortholattices as a generalization of -- and environment for the investigation of -- orthomodular lattices.
We show that residuated ortholattices are the equivalent algebraic semantics of an algebraizable propositional logic.
arXiv Detail & Related papers (2021-06-07T14:35:27Z)
- Provably Strict Generalisation Benefit for Equivariant Models [1.332560004325655]
It is widely believed that engineering a model to be invariant/equivariant improves generalisation.
This paper provides the first provably non-zero improvement in generalisation for invariant/equivariant models.
arXiv Detail & Related papers (2021-02-20T12:47:32Z)
- Entropic Bounds as Uncertainty Measure of Unitary Operators [0.0]
We show how distinguishable operators are compatible, while maximal incompatibility of unitary operators can be connected to mutually unbiased bases for certain subspaces of operators.
arXiv Detail & Related papers (2020-11-24T01:38:44Z)
- Extensions of Hardy-type true-implies-false gadgets to classically obtain indistinguishability [0.0]
Hardy-type arguments can be uniformly presented and extended as collections of intertwined contexts and their observables.
They serve as graph-theoretic "gadgets" that enforce correlations on the respective preselected and postselected observable terminal points.
arXiv Detail & Related papers (2020-06-22T10:43:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.