Algebraic Approach to Directed Rough Sets
- URL: http://arxiv.org/abs/2004.12171v1
- Date: Sat, 25 Apr 2020 15:39:14 GMT
- Title: Algebraic Approach to Directed Rough Sets
- Authors: Mani A and Sandor Radeleczki
- Abstract summary: In the relational approach to general rough sets, ideas of directed relations are supplemented with additional conditions.
The relations are also specialized to representations of general parthood that are upper-directed, reflexive, and antisymmetric.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the relational approach to general rough sets, ideas of directed
relations are supplemented with additional conditions for multiple algebraic
approaches in this research paper. The relations are also specialized to
representations of general parthood that are upper-directed, reflexive, and
antisymmetric, yielding a better-behaved groupoidal semantics over the set of
roughly equivalent objects, due to the first author. She also invents a
distinct algebraic semantics over the set of approximations and a new
knowledge interpretation. Because only minimal conditions are imposed on the
relations, neighborhood granulations are used in the construction of all
approximations (granular and pointwise). Necessary and sufficient conditions
for the lattice of local upper approximations to be completely distributive
are proved by the second author. These results are related to formal concept
analysis. Applications to student-centered learning and decision-making are
also outlined.
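The pointwise, neighborhood-based approximations mentioned in the abstract can be sketched for a finite universe. This is only an illustrative toy, not the paper's granular construction: the successor-neighborhood convention, the example relation `R`, and all function names here are assumptions for exposition.

```python
def neighborhood(R, x, universe):
    """Successor neighborhood of x: all y with (x, y) in R."""
    return {y for y in universe if (x, y) in R}

def lower_approx(R, A, universe):
    # x belongs to the lower approximation when its whole neighborhood lies in A
    return {x for x in universe if neighborhood(R, x, universe) <= A}

def upper_approx(R, A, universe):
    # x belongs to the upper approximation when its neighborhood meets A
    return {x for x in universe if neighborhood(R, x, universe) & A}

def is_up_directed(R, universe):
    """Every pair of elements has a common R-successor (an upper bound)."""
    return all(
        any((a, c) in R and (b, c) in R for c in universe)
        for a in universe for b in universe
    )

U = {1, 2, 3, 4}
# Reflexive, antisymmetric, and up-directed: 4 sits above every element.
R = {(x, x) for x in U} | {(1, 2), (1, 4), (2, 4), (3, 4)}
A = {2, 4}
print(is_up_directed(R, U))   # True
print(lower_approx(R, A, U))  # {2, 4}
print(upper_approx(R, A, U))  # {1, 2, 3, 4}
```

As expected of rough approximations, the lower approximation is contained in `A`, which is contained in the upper approximation.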
Related papers
- Algebraic Models for Qualified Aggregation in General Rough Sets, and Reasoning Bias Discovery [0.0]
The research is motivated by the desire to model skeptical or pessimistic, and optimistic or possibilistic aggregation in human reasoning.
The model is suitable for the study of discriminatory/toxic behavior in human reasoning, and of ML algorithms learning such behavior.
arXiv Detail & Related papers (2023-08-30T17:07:54Z)
- Comparison of Single- and Multi-Objective Optimization Quality for Evolutionary Equation Discovery [77.34726150561087]
Evolutionary differential equation discovery has proved to be a tool for obtaining equations with fewer a priori assumptions.
The proposed comparison approach is shown on classical model examples -- the Burgers equation, the wave equation, and the Korteweg-de Vries equation.
arXiv Detail & Related papers (2023-06-29T15:37:19Z)
- Enriching Disentanglement: From Logical Definitions to Quantitative Metrics [59.12308034729482]
Disentangling the explanatory factors in complex data is a promising approach for data-efficient representation learning.
We establish relationships between logical definitions and quantitative metrics to derive theoretically grounded disentanglement metrics.
We empirically demonstrate the effectiveness of the proposed metrics by isolating different aspects of disentangled representations.
arXiv Detail & Related papers (2023-05-19T08:22:23Z)
- Pre-trained Sentence Embeddings for Implicit Discourse Relation Classification [26.973476248983477]
Implicit discourse relations bind smaller linguistic units into coherent texts.
We explore the utility of pre-trained sentence embeddings as base representations in a neural network for implicit discourse relation sense classification.
arXiv Detail & Related papers (2022-10-20T04:17:03Z)
- Relational Proxies: Emergent Relationships as Fine-Grained Discriminators [52.17542855760418]
We propose a novel approach that leverages information shared between the global and local parts of an object for encoding its label.
We design Relational Proxies based on our theoretical findings and evaluate them on seven challenging fine-grained benchmark datasets.
We also experimentally validate our theory and obtain consistent results across multiple benchmarks.
arXiv Detail & Related papers (2022-10-05T11:08:04Z)
- Granular Directed Rough Sets, Concept Organization and Soft Clustering [0.0]
Up-directed rough sets are introduced and studied by the present author in earlier papers.
This is extended by her in two different granular directions, with a surprising algebraic semantics.
This research is expected to see significant theoretical and practical applications in related domains.
arXiv Detail & Related papers (2022-08-13T11:01:05Z)
- Granular Generalized Variable Precision Rough Sets and Rational Approximations [0.24366811507669117]
Granular approximations as per the procedures of VPRS are likely to be more rational than those constructed from a classical perspective under certain conditions.
Meta-applications to cluster validation, image segmentation and dynamic sorting are invented.
arXiv Detail & Related papers (2022-05-28T08:08:26Z)
- Compositional Generalization Requires Compositional Parsers [69.77216620997305]
We compare sequence-to-sequence models and models guided by compositional principles on the recent COGS corpus.
We show structural generalization is a key measure of compositional generalization and requires models that are aware of complex structure.
arXiv Detail & Related papers (2022-02-24T07:36:35Z)
- Learning Algebraic Recombination for Compositional Generalization [71.78771157219428]
We propose LeAR, an end-to-end neural model to learn algebraic recombination for compositional generalization.
The key insight is to model the semantic parsing task as a homomorphism between a latent syntactic algebra and a semantic algebra.
Experiments on two realistic and comprehensive compositional generalization benchmarks demonstrate the effectiveness of our model.
arXiv Detail & Related papers (2021-07-14T07:23:46Z)
- RatE: Relation-Adaptive Translating Embedding for Knowledge Graph Completion [51.64061146389754]
We propose a relation-adaptive translation function built upon a novel weighted product in complex space.
We then present our Relation-adaptive translating Embedding (RatE) approach to score each graph triple.
arXiv Detail & Related papers (2020-10-10T01:30:30Z)
- Lattice Representation Learning [6.427169570069738]
We introduce theory and algorithms for learning discrete representations that take values on a lattice embedded in a Euclidean space.
Lattice representations possess an interesting combination of properties: a) they can be computed explicitly using lattice quantization, yet they can be learned efficiently using the ideas we introduce.
This article will focus on laying the groundwork for exploring and exploiting the first two properties, including a new mathematical result linking expressions used during training and inference time and experimental validation on two popular datasets.
arXiv Detail & Related papers (2020-06-24T16:05:11Z)
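The lattice quantization mentioned in the last entry above can be illustrated in its simplest form: snapping a vector to the nearest point of a scaled integer lattice by componentwise rounding. This is only a toy sketch under that assumption; the paper's lattices and learning procedure are not reproduced here, and the function name is hypothetical.

```python
def quantize_to_lattice(x, step=1.0):
    """Nearest point of the scaled integer lattice step * Z^n (componentwise rounding)."""
    return [step * round(xi / step) for xi in x]

v = [0.2, 1.7, -0.6]
q = quantize_to_lattice(v, step=0.5)
print(q)  # [0.0, 1.5, -0.5]
```

For lattices other than scaled Z^n (e.g. with a non-diagonal generator matrix), nearest-point search is more involved, which is part of what makes the learnability result in the summary notable.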
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all content) and is not responsible for any consequences of its use.