Granular Directed Rough Sets, Concept Organization and Soft Clustering
- URL: http://arxiv.org/abs/2208.06623v1
- Date: Sat, 13 Aug 2022 11:01:05 GMT
- Title: Granular Directed Rough Sets, Concept Organization and Soft Clustering
- Authors: Mani A
- Abstract summary: Up-directed rough sets were introduced and studied by the present author in earlier papers.
This is extended by her in two different granular directions, with a surprising algebraic semantics.
This research is expected to see significant theoretical and practical applications in related domains.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Up-directed rough sets were introduced and studied by the present author in
earlier papers. This is extended by her in two different granular directions in
this research, with a surprising algebraic semantics. The granules are based on
ideas of generalized closure under up-directedness that may be read as a form
of weak consequence. This yields approximation operators that satisfy cautious
monotony, while pi-groupoidal approximations (that additionally involve
strategic choice and algebraic operators) have nicer properties. The study is
primarily motivated by possible structure of concepts in distributed cognition
perspectives, real or virtual classroom learning contexts, and student-centric
teaching. Rough clustering techniques for datasets that involve up-directed
relations (as in the study of Sentinel project image data) are additionally
proposed. This research is expected to see significant theoretical and
practical applications in related domains.
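As background for the approximation operators mentioned in the abstract, the following is a minimal sketch of the classical Pawlak-style lower and upper approximations that granular rough set constructions generalize. It is not the paper's granular directed or pi-groupoidal construction; the granulation here uses plain equivalence classes, and all names and the toy data are illustrative assumptions.

```python
# Classical rough approximations (a baseline sketch; the paper's granular
# directed operators generalize this and are not reproduced here).
# Granules are equivalence classes of an indiscernibility relation; a
# target set X is bracketed between a lower and an upper approximation.

def partition(universe, key):
    """Group the universe into granules (equivalence classes) by a key function."""
    granules = {}
    for x in universe:
        granules.setdefault(key(x), set()).add(x)
    return list(granules.values())

def approximations(granules, target):
    """Lower approx: union of granules fully inside target;
    upper approx: union of granules that meet target."""
    lower, upper = set(), set()
    for g in granules:
        if g <= target:
            lower |= g   # granule certainly contained in target
        if g & target:
            upper |= g   # granule possibly overlapping target
    return lower, upper

# Hypothetical toy data: objects distinguished only by parity.
universe = set(range(10))
granules = partition(universe, key=lambda n: n % 2)   # evens, odds
target = {0, 2, 4, 6, 8, 5}                           # set to approximate

lower, upper = approximations(granules, target)
# The inclusion lower <= target <= upper always holds; the boundary
# upper - lower measures how "rough" target is under this granulation.
```

In this toy run the even granule lies wholly inside the target while the odd granule only meets it (at 5), so the lower approximation is the evens and the upper approximation is the whole universe.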
Related papers
- Logifold: A Geometrical Foundation of Ensemble Machine Learning [0.0]
We present a local-to-global and measure-theoretical approach to understanding datasets.
The core idea is to formulate a logifold structure and to interpret network models with restricted domains as local charts of datasets.
arXiv Detail & Related papers (2024-07-23T04:47:58Z)
- A Theoretical Study of Inductive Biases in Contrastive Learning [32.98250585760665]
We provide the first theoretical analysis of self-supervised learning that incorporates the effect of inductive biases originating from the model class.
We show that when the model has limited capacity, contrastive representations would recover certain special clustering structures that are compatible with the model architecture.
arXiv Detail & Related papers (2022-11-27T01:53:29Z)
- Granular Generalized Variable Precision Rough Sets and Rational Approximations [0.24366811507669117]
Granular approximations as per the procedures of VPRS are likely to be more rational than those constructed from a classical perspective under certain conditions.
Meta-applications to cluster validation, image segmentation, and dynamic sorting are proposed.
arXiv Detail & Related papers (2022-05-28T08:08:26Z)
- Inducing Gaussian Process Networks [80.40892394020797]
We propose inducing Gaussian process networks (IGN), a simple framework for simultaneously learning the feature space as well as the inducing points.
The inducing points, in particular, are learned directly in the feature space, enabling a seamless representation of complex structured domains.
We report on experimental results for real-world data sets showing that IGNs provide significant advances over state-of-the-art methods.
arXiv Detail & Related papers (2022-04-21T05:27:09Z)
- Prototypical Representation Learning for Relation Extraction [56.501332067073065]
This paper aims to learn predictive, interpretable, and robust relation representations from distantly-labeled data.
We learn prototypes for each relation from contextual information to best explore the intrinsic semantics of relations.
Results on several relation learning tasks show that our model significantly outperforms the previous state-of-the-art relational models.
arXiv Detail & Related papers (2021-03-22T08:11:43Z)
- Deep Clustering by Semantic Contrastive Learning [67.28140787010447]
We introduce a novel variant called Semantic Contrastive Learning (SCL).
It explores the characteristics of both conventional contrastive learning and deep clustering.
It can amplify the strengths of contrastive learning and deep clustering in a unified approach.
arXiv Detail & Related papers (2021-03-03T20:20:48Z)
- Generalization Properties of Optimal Transport GANs with Latent Distribution Learning [52.25145141639159]
We study how the interplay between the latent distribution and the complexity of the pushforward map affects performance.
Motivated by our analysis, we advocate learning the latent distribution as well as the pushforward map within the GAN paradigm.
arXiv Detail & Related papers (2020-07-29T07:31:33Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute this performance deterioration to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
- Relation-Guided Representation Learning [53.60351496449232]
We propose a new representation learning method that explicitly models and leverages sample relations.
Our framework well preserves the relations between samples.
By seeking to embed samples into subspace, we show that our method can address the large-scale and out-of-sample problem.
arXiv Detail & Related papers (2020-07-11T10:57:45Z)
- Algebraic Approach to Directed Rough Sets [0.0]
In the relational approach to general rough sets, ideas of directed relations are supplemented with additional conditions.
The relations are also specialized to representations of general parthood that are upper-directed, reflexive and antisymmetric.
arXiv Detail & Related papers (2020-04-25T15:39:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.