OntoMath${}^{\mathbf{PRO}}$ 2.0 Ontology: Updates of the Formal Model
- URL: http://arxiv.org/abs/2303.13542v1
- Date: Fri, 17 Mar 2023 20:29:17 GMT
- Title: OntoMath${}^{\mathbf{PRO}}$ 2.0 Ontology: Updates of the Formal Model
- Authors: Alexander Kirillovich, Olga Nevzorova, Evgeny Lipachev
- Abstract summary: The main attention is paid to the development of a formal model for the representation of mathematical statements in the Open Linked Data cloud.
The proposed model is intended for applications that extract mathematical facts from natural language mathematical texts and represent these facts as Linked Open Data.
The model is used in the development of a new version of the OntoMath${}^{\mathrm{PRO}}$ ontology of professional mathematics.
- Score: 68.8204255655161
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper is devoted to the problems of ontology-based mathematical
knowledge management and representation. The main attention is paid to the
development of a formal model for the representation of mathematical statements
in the Open Linked Data cloud. The proposed model is intended for applications
that extract mathematical facts from natural language mathematical texts and
represent these facts as Linked Open Data. The model is used in the development
of a new version of the OntoMath${}^{\mathrm{PRO}}$ ontology of professional
mathematics. OntoMath${}^{\mathrm{PRO}}$ underlies a semantic publishing
platform that takes as input a collection of mathematical papers in LaTeX
format and builds their ontology-based Linked Open Data representation. The
semantic publishing platform, in turn, is a central component of the OntoMath
digital ecosystem, an ecosystem of ontologies, text
analytics tools, and applications for mathematical knowledge management,
including semantic search for mathematical formulas and a recommender system
for mathematical papers. According to the new model, the ontology is organized
into three layers: a foundational ontology layer, a domain ontology layer and a
linguistic layer. The domain ontology layer contains language-independent math
concepts. The linguistic layer provides linguistic grounding for these
concepts, and the foundational ontology layer provides them with meta-ontological
annotations. The concepts are organized into two main hierarchies: the hierarchy
of objects and the hierarchy of reified relationships.
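To make the layered organization concrete, the following is a minimal Python (rdflib) sketch of how a single concept could be published as Linked Open Data under this model: a domain-layer class definition, a foundational-layer annotation, linguistic grounding of the concept, and one reified relationship. The namespaces, class names, and property names are illustrative assumptions, not the actual OntoMath${}^{\mathrm{PRO}}$ 2.0 vocabulary; OntoLex-Lemon is used here only as a common choice for the linguistic layer.
```python
# Hedged sketch: hypothetical namespaces and terms, not the authors' vocabulary.
from rdflib import Graph, Namespace, URIRef, Literal
from rdflib.namespace import RDF, RDFS

OMP = Namespace("http://example.org/ontomathpro#")        # hypothetical domain-layer namespace
FOUND = Namespace("http://example.org/foundational#")     # hypothetical foundational-layer namespace
LEX = Namespace("http://www.w3.org/ns/lemon/ontolex#")    # OntoLex-Lemon, a common linguistic-grounding vocabulary

g = Graph()
g.bind("omp", OMP)
g.bind("found", FOUND)
g.bind("ontolex", LEX)

# Domain ontology layer: a language-independent mathematical concept.
concept = OMP.ContinuousFunction
g.add((concept, RDF.type, RDFS.Class))
g.add((concept, RDFS.subClassOf, OMP.Function))

# Foundational ontology layer: a meta-ontological annotation of the concept.
g.add((concept, RDF.type, FOUND.Object))

# Linguistic layer: lexical entries in several languages grounding the concept.
for lang, label in [("en", "continuous function"), ("ru", "непрерывная функция")]:
    entry = URIRef(f"http://example.org/lexicon/{lang}/continuous_function")
    g.add((entry, RDF.type, LEX.LexicalEntry))
    g.add((entry, RDFS.label, Literal(label, lang=lang)))
    g.add((entry, LEX.denotes, concept))

# Hierarchy of reified relationships: the statement "f is continuous on [0, 1]"
# represented as an individual with role fillers rather than a plain triple.
stmt = URIRef("http://example.org/facts/stmt1")
g.add((stmt, RDF.type, OMP.ContinuityStatement))
g.add((stmt, OMP.hasFunction, URIRef("http://example.org/facts/f")))
g.add((stmt, OMP.hasDomain, URIRef("http://example.org/facts/unit_interval")))

print(g.serialize(format="turtle"))
```
Reifying the statement as an individual, as sketched above, is what allows an extracted mathematical fact to carry further role fillers (provenance, conditions, source paper) while still living in the Linked Open Data cloud.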
Related papers
- Towards a Knowledge Graph for Models and Algorithms in Applied Mathematics [0.0]
We aim to represent models and algorithms as well as their relationship semantically to make this research data FAIR.
A link between the two is established through the algorithmic tasks that occur in the corresponding modeling tasks.
Subject-specific metadata is relevant here, such as the symmetry of a matrix or the linearity of a mathematical model.
arXiv Detail & Related papers (2024-08-19T13:57:49Z)
- Ontology Embedding: A Survey of Methods, Applications and Resources [54.3453925775069]
Ontologies are widely used for representing domain knowledge and meta data.
One straightforward solution is to integrate statistical analysis and machine learning.
Numerous papers have been published on embedding, but a lack of systematic reviews hinders researchers from gaining a comprehensive understanding of this field.
arXiv Detail & Related papers (2024-06-16T14:49:19Z)
- MathBench: Evaluating the Theory and Application Proficiency of LLMs with a Hierarchical Mathematics Benchmark [82.64129627675123]
MathBench is a new benchmark that rigorously assesses the mathematical capabilities of large language models.
MathBench spans a wide range of mathematical disciplines, offering a detailed evaluation of both theoretical understanding and practical problem-solving skills.
arXiv Detail & Related papers (2024-05-20T17:52:29Z)
- Fundamental Components of Deep Learning: A category-theoretic approach [0.0]
This thesis develops a novel mathematical foundation for deep learning based on the language of category theory.
We also systematise existing approaches, placing many constructions and concepts under the same umbrella.
arXiv Detail & Related papers (2024-03-13T01:29:40Z)
- MathGloss: Building mathematical glossaries from text [0.620048328543366]
MathGloss is a database of undergraduate concepts in mathematics.
It uses modern natural language processing (NLP) tools and resources already available on the web.
arXiv Detail & Related papers (2023-11-21T14:49:00Z)
- Parmesan: mathematical concept extraction for education [0.5520082338220947]
We develop a prototype system for searching for and defining mathematical concepts in context, focusing on the field of category theory.
This system depends on natural language processing components including concept extraction, relation extraction, definition extraction, and entity linking.
We also provide two cleaned mathematical corpora that power the prototype system, which are based on journal articles and wiki pages.
arXiv Detail & Related papers (2023-07-13T11:55:03Z)
- Lattice-preserving $\mathcal{ALC}$ ontology embeddings with saturation [50.05281461410368]
An order-preserving embedding method is proposed to generate embeddings of OWL representations.
We show that our method outperforms state-of-the-art embedding methods in several knowledge base completion tasks.
arXiv Detail & Related papers (2023-05-11T22:27:51Z)
- Tree-Based Representation and Generation of Natural and Mathematical Language [77.34726150561087]
Mathematical language in scientific communications and educational scenarios is important yet relatively understudied.
Recent works on mathematical language focus either on representing stand-alone mathematical expressions, or mathematical reasoning in pre-trained natural language models.
We propose a series of modifications to existing language models to jointly represent and generate text and math.
arXiv Detail & Related papers (2023-02-15T22:38:34Z)
- A Survey of Deep Learning for Mathematical Reasoning [71.88150173381153]
We review the key tasks, datasets, and methods at the intersection of mathematical reasoning and deep learning over the past decade.
Recent advances in large-scale neural language models have opened up new benchmarks and opportunities to use deep learning for mathematical reasoning.
arXiv Detail & Related papers (2022-12-20T18:46:16Z)