Comparative Analysis of Widely Used Object-Oriented Languages
- URL: http://arxiv.org/abs/2306.01819v1
- Date: Fri, 2 Jun 2023 12:28:13 GMT
- Title: Comparative Analysis of Widely Used Object-Oriented Languages
- Authors: Muhammad Shoaib Farooq, Taymour Zaman Khan
- Abstract summary: Learning the object-oriented paradigm is compulsory in every computer science major.
It is difficult to choose which programming language should be taught first in order to convey object-oriented principles.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Programming is an integral part of the computer science discipline. The
programming environment is not only growing rapidly but also changing every day,
and languages are constantly evolving. Learning the object-oriented paradigm is
compulsory in every computer science major, so the choice of language for teaching
object-oriented principles is very important. Due to the large pool of
object-oriented languages, it is difficult to choose which should be the first
programming language for teaching object-oriented principles. Many studies have
suggested which language should be taught first to introduce object-oriented
concepts, but there is no method to compare and evaluate these languages. In this
article we propose a comprehensive framework to evaluate widely used
object-oriented languages. The languages are evaluated on the basis of their
technical and environmental features.
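The abstract does not spell out how the framework scores languages; purely as an illustration, a feature-weighted evaluation of this kind could be sketched as below. The features, weights, and per-language scores are hypothetical placeholders, not the authors' rubric.

```python
# Minimal sketch of a feature-weighted language evaluation.
# Features, weights, and scores are illustrative placeholders,
# NOT the rubric proposed in the paper.

# Hypothetical per-language scores (0-10) on technical and
# environmental features.
scores = {
    "Java":   {"encapsulation": 9, "inheritance": 9, "tooling": 9, "community": 9},
    "C++":    {"encapsulation": 8, "inheritance": 9, "tooling": 7, "community": 8},
    "Python": {"encapsulation": 6, "inheritance": 8, "tooling": 8, "community": 9},
}

# Hypothetical importance weights for a first teaching language;
# they sum to 1 so totals stay on the 0-10 scale.
weights = {"encapsulation": 0.3, "inheritance": 0.3, "tooling": 0.2, "community": 0.2}

def evaluate(scores, weights):
    """Rank languages by the weighted sum of their feature scores."""
    totals = {
        lang: sum(weights[feat] * val for feat, val in feats.items())
        for lang, feats in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for lang, total in evaluate(scores, weights):
    print(f"{lang}: {total:.2f}")
```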
Related papers
- Which Programming Language and What Features at Pre-training Stage Affect Downstream Logical Inference Performance? [26.91104188917787]
Large language models (LLMs) have demonstrated remarkable generalization abilities in mathematics and logical reasoning tasks.
Our research aims to verify which programming languages and features during pre-training affect logical inference performance.
arXiv Detail & Related papers (2024-10-09T10:13:13Z) - The Role of Language Imbalance in Cross-lingual Generalisation: Insights from Cloned Language Experiments [57.273662221547056]
In this study, we investigate a novel and unintuitive driver of cross-lingual generalisation: language imbalance.
We observe that the existence of a predominant language during training boosts the performance of less frequent languages.
As we extend our analysis to real languages, we find that infrequent languages still benefit from frequent ones, yet it remains inconclusive whether language imbalance causes cross-lingual generalisation in that setting.
arXiv Detail & Related papers (2024-04-11T17:58:05Z) - Tapping into the Natural Language System with Artificial Languages when
Learning Programming [7.5520627446611925]
The goal of this study is to investigate the feasibility of enhancing the learning of programming by activating natural language learning mechanisms.
We observed that the training of the artificial language can be easily integrated into our curriculum.
However, within the context of our study, we did not find a significant benefit for programming competency when students learned an artificial language first.
arXiv Detail & Related papers (2024-01-12T07:08:55Z) - Towards Bridging the Digital Language Divide [4.234367850767171]
Multilingual language processing systems often exhibit a hardwired, yet usually involuntary and hidden, representational preference towards certain languages.
We show that biased technology is often the result of research and development methodologies that do not do justice to the complexity of the languages being represented.
We present a new initiative that aims at reducing linguistic bias through both technological design and methodology.
arXiv Detail & Related papers (2023-07-25T10:53:20Z) - Language Cognition and Language Computation -- Human and Machine
Language Understanding [51.56546543716759]
Language understanding is a key scientific issue in the fields of cognitive and computer science.
Can a combination of the disciplines offer new insights for building intelligent language models?
arXiv Detail & Related papers (2023-01-12T02:37:00Z) - Benchmarking Language Models for Code Syntax Understanding [79.11525961219591]
Pre-trained language models have demonstrated impressive performance in both natural language processing and program understanding.
In this work, we perform the first thorough benchmarking of the state-of-the-art pre-trained models for identifying the syntactic structures of programs.
Our findings point out key limitations of existing pre-training methods for programming languages, and suggest the importance of modeling code syntactic structures.
arXiv Detail & Related papers (2022-10-26T04:47:18Z) - Identifying concept libraries from language about object structure [56.83719358616503]
We leverage natural language descriptions for a diverse set of 2K procedurally generated objects to identify the parts people use.
We formalize our problem as search over a space of program libraries that contain different part concepts.
By combining naturalistic language at scale with structured program representations, we discover a fundamental information-theoretic tradeoff governing the part concepts people name.
arXiv Detail & Related papers (2022-05-11T17:49:25Z) - Pre-Trained Language Models for Interactive Decision-Making [72.77825666035203]
We describe a framework for imitation learning in which goals and observations are represented as a sequence of embeddings.
We demonstrate that this framework enables effective generalization across different environments.
For test tasks involving novel goals or novel scenes, initializing policies with language models improves task completion rates by 43.6%.
arXiv Detail & Related papers (2022-02-03T18:55:52Z) - Towards Zero-shot Language Modeling [90.80124496312274]
We construct a neural model that is inductively biased towards learning human languages.
We infer this distribution from a sample of typologically diverse training languages.
We harness additional language-specific side information as distant supervision for held-out languages.
arXiv Detail & Related papers (2021-08-06T23:49:18Z) - ReferentialGym: A Nomenclature and Framework for Language Emergence &
Grounding in (Visual) Referential Games [0.30458514384586394]
Natural languages are powerful tools wielded by human beings to communicate information and co-operate towards common goals.
Computational linguists have been researching the emergence of artificial languages induced by language games.
The AI community has started to investigate language emergence and grounding, working towards better human-machine interfaces.
arXiv Detail & Related papers (2020-12-17T10:22:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.