AnyDB: An Architecture-less DBMS for Any Workload
- URL: http://arxiv.org/abs/2009.02258v1
- Date: Fri, 4 Sep 2020 15:38:27 GMT
- Title: AnyDB: An Architecture-less DBMS for Any Workload
- Authors: Tiemo Bang (Technical University Darmstadt and SAP SE), Norman May
(SAP SE), Ilia Petrov (Reutlingen University), Carsten Binnig (Technical
University Darmstadt)
- Abstract summary: Instead of hard-baking an architectural model, such as a shared-nothing architecture, into the distributed DBMS design, we aim for a new class of so-called architecture-less DBMSs.
Our initial results show that our architecture-less AnyDB can provide significant speed-ups across varying workloads.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In this paper, we propose a radical new approach for scale-out distributed
DBMSs. Instead of hard-baking an architectural model, such as a shared-nothing
architecture, into the distributed DBMS design, we aim for a new class of
so-called architecture-less DBMSs. The main idea is that an architecture-less
DBMS can mimic any architecture on a per-query basis on-the-fly without any
additional overhead for reconfiguration. Our initial results show that our
architecture-less DBMS AnyDB can provide significant speed-ups across varying
workloads compared to a traditional DBMS implementing a static architecture.
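To make the per-query mimicry idea more concrete, here is a minimal sketch of what an on-the-fly architecture dispatch could look like. It is not taken from the AnyDB paper: the architecture names, the `Query` representation, and the `choose_architecture` heuristic are hypothetical placeholders that only illustrate picking an execution model per query instead of fixing one for the whole system.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Architecture(Enum):
    """Execution models an architecture-less engine could mimic per query (illustrative names)."""
    SHARED_NOTHING = auto()     # partitioned execution, purely local data access
    SHARED_EVERYTHING = auto()  # any worker may access any partition
    STORAGE_PUSHDOWN = auto()   # push filters/aggregates close to storage


@dataclass
class Query:
    sql: str
    touches_partitions: int  # how many partitions the query reads
    is_analytical: bool      # scan-heavy vs. point access


def choose_architecture(q: Query, total_partitions: int) -> Architecture:
    """Hypothetical per-query heuristic: pick the execution model on the fly,
    without reconfiguring the cluster between queries."""
    if q.is_analytical and q.touches_partitions == total_partitions:
        return Architecture.STORAGE_PUSHDOWN   # full scans benefit from pushdown
    if q.touches_partitions == 1:
        return Architecture.SHARED_NOTHING     # single-partition OLTP stays local
    return Architecture.SHARED_EVERYTHING      # multi-partition work shares state


if __name__ == "__main__":
    workload = [
        Query("SELECT * FROM orders WHERE o_id = 42", touches_partitions=1, is_analytical=False),
        Query("SELECT SUM(revenue) FROM sales", touches_partitions=8, is_analytical=True),
    ]
    for q in workload:
        print(q.sql, "->", choose_architecture(q, total_partitions=8).name)
```

The only point of the sketch is where the decision happens: per query, at planning time, so a mixed OLTP/OLAP workload is not forced through a single static deployment and no reconfiguration is needed between queries.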
Related papers
- From Requirements to Architecture: Semi-Automatically Generating Software Architectures [0.0]
This method involves the architect's close collaboration with LLM-fueled tooling throughout the whole process.
The architect is guided through Domain Model creation, Use Case specification, architectural decisions, and architecture evaluation.
Preliminary results suggest the feasibility of this process and indicate major time savings for the architect.
arXiv Detail & Related papers (2025-04-16T15:46:56Z)
- TwinArch: A Digital Twin Reference Architecture [83.71071897785683]
Digital Twins (DTs) are dynamic virtual representations of physical systems, enabled by seamless, bidirectional communication between the physical and digital realms.
The proposed Digital Twin Reference Architecture is named TwinArch. It is documented using the Views and Beyond methodology by the Software Engineering Institute.
arXiv Detail & Related papers (2025-04-10T07:53:11Z)
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have recently been applied successfully to eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z)
- Software Architecture Recovery with Information Fusion [14.537490019685384]
We propose SARIF, a fully automated architecture recovery technique.
It incorporates three types of comprehensive information: dependencies, code text, and folder structure.
SARIF is 36.1% more accurate than the best of the previous techniques on average.
arXiv Detail & Related papers (2023-11-08T12:35:37Z)
- Serving Deep Learning Model in Relational Databases [70.53282490832189]
Serving deep learning (DL) models on relational data has become a critical requirement across diverse commercial and scientific domains.
We highlight three pivotal paradigms: The state-of-the-art DL-centric architecture offloads DL computations to dedicated DL frameworks.
The potential UDF-centric architecture encapsulates one or more tensor computations into User Defined Functions (UDFs) within the relational database management system (RDBMS).
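As a rough illustration of the UDF-centric paradigm described above, the sketch below registers a toy scoring function as a scalar UDF in SQLite via Python's built-in sqlite3 module. The table, the columns, and the stand-in "model" are invented for the example; the paper targets full RDBMSs and real tensor computations rather than this simplified setup.

```python
import math
import sqlite3

# A toy "model": logistic scoring with fixed weights, standing in for a tensor
# computation that would otherwise run in a dedicated DL framework.
WEIGHTS = (0.8, -1.2)
BIAS = 0.1


def score(amount: float, age_days: float) -> float:
    """Hypothetical UDF body: run the model inside the database process."""
    z = WEIGHTS[0] * amount + WEIGHTS[1] * age_days + BIAS
    return 1.0 / (1.0 + math.exp(-z))


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, age_days REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 3.5, 2.0), (2, 0.4, 30.0)])

# Register the Python function as a scalar UDF so SQL can call it directly.
conn.create_function("score", 2, score)

for row in conn.execute("SELECT id, score(amount, age_days) FROM orders"):
    print(row)
```

The design point it mirrors is that the computation runs inside the database process, next to the data, instead of being shipped to an external DL framework.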
arXiv Detail & Related papers (2023-10-07T06:01:35Z)
- Visual Analysis of Neural Architecture Spaces for Summarizing Design Principles [22.66053583920441]
ArchExplorer is a visual analysis method for understanding a neural architecture space and summarizing design principles.
A circle-packing-based architecture visualization has been developed to convey both the global relationships between clusters and local neighborhoods of the architectures in each cluster.
Two case studies and a post-analysis are presented to demonstrate the effectiveness of ArchExplorer in summarizing design principles and selecting better-performing architectures.
arXiv Detail & Related papers (2022-08-20T12:15:59Z)
- Federated Learning with Heterogeneous Architectures using Graph HyperNetworks [154.60662664160333]
We propose a new FL framework that accommodates heterogeneous client architectures by adopting a graph hypernetwork for parameter sharing.
Unlike existing solutions, our framework does not limit clients to a single architecture type, makes no use of external data, and does not require clients to disclose their model architecture.
arXiv Detail & Related papers (2022-01-20T21:36:25Z)
- Rethinking Architecture Selection in Differentiable NAS [74.61723678821049]
Differentiable Neural Architecture Search (DARTS) is one of the most popular NAS methods thanks to its search efficiency and simplicity.
We propose an alternative perturbation-based architecture selection that directly measures each operation's influence on the supernet.
We find that several failure modes of DARTS can be greatly alleviated with the proposed selection method.
arXiv Detail & Related papers (2021-08-10T00:53:39Z)
- Revisiting Deep Learning Models for Tabular Data [40.67427600770095]
It is unclear to both researchers and practitioners which models perform best.
The first one is a ResNet-like architecture which turns out to be a strong baseline that is often missing in prior works.
The second model is our simple adaptation of the Transformer architecture for tabular data, which outperforms other solutions on most tasks.
arXiv Detail & Related papers (2021-06-22T17:58:10Z)
- MetaPerturb: Transferable Regularizer for Heterogeneous Tasks and Architectures [61.73533544385352]
We propose a transferable perturbation, MetaPerturb, which is meta-learned to improve generalization performance on unseen data.
As MetaPerturb is a set-function trained over diverse distributions across layers and tasks, it can generalize to heterogeneous tasks and architectures.
arXiv Detail & Related papers (2020-06-13T02:54:59Z)
- Stage-Wise Neural Architecture Search [65.03109178056937]
Modern convolutional networks such as ResNet and NASNet have achieved state-of-the-art results in many computer vision applications.
These networks consist of stages, which are sets of layers that operate on representations in the same resolution.
It has been demonstrated that increasing the number of layers in each stage improves the prediction ability of the network.
However, the resulting architecture becomes computationally expensive in terms of floating point operations, memory requirements and inference time.
arXiv Detail & Related papers (2020-04-23T14:16:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.