Serving Deep Learning Model in Relational Databases
- URL: http://arxiv.org/abs/2310.04696v2
- Date: Tue, 10 Oct 2023 03:51:58 GMT
- Title: Serving Deep Learning Model in Relational Databases
- Authors: Alexandre Eichenberger, Qi Lin, Saif Masood, Hong Min, Alexander Sim,
Jie Wang, Yida Wang, Kesheng Wu, Binhang Yuan, Lixi Zhou, Jia Zou
- Abstract summary: Serving deep learning (DL) models on relational data has become a critical requirement across diverse commercial and scientific domains.
We highlight three pivotal paradigms: The state-of-the-art DL-Centric architecture offloads DL computations to dedicated DL frameworks.
The potential UDF-Centric architecture encapsulates one or more tensor computations into User Defined Functions (UDFs) within the database system.
The potential Relation-Centric architecture aims to represent a large-scale tensor computation through relational operators.
- Score: 72.72372281808694
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Serving deep learning (DL) models on relational data has become a critical
requirement across diverse commercial and scientific domains, sparking growing
interest recently. In this visionary paper, we embark on a comprehensive
exploration of representative architectures to address the requirement. We
highlight three pivotal paradigms: The state-of-the-art DL-Centric architecture
offloads DL computations to dedicated DL frameworks. The potential UDF-Centric
architecture encapsulates one or more tensor computations into User Defined
Functions (UDFs) within the database system. The potential Relation-Centric
architecture aims to represent a large-scale tensor computation through
relational operators. While each of these architectures
demonstrates promise in specific use scenarios, we identify urgent requirements
for seamless integration of these architectures and the middle ground between
these architectures. We delve into the gaps that impede the integration and
explore innovative strategies to close them. We present a pathway to establish
a novel database system for enabling a broad class of data-intensive DL
inference applications.
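The UDF-Centric and Relation-Centric paradigms described above can be illustrated with a minimal sketch in SQLite. This is not the paper's system; it is a hypothetical toy showing the two ideas: a matrix multiply expressed purely with relational join and group-by operators (Relation-Centric), and a tensor operation registered as a scalar UDF callable from SQL (UDF-Centric). All table names, column names, and values are illustrative assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Relation-Centric sketch: C = A @ B, with matrices stored in
# coordinate format (row, col, val) and the multiply expressed as a
# relational join on the shared dimension plus a group-by aggregate.
cur.executescript("""
CREATE TABLE A (i INTEGER, k INTEGER, val REAL);
CREATE TABLE B (k INTEGER, j INTEGER, val REAL);
INSERT INTO A VALUES (0,0,1.0),(0,1,2.0),(1,0,3.0),(1,1,4.0);
INSERT INTO B VALUES (0,0,5.0),(0,1,6.0),(1,0,7.0),(1,1,8.0);
""")
c_rows = cur.execute("""
SELECT A.i, B.j, SUM(A.val * B.val) AS val
FROM A JOIN B ON A.k = B.k
GROUP BY A.i, B.j
ORDER BY A.i, B.j
""").fetchall()
# c_rows holds [[19, 22], [43, 50]] in coordinate form.

# UDF-Centric sketch: a tensor computation (here just a ReLU) wrapped
# as a scalar UDF and invoked element-wise inside a SQL query.
conn.create_function("relu", 1, lambda x: max(x, 0.0))
udf_rows = cur.execute(
    "SELECT i, k, relu(val - 2.0) FROM A ORDER BY i, k"
).fetchall()
```

A real Relation-Centric system would compile whole model graphs into query plans so the database optimizer can exploit data locality, while a UDF-Centric system would embed an entire inference routine (not a single scalar op) behind the UDF boundary; this sketch only shows the structural difference between the two.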
Related papers
- ADMUS: A Progressive Question Answering Framework Adaptable to Multiple
Knowledge Sources [9.484792817869671]
We present ADMUS, a progressive knowledge base question answering framework designed to accommodate a wide variety of datasets.
Our framework supports the seamless integration of new datasets with minimal effort, only requiring creating a dataset-related micro-service at a negligible cost.
arXiv Detail & Related papers (2023-08-09T08:46:39Z) - Systems for Parallel and Distributed Large-Model Deep Learning Training [7.106986689736828]
Some recent Transformer models span hundreds of billions of learnable parameters.
These designs have introduced new scale-driven systems challenges for the DL space.
This survey will explore the large-model training systems landscape, highlighting key challenges and the various techniques that have been used to address them.
arXiv Detail & Related papers (2023-01-06T19:17:29Z) - Federated Learning with Heterogeneous Architectures using Graph
HyperNetworks [154.60662664160333]
We propose a new FL framework that accommodates heterogeneous client architecture by adopting a graph hypernetwork for parameter sharing.
Unlike existing solutions, our framework does not limit the clients to share the same architecture type, makes no use of external data and does not require clients to disclose their model architecture.
arXiv Detail & Related papers (2022-01-20T21:36:25Z) - Differentiable Architecture Pruning for Transfer Learning [6.935731409563879]
We propose a gradient-based approach for extracting sub-architectures from a given large model.
Our architecture-pruning scheme produces transferable new structures that can be successfully retrained to solve different tasks.
We provide theoretical convergence guarantees and validate the proposed transfer-learning strategy on real data.
arXiv Detail & Related papers (2021-07-07T17:44:59Z) - Edge-assisted Democratized Learning Towards Federated Analytics [67.44078999945722]
We show the hierarchical learning structure of the proposed edge-assisted democratized learning mechanism, namely Edge-DemLearn.
We also validate Edge-DemLearn as a flexible model training mechanism to build a distributed control and aggregation methodology in regions.
arXiv Detail & Related papers (2020-12-01T11:46:03Z) - Revealing the Invisible with Model and Data Shrinking for
Composite-database Micro-expression Recognition [49.463864096615254]
We analyze the influence of learning complexity, including the input complexity and model complexity.
We propose a recurrent convolutional network (RCN) to explore the shallower-architecture and lower-resolution input data.
We develop three parameter-free modules to integrate with RCN without increasing any learnable parameters.
arXiv Detail & Related papers (2020-06-17T06:19:24Z) - Stage-Wise Neural Architecture Search [65.03109178056937]
Modern convolutional networks such as ResNet and NASNet have achieved state-of-the-art results in many computer vision applications.
These networks consist of stages, which are sets of layers that operate on representations in the same resolution.
It has been demonstrated that increasing the number of layers in each stage improves the prediction ability of the network.
However, the resulting architecture becomes computationally expensive in terms of floating point operations, memory requirements and inference time.
arXiv Detail & Related papers (2020-04-23T14:16:39Z) - A Privacy-Preserving Distributed Architecture for
Deep-Learning-as-a-Service [68.84245063902908]
This paper introduces a novel distributed architecture for deep-learning-as-a-service.
It is able to preserve the user sensitive data while providing Cloud-based machine and deep learning services.
arXiv Detail & Related papers (2020-03-30T15:12:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the information presented and is not responsible for any consequences of its use.