Device-Cloud Collaborative Recommendation via Meta Controller
- URL: http://arxiv.org/abs/2207.03066v1
- Date: Thu, 7 Jul 2022 03:23:04 GMT
- Title: Device-Cloud Collaborative Recommendation via Meta Controller
- Authors: Jiangchao Yao, Feng Wang, Xichen Ding, Shaohu Chen, Bo Han, Jingren
Zhou, Hongxia Yang
- Abstract summary: We propose a meta controller to dynamically manage the collaboration between the on-device recommender and the cloud-based recommender.
On the basis of the counterfactual samples and the extended training, extensive experiments in the industrial recommendation scenarios show the promise of meta controller.
- Score: 65.97416287295152
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: On-device machine learning enables the lightweight deployment of
recommendation models in local clients, which reduces the burden of the
cloud-based recommenders and simultaneously incorporates more real-time user
features. Nevertheless, the cloud-based recommendation in the industry is still
very important considering its powerful model capacity and the efficient
candidate generation from the billion-scale item pool. Previous attempts to
integrate the merits of both paradigms mainly resort to a sequential mechanism,
which builds the on-device recommender on top of the cloud-based
recommendation. However, such a design is inflexible when user interests change
dramatically: the on-device model is constrained by its limited item cache, while the
cloud-based recommendation over the large item pool does not respond until new
feedback is refreshed.
To overcome this issue, we propose a meta controller to dynamically manage
the collaboration between the on-device recommender and the cloud-based
recommender, and introduce a novel efficient sample construction from the
causal perspective to solve the dataset absence issue of meta controller. On
the basis of the counterfactual samples and the extended training, extensive
experiments in the industrial recommendation scenarios show the promise of meta
controller in the device-cloud collaboration.
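To make the routing idea concrete, here is a minimal, hypothetical sketch of a controller that decides, per request, whether to serve from the on-device recommender's item cache or to call the cloud recommender when recent interests drift away from the cache. The class names, the drift heuristic, and the threshold below are illustrative assumptions; the paper's meta controller is a learned model trained on counterfactual samples, not a hand-set rule.

```python
# Minimal, hypothetical sketch of device/cloud routing for one request.
# MetaController, interest_drift, and the threshold are illustrative only;
# the paper learns this decision from counterfactual samples.
from dataclasses import dataclass

import numpy as np


@dataclass
class Session:
    recent_clicks: np.ndarray  # embeddings of recently clicked items, shape (k, d)
    cached_items: np.ndarray   # embeddings of items in the on-device cache, shape (m, d)


def interest_drift(session: Session) -> float:
    """Heuristic drift signal: how poorly the local cache covers recent interests."""
    clicks = session.recent_clicks / np.linalg.norm(session.recent_clicks, axis=1, keepdims=True)
    cache = session.cached_items / np.linalg.norm(session.cached_items, axis=1, keepdims=True)
    best_match = (clicks @ cache.T).max(axis=1)  # best cached match per recent click
    return float(1.0 - best_match.mean())


class MetaController:
    """Routes each request to the on-device or the cloud recommender."""

    def __init__(self, drift_threshold: float = 0.5):
        self.drift_threshold = drift_threshold

    def route(self, session: Session) -> str:
        # When interests drift beyond what the cache can serve, fall back to the
        # cloud recommender over the full item pool; otherwise stay on device.
        return "cloud" if interest_drift(session) > self.drift_threshold else "device"


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    session = Session(recent_clicks=rng.normal(size=(5, 16)),
                      cached_items=rng.normal(size=(50, 16)))
    print(MetaController().route(session))
```

The hard part the paper addresses is training such a controller without a logged dataset of routing decisions, which is where the counterfactual sample construction comes in.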
Related papers
- Cloud-Device Collaborative Learning for Multimodal Large Language Models [24.65882336700547]
We introduce a Cloud-Device Collaborative Continual Adaptation framework to enhance the performance of compressed, device-deployed MLLMs.
Our framework is structured into three key components: a device-to-cloud uplink for efficient data transmission, cloud-based knowledge adaptation, and an optimized cloud-to-device downlink for model deployment.
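A rough, illustrative sketch of such a three-stage loop is shown below; the function names, the uncertainty-based sample selection, and the weight-rounding "compression" are assumptions for exposition, not the framework's actual uplink, adaptation, and downlink components.

```python
# Illustrative sketch of a device-cloud loop: uplink, cloud-side adaptation,
# downlink. All names and the compression scheme are assumptions, not the
# paper's implementation.
from typing import Dict, List


def uplink(samples: List[dict], budget: int) -> List[dict]:
    """Device -> cloud: keep only the most uncertain samples to save bandwidth."""
    return sorted(samples, key=lambda s: s["uncertainty"], reverse=True)[:budget]


def cloud_adapt(weights: Dict[str, float], samples: List[dict], lr: float = 0.1) -> Dict[str, float]:
    """Cloud-side knowledge adaptation: a stand-in for fine-tuning on uploaded data."""
    drift = sum(s["uncertainty"] for s in samples) / max(len(samples), 1)
    return {name: w - lr * drift for name, w in weights.items()}


def downlink(weights: Dict[str, float]) -> Dict[str, float]:
    """Cloud -> device: ship (possibly compressed/quantized) updated weights."""
    return {name: round(w, 3) for name, w in weights.items()}


if __name__ == "__main__":
    device_weights = {"w0": 1.0, "w1": -0.5}
    buffer = [{"uncertainty": u} for u in (0.9, 0.2, 0.7)]
    device_weights = downlink(cloud_adapt(device_weights, uplink(buffer, budget=2)))
    print(device_weights)
```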
arXiv Detail & Related papers (2023-12-26T18:46:14Z)
- Benchmarking Function Hook Latency in Cloud-Native Environments [0.5188841610098435]
Cloud-native applications are often instrumented or altered at runtime by dynamically patching or hooking them, which introduces a significant performance overhead.
We present recommendations to mitigate these risks and demonstrate how an improper experimental setup can negatively impact latency measurements.
arXiv Detail & Related papers (2023-10-19T12:54:32Z)
- Federated Privacy-preserving Collaborative Filtering for On-Device Next App Prediction [52.16923290335873]
We propose a novel SeqMF model to solve the problem of predicting the next app launch during mobile device usage.
We modify the structure of the classical matrix factorization model and update the training procedure to sequential learning.
Another ingredient of the proposed approach is a new privacy mechanism that protects the data sent from users to the remote server.
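A toy sketch of this kind of pipeline is given below, assuming a simple implicit-feedback matrix-factorization update on the device and Gaussian noise as the privacy mechanism; both are simplifications, not the paper's exact SeqMF model or privacy guarantee.

```python
# Toy sketch: on-device sequential matrix-factorization updates plus a simple
# privacy mechanism (Gaussian noise) on what leaves the device. Both are
# simplifications, not the paper's exact SeqMF model or privacy guarantee.
from typing import List

import numpy as np


def local_update(user_vec: np.ndarray, item_vecs: np.ndarray,
                 app_sequence: List[int], lr: float = 0.05) -> np.ndarray:
    """Update the private user vector from the locally observed app sequence."""
    for item in app_sequence:
        err = 1.0 - user_vec @ item_vecs[item]  # implicit-feedback target = 1
        user_vec = user_vec + lr * err * item_vecs[item]
    return user_vec


def privatized_item_gradient(user_vec: np.ndarray, item: int,
                             item_vecs: np.ndarray, noise_std: float = 0.1) -> np.ndarray:
    """What the device sends: an item-side gradient perturbed with Gaussian noise."""
    err = 1.0 - user_vec @ item_vecs[item]
    grad = err * user_vec
    return grad + np.random.normal(0.0, noise_std, size=grad.shape)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    items = rng.normal(size=(10, 8))
    user = rng.normal(size=8)
    user = local_update(user, items, app_sequence=[3, 7, 3])
    print(privatized_item_gradient(user, 3, items))
```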
arXiv Detail & Related papers (2023-02-05T10:29:57Z)
- Cloud-Device Collaborative Adaptation to Continual Changing Environments in the Real-world [20.547119604004774]
We propose a new learning paradigm of Cloud-Device Collaborative Continual Adaptation, which encourages collaboration between cloud and device.
We also propose an Uncertainty-based Visual Prompt Adapted (U-VPA) teacher-student model to transfer the generalization capability of the large model on the cloud to the device model.
Our proposed U-VPA teacher-student framework outperforms previous state-of-the-art test-time adaptation and device-cloud collaboration methods.
arXiv Detail & Related papers (2022-12-02T05:02:36Z)
- MetaNetwork: A Task-agnostic Network Parameters Generation Framework for Improving Device Model Generalization [65.02542875281233]
We propose a novel task-agnostic framework, named MetaNetwork, for generating adaptive device model parameters from the cloud without on-device training.
The MetaGenerator is designed to learn a mapping function from samples to model parameters, and it can generate and deliver the adaptive parameters to the device based on samples uploaded from the device to the cloud.
The MetaStabilizer aims to reduce the oscillation of the MetaGenerator, accelerate the convergence and improve the model performance during both training and inference.
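As a rough illustration of these two roles, the sketch below uses a linear projection as a stand-in generator and an exponential moving average as a stand-in stabilizer; the shapes, layers, and smoothing rule are assumptions, not the MetaNetwork architecture.

```python
# Conceptual sketch of the MetaGenerator/MetaStabilizer roles: a generator maps
# uploaded device samples to adaptive parameters, and a stabilizer smooths
# successive generations (here, a simple EMA). Illustrative assumptions only.
import numpy as np


class MetaGenerator:
    """Maps a batch of uploaded sample features to a flat parameter vector."""

    def __init__(self, feat_dim: int, n_params: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.proj = rng.normal(scale=0.1, size=(feat_dim, n_params))

    def generate(self, samples: np.ndarray) -> np.ndarray:
        pooled = samples.mean(axis=0)        # aggregate the uploaded samples
        return np.tanh(pooled @ self.proj)   # produce device-model parameters


class MetaStabilizer:
    """Exponential moving average over successively generated parameters."""

    def __init__(self, momentum: float = 0.9):
        self.momentum, self.state = momentum, None

    def smooth(self, params: np.ndarray) -> np.ndarray:
        self.state = params if self.state is None else (
            self.momentum * self.state + (1 - self.momentum) * params)
        return self.state


if __name__ == "__main__":
    gen, stab = MetaGenerator(feat_dim=16, n_params=32), MetaStabilizer()
    uploaded = np.random.default_rng(1).normal(size=(8, 16))
    device_params = stab.smooth(gen.generate(uploaded))
    print(device_params.shape)
```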
arXiv Detail & Related papers (2022-09-12T13:26:26Z)
- Auto-Split: A General Framework of Collaborative Edge-Cloud AI [49.750972428032355]
This paper describes the techniques and engineering practice behind Auto-Split, an edge-cloud collaborative prototype of Huawei Cloud.
To the best of our knowledge, there is no existing industry product that provides the capability of Deep Neural Network (DNN) splitting.
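The core optimization behind such splitting can be illustrated with a toy search over split points that trades off device compute, activation transfer, and cloud compute; the cost model below is an assumption for exposition, not Auto-Split's actual algorithm.

```python
# Toy split-point selection for edge-cloud DNN splitting: choose the layer
# boundary minimizing device compute + activation transfer + cloud compute.
# The cost numbers and the exhaustive search are illustrative only.
from typing import List, Tuple


def best_split(device_ms: List[float], cloud_ms: List[float],
               activation_mb: List[float], bandwidth_mbps: float) -> Tuple[int, float]:
    """Return (split_index, latency_ms); layers [0, split) run on the device."""
    n = len(device_ms)
    best = (0, float("inf"))
    for split in range(n + 1):
        # If everything runs on the device, no intermediate activation is uploaded.
        transfer = 0.0 if split == n else activation_mb[split] * 8 / bandwidth_mbps * 1000
        latency = sum(device_ms[:split]) + transfer + sum(cloud_ms[split:])
        if latency < best[1]:
            best = (split, latency)
    return best


if __name__ == "__main__":
    print(best_split(device_ms=[5, 8, 20, 40],
                     cloud_ms=[1, 2, 4, 8],
                     activation_mb=[4.0, 2.0, 0.5, 0.1],
                     bandwidth_mbps=50.0))
```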
arXiv Detail & Related papers (2021-08-30T08:03:29Z)
- Device-Cloud Collaborative Learning for Recommendation [50.01289274123047]
We propose a novel MetaPatch learning approach on the device side to efficiently achieve "thousands of people with thousands of models" given a centralized cloud model.
With billions of updated personalized device models, we propose a "model-over-models" distillation algorithm, namely MoMoDistill, to update the centralized cloud model.
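A conceptual sketch of these two directions is given below, assuming a linear model, an additive per-device patch, and a simple averaging-based distillation target; all of these are illustrative stand-ins for MetaPatch and MoMoDistill, not the paper's actual algorithms.

```python
# Sketch: a small per-device "patch" on top of a frozen cloud model, and a
# cloud-side update driven by the ensemble of patched device models.
# Linear model, patch form, and averaging rule are illustrative assumptions.
import numpy as np


class PatchedDeviceModel:
    """Frozen cloud weights plus a tiny trainable per-device patch."""

    def __init__(self, cloud_w: np.ndarray):
        self.cloud_w = cloud_w                 # frozen, shared across devices
        self.patch = np.zeros_like(cloud_w)    # personalized, trained on-device

    def predict(self, x: np.ndarray) -> float:
        return float(x @ (self.cloud_w + self.patch))

    def local_step(self, x: np.ndarray, y: float, lr: float = 0.1) -> None:
        self.patch -= lr * (self.predict(x) - y) * x  # only the patch is updated


def distill_to_cloud(cloud_w: np.ndarray, device_models: list, probe: np.ndarray,
                     lr: float = 0.5) -> np.ndarray:
    """Cloud update pulling predictions toward the ensemble of device models."""
    targets = np.array([[m.predict(x) for m in device_models] for x in probe]).mean(axis=1)
    preds = probe @ cloud_w
    grad = probe.T @ (preds - targets) / len(probe)
    return cloud_w - lr * grad


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cloud_w = rng.normal(size=4)
    devices = [PatchedDeviceModel(cloud_w.copy()) for _ in range(3)]
    for m in devices:
        m.local_step(rng.normal(size=4), y=1.0)
    print(distill_to_cloud(cloud_w, devices, probe=rng.normal(size=(16, 4))))
```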
arXiv Detail & Related papers (2021-04-14T05:06:59Z)
- Controllable Multi-Interest Framework for Recommendation [64.30030600415654]
We formalize the recommender system as a sequential recommendation problem.
We propose a novel controllable multi-interest framework for sequential recommendation, called ComiRec.
Our framework has been successfully deployed on the offline Alibaba distributed cloud platform.
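As a toy illustration of what "controllable" can mean in a multi-interest setting, the sketch below merges per-interest candidates with a greedy rule that trades relevance against category diversity via a control factor; the scoring rule and names are assumptions, not ComiRec's exact aggregation module.

```python
# Toy "controllable" aggregation: merge candidates retrieved by multiple
# interest vectors, trading relevance against diversity via `factor`.
# Illustrative only; not ComiRec's actual procedure.
def aggregate(candidates: dict, k: int, factor: float) -> list:
    """candidates: item_id -> (relevance, category); higher factor favors diversity."""
    chosen, seen_categories = [], set()
    remaining = dict(candidates)
    while remaining and len(chosen) < k:
        def score(item):
            rel, cat = remaining[item]
            return rel + factor * (cat not in seen_categories)
        best = max(remaining, key=score)
        chosen.append(best)
        seen_categories.add(remaining[best][1])
        del remaining[best]
    return chosen


if __name__ == "__main__":
    pool = {"a": (0.9, "shoes"), "b": (0.8, "shoes"), "c": (0.6, "books")}
    print(aggregate(pool, k=2, factor=0.0))  # pure relevance: ['a', 'b']
    print(aggregate(pool, k=2, factor=0.5))  # diversity-aware: ['a', 'c']
```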
arXiv Detail & Related papers (2020-05-19T10:18:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.