Device-Cloud Collaborative Learning for Recommendation
- URL: http://arxiv.org/abs/2104.06624v1
- Date: Wed, 14 Apr 2021 05:06:59 GMT
- Title: Device-Cloud Collaborative Learning for Recommendation
- Authors: Jiangchao Yao and Feng Wang and Kunyang Jia and Bo Han and Jingren
Zhou and Hongxia Yang
- Abstract summary: We propose a novel MetaPatch learning approach on the device side to efficiently achieve "thousands of people with thousands of models" given a centralized cloud model.
With billions of updated personalized device models, we propose a "model-over-models" distillation algorithm, namely MoMoDistill, to update the centralized cloud model.
- Score: 50.01289274123047
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: With the rapid development of storage and computing power on mobile
devices, it has become both practical and popular to deploy models on devices,
avoiding onerous communication latency and capturing real-time features. While
many works have explored on-device learning and inference, most focus on
response delay or privacy protection; little has been done to model the
collaboration between device and cloud so that both sides benefit jointly. To
bridge this gap, we make one of the first attempts to study the Device-Cloud
Collaborative Learning (DCCL) framework.
Specifically, we propose a novel MetaPatch learning approach on the device side
to efficiently achieve "thousands of people with thousands of models" given a
centralized cloud model. Then, with billions of updated personalized device
models, we propose a "model-over-models" distillation algorithm, namely
MoMoDistill, to update the centralized cloud model. Our extensive experiments
over a range of datasets with different settings demonstrate the effectiveness
of such collaboration on both cloud and device sides, especially its
superiority in modeling long-tailed users.
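To make the two components concrete, here is a minimal PyTorch sketch, not the authors' implementation: MetaPatch grafts a residual patch onto a frozen cloud layer and decodes the patch weights from a much smaller meta-parameter vector, so each device personalizes only the meta parameters; MoMoDistill is reduced here to the cloud model matching the averaged predictions of uploaded device models. All names, dimensions, and the MSE objective are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetaPatchLayer(nn.Module):
    """A frozen cloud layer plus a residual patch whose weights are
    decoded from a small per-device meta-parameter vector (sketch)."""
    def __init__(self, cloud_layer: nn.Linear, meta_dim: int = 16):
        super().__init__()
        self.cloud_layer = cloud_layer
        for p in self.cloud_layer.parameters():
            p.requires_grad = False            # cloud weights stay fixed on device
        d = cloud_layer.out_features
        # Assumed fixed decoder shipped with the cloud model; only theta_meta
        # is trained on the device, so on-device trainable parameters ~ meta_dim.
        self.decoder = nn.Linear(meta_dim, d * d, bias=False)
        self.decoder.weight.requires_grad = False
        self.theta_meta = nn.Parameter(torch.zeros(meta_dim))  # zero init: identity patch

    def forward(self, x):
        h = self.cloud_layer(x)
        patch_w = self.decoder(self.theta_meta).view(h.size(-1), h.size(-1))
        return h + F.linear(torch.relu(h), patch_w)  # residual personalization

def momo_distill_step(cloud_model, device_models, x, optimizer):
    """One simplified 'model-over-models' step: the cloud model matches the
    averaged predictions of the personalized device models (illustration only)."""
    with torch.no_grad():
        teacher = torch.stack([m(x) for m in device_models]).mean(0)
    loss = F.mse_loss(cloud_model(x), teacher)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```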
Related papers
- ECLM: Efficient Edge-Cloud Collaborative Learning with Continuous
Environment Adaptation [47.35179593006409]
We propose ECLM, an edge-cloud collaborative learning framework for rapid model adaptation in dynamic edge environments.
We show that ECLM significantly improves model performance (e.g., an 18.89% accuracy increase) and resource efficiency (e.g., a 7.12x reduction in communication cost) when adapting models to dynamic edge environments.
arXiv Detail & Related papers (2023-11-18T14:10:09Z)
- Structured Cooperative Learning with Graphical Model Priors [98.53322192624594]
We study how to train personalized models for different tasks on decentralized devices with limited local data.
We propose "Structured Cooperative Learning (SCooL)", in which a cooperation graph across devices is generated by a graphical model.
We evaluate SCooL against existing decentralized learning methods on an extensive set of benchmarks.
arXiv Detail & Related papers (2023-06-16T02:41:31Z)
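Read loosely, SCooL's cooperation graph determines how much each device borrows from its neighbors' personalized models. Below is a minimal sketch under that reading; the graph weights W and the local batches are placeholders, and SCooL's graphical-model inference of W is not shown.

```python
import torch

def scool_round(models, W, local_batches, lr=0.01):
    """One cooperative round (sketch): each device i averages neighbor
    parameters with cooperation weights W[i], then fits its local data."""
    snapshots = [[p.detach().clone() for p in m.parameters()] for m in models]
    for i, model in enumerate(models):
        # Graph-weighted aggregation of neighbors' parameters.
        with torch.no_grad():
            for k, p in enumerate(model.parameters()):
                p.copy_(sum(W[i][j] * snapshots[j][k] for j in range(len(models))))
        # Local personalization step on device i's own data.
        x, y = local_batches[i]
        loss = torch.nn.functional.cross_entropy(model(x), y)
        loss.backward()
        with torch.no_grad():
            for p in model.parameters():
                p -= lr * p.grad
                p.grad = None
```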
- DC-CCL: Device-Cloud Collaborative Controlled Learning for Large Vision Models [43.41875046295657]
We propose a device-cloud collaborative controlled learning framework, called DC-CCL.
DC-CCL splits the base model into two submodels: a large submodel that learns from cloud-side samples, and a small submodel that learns from device-side samples and performs device-cloud knowledge fusion.
arXiv Detail & Related papers (2023-03-18T08:35:12Z)
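A schematic reading of that split; the module names, sizes, and additive fusion below are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class DCCCL(nn.Module):
    """Sketch of the DC-CCL split: a large cloud-trained submodel plus a
    small device-trained submodel, fused at the output (illustrative)."""
    def __init__(self, in_dim, num_classes):
        super().__init__()
        self.large = nn.Sequential(            # trained on cloud-side samples
            nn.Linear(in_dim, 512), nn.ReLU(), nn.Linear(512, num_classes))
        self.small = nn.Sequential(            # trained on device-side samples
            nn.Linear(in_dim, 32), nn.ReLU(), nn.Linear(32, num_classes))

    def forward(self, x):
        return self.large(x) + self.small(x)   # device-cloud knowledge fusion

    def device_parameters(self):
        # On the device, only the small submodel is updated; the large
        # submodel stays frozen so cloud-side knowledge is preserved.
        return self.small.parameters()
```

On the device, an optimizer would then be built only over `model.device_parameters()`, e.g. `torch.optim.SGD(model.device_parameters(), lr=0.01)`.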
- Device Tuning for Multi-Task Large Model [0.0]
We propose Device Tuning, an efficient massively multi-task framework that spans the cloud and the device.
Specifically, we design a Device Tuning architecture for a multi-task model that benefits both cloud modeling and device modeling.
arXiv Detail & Related papers (2023-02-21T16:55:48Z)
- Cloud-Device Collaborative Adaptation to Continual Changing Environments in the Real-world [20.547119604004774]
We propose a new learning paradigm of Cloud-Device Collaborative Continual Adaptation, which encourages collaboration between cloud and device.
We also propose an Uncertainty-based Visual Prompt Adapted (U-VPA) teacher-student model to transfer the generalization capability of the large model on the cloud to the device model.
Our U-VPA teacher-student framework outperforms previous state-of-the-art test-time adaptation and device-cloud collaboration methods.
arXiv Detail & Related papers (2022-12-02T05:02:36Z)
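The uncertainty-based prompt selection is specific to U-VPA, but the underlying cloud-teacher-to-device-student transfer can be sketched with a generic distillation step; the temperature and KL objective below are standard choices assumed for illustration, not the paper's exact loss.

```python
import torch
import torch.nn.functional as F

def distill_step(teacher, student, x, T=2.0):
    """Generic cloud-teacher -> device-student transfer step (sketch).
    U-VPA additionally conditions on uncertainty-selected visual prompts,
    which is omitted here."""
    teacher.eval()
    with torch.no_grad():
        t_logits = teacher(x)
    s_logits = student(x)
    # KL between temperature-softened distributions, the usual KD objective.
    return F.kl_div(F.log_softmax(s_logits / T, dim=-1),
                    F.softmax(t_logits / T, dim=-1),
                    reduction="batchmean") * (T * T)
```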
- MetaNetwork: A Task-agnostic Network Parameters Generation Framework for Improving Device Model Generalization [65.02542875281233]
We propose a novel task-agnostic framework, named MetaNetwork, for generating adaptive device model parameters from cloud without on-device training.
The MetaGenerator is designed to learn a mapping function from samples to model parameters, and it can generate and deliver the adaptive parameters to the device based on samples uploaded from the device to the cloud.
The MetaStabilizer aims to reduce the oscillation of the MetaGenerator, accelerate convergence, and improve model performance during both training and inference.
arXiv Detail & Related papers (2022-09-12T13:26:26Z)
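In hypernetwork terms, the MetaGenerator can be pictured as a network that pools samples uploaded from the device into an embedding and emits the weights of a small adapter layer, which the cloud then delivers back to the device. A sketch under that assumption; the dimensions and mean-pooling are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetaGenerator(nn.Module):
    """Maps a batch of device samples to the parameters of a small adapter
    layer (hypernetwork-style sketch; no on-device training involved)."""
    def __init__(self, feat_dim, adapter_dim):
        super().__init__()
        self.encoder = nn.Linear(feat_dim, 64)
        self.head_w = nn.Linear(64, adapter_dim * feat_dim)
        self.head_b = nn.Linear(64, adapter_dim)
        self.adapter_dim, self.feat_dim = adapter_dim, feat_dim

    def forward(self, samples):                        # samples: (N, feat_dim)
        z = torch.relu(self.encoder(samples)).mean(0)  # pool uploaded samples
        w = self.head_w(z).view(self.adapter_dim, self.feat_dim)
        b = self.head_b(z)
        return w, b                                    # delivered to the device

# On the device, the generated parameters are simply applied:
# y = F.linear(x, w, b)
```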
- Device-Cloud Collaborative Recommendation via Meta Controller [65.97416287295152]
We propose a meta controller to dynamically manage the collaboration between the on-device recommender and the cloud-based recommender.
Building on counterfactual samples and extended training, extensive experiments in industrial recommendation scenarios show the promise of the meta controller.
arXiv Detail & Related papers (2022-07-07T03:23:04Z)
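One way to picture such a controller, as an illustrative sketch: the context features, the binary action space, and the argmax routing rule are assumptions, and the counterfactual training is only referenced in a comment.

```python
import torch
import torch.nn as nn

class MetaController(nn.Module):
    """Scores a request's context and routes it to the on-device or the
    cloud recommender (sketch; per the abstract, the controller is trained
    on counterfactual samples, which is not shown here)."""
    def __init__(self, ctx_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(ctx_dim, 32), nn.ReLU(),
                                 nn.Linear(32, 2))      # actions: {device, cloud}

    def route(self, ctx, device_rec, cloud_rec, items):
        action = self.net(ctx).argmax(-1).item()
        return device_rec(items) if action == 0 else cloud_rec(items)
```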
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.