Cloud-Device Collaborative Adaptation to Continual Changing Environments
in the Real-world
- URL: http://arxiv.org/abs/2212.00972v1
- Date: Fri, 2 Dec 2022 05:02:36 GMT
- Title: Cloud-Device Collaborative Adaptation to Continual Changing Environments
in the Real-world
- Authors: Yulu Gan, Mingjie Pan, Rongyu Zhang, Zijian Ling, Lingran Zhao,
Jiaming Liu, Shanghang Zhang
- Abstract summary: We propose a new learning paradigm of Cloud-Device Collaborative Continual Adaptation, which encourages collaboration between cloud and device.
We also propose an Uncertainty-based Visual Prompt Adapted (U-VPA) teacher-student model to transfer the generalization capability of the large model on the cloud to the device model.
Our proposed U-VPA teacher-student framework outperforms previous state-of-the-art test time adaptation and device-cloud collaboration methods.
- Score: 20.547119604004774
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: When facing changing environments in the real world, the lightweight model on
client devices suffers from severe performance drops under distribution shifts.
The main limitations of existing device models are (1) their inability to update
under the computation limits of the device, and (2) the limited generalization
ability of lightweight models. Meanwhile, recent large models have shown strong
generalization capability on the cloud, yet they cannot be deployed on client
devices, whose computational resources are too limited. To enable the device model
to deal with changing environments, we propose a new learning paradigm of
Cloud-Device Collaborative Continual Adaptation, which encourages collaboration
between cloud and device and improves the generalization of the device model.
Based on this paradigm, we further propose an Uncertainty-based Visual Prompt
Adapted (U-VPA) teacher-student model to transfer the generalization capability
of the large model on the cloud to the device model. Specifically, we first
design the Uncertainty Guided Sampling (UGS) to screen out challenging data
continuously and transmit the most out-of-distribution samples from the device
to the cloud. Then we propose a Visual Prompt Learning Strategy with
Uncertainty guided updating (VPLU) to specifically deal with the selected
samples with more distribution shifts. We transmit the visual prompts to the
device and concatenate them with the incoming data to pull the device testing
distribution closer to the cloud training distribution. We conduct extensive
experiments on two object detection datasets with continually changing
environments. Our proposed U-VPA teacher-student framework outperforms previous
state-of-the-art test time adaptation and device-cloud collaboration methods.
The code and datasets will be released.
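Since the paper's code has not yet been released, the two device-side mechanisms described above can only be sketched. Below is a minimal PyTorch sketch of entropy-based screening of uncertain samples for upload (a common stand-in for the unspecified UGS scoring rule) and a learnable visual prompt applied to incoming data (a border-prompt reading of "concatenate them with the incoming data"). All names, shapes, and thresholds here are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F


def predictive_entropy(logits: torch.Tensor) -> torch.Tensor:
    """Entropy of the softmax distribution, a common uncertainty proxy.

    The actual UGS scoring rule is not given in the abstract; entropy is
    used here purely for illustration.
    """
    probs = F.softmax(logits, dim=-1)
    return -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)


def select_for_upload(images: torch.Tensor, logits: torch.Tensor,
                      budget: int) -> torch.Tensor:
    """UGS-style screening: keep the `budget` most uncertain samples in a
    batch as device-to-cloud upload candidates."""
    scores = predictive_entropy(logits)
    idx = scores.topk(min(budget, scores.numel())).indices
    return images[idx]


class BorderPrompt(torch.nn.Module):
    """Learnable visual prompt written onto a frame around each image,
    nudging the device-side test distribution toward the cloud training
    distribution (a simplified reading of VPLU's prompt concatenation).
    In the paradigm above, the prompt would be trained on the cloud and
    transmitted to the device."""

    def __init__(self, channels: int = 3, size: int = 224, pad: int = 8):
        super().__init__()
        self.prompt = torch.nn.Parameter(torch.zeros(1, channels, size, size))
        mask = torch.zeros(1, 1, size, size)
        mask[..., :pad, :] = 1
        mask[..., -pad:, :] = 1
        mask[..., :, :pad] = 1
        mask[..., :, -pad:] = 1
        self.register_buffer("mask", mask)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Keep interior pixels, overwrite the border with the prompt.
        return x * (1 - self.mask) + self.prompt * self.mask


if __name__ == "__main__":
    # Stand-in for the lightweight device model (a classification head for
    # simplicity; the paper actually evaluates object detection).
    model = torch.nn.Sequential(torch.nn.Flatten(),
                                torch.nn.Linear(3 * 224 * 224, 10))
    prompt = BorderPrompt()
    batch = torch.randn(16, 3, 224, 224)
    logits = model(prompt(batch))
    upload = select_for_upload(batch, logits, budget=4)
    print(upload.shape)  # torch.Size([4, 3, 224, 224])
```

In this reading, only the `budget` most out-of-distribution samples per batch cross the uplink, which is what keeps the device-to-cloud communication cost bounded.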
Related papers
- Towards Robust and Efficient Cloud-Edge Elastic Model Adaptation via Selective Entropy Distillation [56.79064699832383]
We establish a Cloud-Edge Elastic Model Adaptation (CEMA) paradigm in which the edge models only need to perform forward propagation.
In our CEMA, to reduce the communication burden, we devise two criteria to exclude unnecessary samples from uploading to the cloud (see the filtering sketch after this list).
arXiv Detail & Related papers (2024-02-27T08:47:19Z)
- Cloud-Device Collaborative Learning for Multimodal Large Language Models [24.65882336700547]
We introduce a Cloud-Device Collaborative Continual Adaptation framework to enhance the performance of compressed, device-deployed MLLMs.
Our framework is structured into three key components: a device-to-cloud uplink for efficient data transmission, cloud-based knowledge adaptation, and an optimized cloud-to-device downlink for model deployment.
arXiv Detail & Related papers (2023-12-26T18:46:14Z)
- ECLM: Efficient Edge-Cloud Collaborative Learning with Continuous Environment Adaptation [47.35179593006409]
We propose ECLM, an edge-cloud collaborative learning framework for rapid model adaptation for dynamic edge environments.
We show that ECLM significantly improves model performance (e.g., 18.89% accuracy increase) and resource efficiency (e.g., 7.12x communication cost reduction) in adapting models to dynamic edge environments.
arXiv Detail & Related papers (2023-11-18T14:10:09Z)
- MetaNetwork: A Task-agnostic Network Parameters Generation Framework for Improving Device Model Generalization [65.02542875281233]
We propose a novel task-agnostic framework, named MetaNetwork, for generating adaptive device model parameters from cloud without on-device training.
The MetaGenerator is designed to learn a mapping function from samples to model parameters, and it can generate and deliver the adaptive parameters to the device based on samples uploaded from the device to the cloud.
The MetaStabilizer aims to reduce the oscillation of the MetaGenerator, accelerate the convergence and improve the model performance during both training and inference.
arXiv Detail & Related papers (2022-09-12T13:26:26Z)
- Device-Cloud Collaborative Recommendation via Meta Controller [65.97416287295152]
We propose a meta controller to dynamically manage the collaboration between the on-device recommender and the cloud-based recommender.
On the basis of the counterfactual samples and the extended training, extensive experiments in the industrial recommendation scenarios show the promise of meta controller.
arXiv Detail & Related papers (2022-07-07T03:23:04Z)
- On-Device Learning with Cloud-Coordinated Data Augmentation for Extreme Model Personalization in Recommender Systems [39.41506296601779]
We propose a new device-cloud collaborative learning framework, called CoDA, to break the dilemmas of purely cloud-based learning and on-device learning.
CoDA retrieves similar samples from the cloud's global pool to augment each user's local dataset to train the recommendation model.
Online A/B testing results show the remarkable performance improvement of CoDA over both cloud-based learning without model personalization and on-device training without data augmentation.
arXiv Detail & Related papers (2022-01-24T04:59:04Z)
- Device-Cloud Collaborative Learning for Recommendation [50.01289274123047]
We propose a novel MetaPatch learning approach on the device side to efficiently achieve "thousands of people with thousands of models" given a centralized cloud model.
With billions of updated personalized device models, we propose a "model-over-models" distillation algorithm, namely MoMoDistill, to update the centralized cloud model.
arXiv Detail & Related papers (2021-04-14T05:06:59Z)
- Fast-Convergent Federated Learning [82.32029953209542]
Federated learning is a promising solution for distributing machine learning tasks through modern networks of mobile devices.
We propose a fast-convergent federated learning algorithm, called FOLB, which performs intelligent sampling of devices in each round of model training.
arXiv Detail & Related papers (2020-07-26T14:37:51Z)
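As referenced from the CEMA entry above: that summary describes excluding unnecessary samples from upload but does not state the two criteria, which are specified in the cited paper. A minimal sketch, assuming an entropy band is used (skip samples the edge model is already confident on, and skip extremely high-entropy samples that are likely noise), could look like the following; the thresholds are illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def entropy(logits: torch.Tensor) -> torch.Tensor:
    """Per-sample predictive entropy of the softmax distribution."""
    p = F.softmax(logits, dim=-1)
    return -(p * p.clamp_min(1e-12).log()).sum(dim=-1)


def upload_mask(logits: torch.Tensor, low: float = 0.4,
                high: float = 2.0) -> torch.Tensor:
    """Boolean mask over a batch: upload only samples whose entropy lies
    in (low, high). Confident samples teach the cloud little; extremely
    high-entropy samples are likely noise. Thresholds here are
    illustrative, not CEMA's actual criteria."""
    h = entropy(logits)
    return (h > low) & (h < high)
```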