Dimensionality Expansion of Load Monitoring Time Series and Transfer
Learning for EMS
- URL: http://arxiv.org/abs/2204.02802v4
- Date: Wed, 19 Apr 2023 10:09:37 GMT
- Title: Dimensionality Expansion of Load Monitoring Time Series and Transfer
Learning for EMS
- Authors: Bla\v{z} Bertalani\v{c}, Jakob Jenko and Carolina Fortuna
- Abstract summary: Energy management systems rely on (non)-intrusive load monitoring (N)ILM to monitor and manage appliances.
We propose a new approach for load monitoring in building EMS based on dimensionality expansion of time series and transfer learning.
- Score: 0.7133136338850781
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Energy management systems (EMS) rely on (non)-intrusive load monitoring
(N)ILM to monitor and manage appliances and help residents be more energy
efficient and thus more frugal. The robustness and transfer potential of the
most promising machine learning solutions for (N)ILM are not yet fully
understood, as these models are trained and evaluated on relatively limited
data. In this paper, we propose a new approach for load monitoring in building
EMS based on dimensionality expansion of time series and transfer learning. We
perform an extensive evaluation on 5 different low-frequency datasets. The
proposed feature dimensionality expansion using video-like transformation and
resource-aware deep learning architecture achieves an average weighted F1 score
of 0.88 across the datasets with 29 appliances and is computationally more
efficient compared to the state-of-the-art imaging methods. Investigating the
proposed method for cross-dataset intra-domain transfer learning, we find that
1) our method achieves an average weighted F1 score of 0.80 while requiring
three times fewer training epochs than the non-transfer approach, 2) it can
achieve an F1 score of 0.75 with only 230 data samples, and 3) our transfer
approach reduces the precision drop for unseen appliances by up to 12
percentage points compared to the state of the art.
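The abstract does not spell out the video-like transformation, so the following is only a minimal sketch of the general idea: a 1-D power trace is expanded into a sequence of 2-D frames over sliding windows. The recurrence-plot imaging, window length, stride, and threshold are illustrative assumptions, not the paper's exact method.
```python
import numpy as np

def recurrence_frame(window: np.ndarray, eps: float = 0.1) -> np.ndarray:
    """Binary recurrence plot of one window: pixel (i, j) is 1 when samples
    i and j are closer than eps (an illustrative imaging choice)."""
    d = np.abs(window[:, None] - window[None, :])  # pairwise distances
    return (d < eps).astype(np.float32)

def series_to_video(x: np.ndarray, win: int = 64, stride: int = 16) -> np.ndarray:
    """Expand a 1-D load series into a (frames, win, win) 'video' by imaging
    consecutive sliding windows -- one way to realize dimensionality expansion."""
    frames = [recurrence_frame(x[s:s + win])
              for s in range(0, len(x) - win + 1, stride)]
    return np.stack(frames)

# Example: a synthetic appliance trace becomes an image sequence.
power = np.sin(np.linspace(0, 20, 1024)) + 0.05 * np.random.randn(1024)
video = series_to_video(power)
print(video.shape)  # (61, 64, 64)
```
A compact 2D/3D CNN can then classify the resulting (frames, win, win) tensor per appliance, which is where a resource-aware architecture would enter.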
Related papers
- DigiRL: Training In-The-Wild Device-Control Agents with Autonomous Reinforcement Learning [61.10299147201369]
This paper introduces a novel autonomous RL approach, called DigiRL, for training in-the-wild device control agents.
We build a scalable and parallelizable Android learning environment equipped with a VLM-based evaluator.
We demonstrate the effectiveness of DigiRL using the Android-in-the-Wild dataset, where our 1.3B VLM trained with RL achieves a 49.5% absolute improvement.
arXiv Detail & Related papers (2024-06-14T17:49:55Z) - Foundation Models for Structural Health Monitoring [17.37816294594306]
We propose for the first time the use of Transformer neural networks, with a Masked Auto-Encoder architecture, as Foundation Models for Structural Health Monitoring.
We demonstrate the ability of these models to learn generalizable representations from multiple large datasets through self-supervised pre-training.
We showcase the effectiveness of our foundation models using data from three operational viaducts.
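To make the masked pre-training idea concrete, here is a toy masked auto-encoding sketch for 1-D sensor sequences. It is closer in spirit to generic masked modeling than to the exact MAE pipeline (which encodes only visible patches); the patch size, depth, and masking ratio are assumptions.
```python
import torch
import torch.nn as nn

class TinyMAE(nn.Module):
    """Toy masked auto-encoder for 1-D sensor sequences: hide a random
    subset of patches and reconstruct them from the visible ones."""
    def __init__(self, patch: int = 16, dim: int = 64):
        super().__init__()
        self.patch = patch
        self.embed = nn.Linear(patch, dim)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True),
            num_layers=2)
        self.decoder = nn.Linear(dim, patch)  # simplistic reconstruction head
        self.mask_token = nn.Parameter(torch.zeros(1, 1, dim))

    def forward(self, x, mask_ratio: float = 0.75):
        p = x.unfold(1, self.patch, self.patch)           # (B, N, patch)
        tokens = self.embed(p)
        B, N, D = tokens.shape
        mask = torch.rand(B, N, device=x.device) < mask_ratio
        tokens = torch.where(mask[..., None],
                             self.mask_token.expand(B, N, D), tokens)
        recon = self.decoder(self.encoder(tokens))
        return ((recon - p) ** 2)[mask].mean()            # loss on masked patches only

loss = TinyMAE()(torch.randn(8, 256))  # self-supervised pre-training step
```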
arXiv Detail & Related papers (2024-04-03T13:32:44Z) - Semi-Federated Learning: Convergence Analysis and Optimization of A
Hybrid Learning Framework [70.83511997272457]
We propose a semi-federated learning (SemiFL) paradigm to leverage both the base station (BS) and devices for a hybrid implementation of centralized learning (CL) and FL.
We propose a two-stage algorithm to solve the resulting intractable optimization problem, providing closed-form solutions for the beamformers.
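Setting the beamforming optimization aside (the closed-form solutions are the paper's contribution and are not reproduced here), the hybrid CL/FL idea itself can be sketched: some devices train locally and send updates, while data offloaded by others is used directly at the base station. The least-squares objective and uniform averaging below are assumptions.
```python
import numpy as np

def grad(w, X, y):
    """Least-squares gradient, standing in for any local objective."""
    return 2 * X.T @ (X @ w - y) / len(y)

def semifl_round(w, fl_clients, cl_data, lr=0.05):
    """One illustrative SemiFL round: FL devices compute updates on their own
    data, while data offloaded by CL devices is processed at the base station;
    the BS then averages both contributions (the weighting is an assumption)."""
    fl_updates = [w - lr * grad(w, X, y) for X, y in fl_clients]  # on-device steps
    X_bs, y_bs = cl_data                                          # pooled CL data
    cl_update = w - lr * grad(w, X_bs, y_bs)                      # centralized step
    return np.mean(np.stack(fl_updates + [cl_update]), axis=0)    # hybrid merge

rng = np.random.default_rng(0)
w = np.zeros(3)
fl_clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]
cl_data = (rng.normal(size=(50, 3)), rng.normal(size=50))
for _ in range(10):
    w = semifl_round(w, fl_clients, cl_data)
```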
arXiv Detail & Related papers (2023-10-04T03:32:39Z) - RA-DIT: Retrieval-Augmented Dual Instruction Tuning [90.98423540361946]
Retrieval-augmented language models (RALMs) improve performance by accessing long-tail and up-to-date knowledge from external data stores.
Existing approaches require either expensive retrieval-specific modifications to LM pre-training or use post-hoc integration of the data store that leads to suboptimal performance.
We introduce Retrieval-Augmented Dual Instruction Tuning (RA-DIT), a lightweight fine-tuning methodology that provides a third option.
arXiv Detail & Related papers (2023-10-02T17:16:26Z) - Parameter-Efficient Transfer Learning for Remote Sensing Image-Text
Retrieval [10.84733740863356]
In this work, we investigate the parameter-efficient transfer learning (PETL) method to transfer visual-language knowledge from the natural domain to the RS domain on the image-text retrieval task.
Our proposed model only contains 0.16M training parameters, which can achieve a parameter reduction of 98.9% compared to full fine-tuning.
Our retrieval performance exceeds traditional methods by 7-13% and achieves comparable or better performance than full fine-tuning.
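A hedged sketch of the general PETL recipe, freezing a pretrained backbone and training only a small residual adapter, is shown below. The toy backbone and bottleneck width are assumptions, so the printed parameter share will differ from the paper's 0.16M / 98.9% figures, which refer to its full vision-language model.
```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: the only trainable part of a frozen backbone."""
    def __init__(self, dim: int = 512, bottleneck: int = 32):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, h):
        return h + self.up(torch.relu(self.down(h)))  # residual adaptation

backbone = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 512))
for p in backbone.parameters():
    p.requires_grad = False                           # freeze pretrained weights

adapter = Adapter()
trainable = sum(p.numel() for p in adapter.parameters())
total = trainable + sum(p.numel() for p in backbone.parameters())
print(f"trainable: {trainable} ({100 * trainable / total:.1f}% of all weights)")
```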
arXiv Detail & Related papers (2023-08-24T02:43:53Z) - FusionAD: Multi-modality Fusion for Prediction and Planning Tasks of
Autonomous Driving [20.037562671813]
We present FusionAD, the first unified framework that fuses information from the most critical sensors, camera and LiDAR, and goes beyond the perception task.
In contrast to the camera-based end-to-end method UniAD, we establish fusion-aided, modality-aware prediction and status-aware planning modules, dubbed FMS.
We conduct extensive experiments on the commonly used nuScenes benchmark dataset; our approach achieves state-of-the-art performance, surpassing baselines by an average of 15% on perception tasks such as detection and tracking, improving occupancy prediction accuracy by 10%, reducing the prediction error from 0.708 to 0.389, and reducing the collision rate from 0.31%
arXiv Detail & Related papers (2023-08-02T08:29:44Z) - Energy Efficient Deep Multi-Label ON/OFF Classification of Low Frequency Metered Home Appliances [0.16777183511743468]
Non-intrusive load monitoring (NILM) is the process of obtaining appliance-level data from a single metering point.
We introduce a novel DL model aimed at enhanced multi-label classification of NILM with improved computation and energy efficiency.
Compared to the state of the art, the proposed model reduces energy consumption by more than 23%.
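The multi-label ON/OFF formulation can be illustrated with one sigmoid output per appliance, so several appliances can be detected as on at once. The tiny CNN below is an assumption for illustration, not the paper's energy-efficient architecture.
```python
import torch
import torch.nn as nn

N_APPLIANCES, WIN = 29, 256

# One logit per appliance; BCE treats each label independently, which is what
# makes the task multi-label rather than multi-class.
model = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(16, N_APPLIANCES),
)
criterion = nn.BCEWithLogitsLoss()

x = torch.randn(8, 1, WIN)                        # aggregate power windows
y = torch.randint(0, 2, (8, N_APPLIANCES)).float()
loss = criterion(model(x), y)
on_off = torch.sigmoid(model(x)) > 0.5            # per-appliance ON/OFF decisions
```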
arXiv Detail & Related papers (2023-07-18T13:23:23Z) - MMTSA: Multimodal Temporal Segment Attention Network for Efficient Human
Activity Recognition [33.94582546667864]
Multimodal sensors provide complementary information to develop accurate machine-learning methods for human activity recognition.
This paper proposes an efficient multimodal neural architecture for HAR using an RGB camera and inertial measurement units (IMUs).
Using three well-established public datasets, we evaluated MMTSA's effectiveness and efficiency in HAR.
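A minimal two-stream sketch in the spirit of segment-level attention fusion is given below: per-segment RGB and IMU features are projected to a shared width and mixed with multi-head attention. All dimensions and the pooling choice are assumptions rather than MMTSA's actual design.
```python
import torch
import torch.nn as nn

class TwoStreamFusion(nn.Module):
    """Toy two-stream HAR model: per-segment RGB and IMU features are fused
    with multi-head attention over segments."""
    def __init__(self, rgb_dim=128, imu_dim=32, dim=64, classes=10):
        super().__init__()
        self.rgb_proj = nn.Linear(rgb_dim, dim)
        self.imu_proj = nn.Linear(imu_dim, dim)
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.head = nn.Linear(dim, classes)

    def forward(self, rgb, imu):  # (B, segments, rgb_dim) and (B, segments, imu_dim)
        tokens = torch.cat([self.rgb_proj(rgb), self.imu_proj(imu)], dim=1)
        fused, _ = self.attn(tokens, tokens, tokens)  # cross-segment/modality mixing
        return self.head(fused.mean(dim=1))           # average-pool, then classify

logits = TwoStreamFusion()(torch.randn(4, 8, 128), torch.randn(4, 8, 32))
```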
arXiv Detail & Related papers (2022-10-14T08:05:16Z) - Federated Learning for Energy-limited Wireless Networks: A Partial Model
Aggregation Approach [79.59560136273917]
Limited communication resources (bandwidth and energy) and data heterogeneity across devices are the main bottlenecks for federated learning (FL).
We first devise a novel FL framework with partial model aggregation (PMA).
The proposed PMA-FL improves accuracy by 2.72% and 11.6% on two typical heterogeneous datasets.
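Partial model aggregation can be illustrated independently of the wireless optimization: only an agreed subset of layers is averaged across clients, while the rest stay local. Which layers are shared is an assumption here.
```python
import numpy as np

def partial_aggregate(global_model, client_models, shared_keys):
    """Illustrative PMA step: average only the layers in shared_keys across
    clients; the remaining layers stay personalized on each device."""
    new_global = dict(global_model)
    for k in shared_keys:
        new_global[k] = np.mean([c[k] for c in client_models], axis=0)
    return new_global

clients = [{"conv1": np.random.randn(3, 3), "head": np.random.randn(4)}
           for _ in range(5)]
global_model = {"conv1": np.zeros((3, 3)), "head": np.zeros(4)}
global_model = partial_aggregate(global_model, clients, shared_keys=["conv1"])
```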
arXiv Detail & Related papers (2022-04-20T19:09:52Z) - Deep Reinforcement Learning Assisted Federated Learning Algorithm for
Data Management of IIoT [82.33080550378068]
The continuously expanding scale of the industrial Internet of Things (IIoT) leads to IIoT equipment generating massive amounts of user data at every moment.
How to manage these time series data efficiently and safely is still an open issue in the IIoT field.
This paper studies the application of FL technology to managing IIoT equipment data in wireless network environments.
arXiv Detail & Related papers (2022-02-03T07:12:36Z) - Towards Accurate Knowledge Transfer via Target-awareness Representation
Disentanglement [56.40587594647692]
We propose a novel transfer learning algorithm, introducing the idea of Target-awareness REpresentation Disentanglement (TRED).
TRED disentangles the knowledge relevant to the target task from the original source model and uses it as a regularizer when fine-tuning the target model.
Experiments on various real-world datasets show that our method stably improves standard fine-tuning by more than 2% on average.
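The regularization idea can be sketched as a fine-tuning loss with an extra term pulling target features toward the target-relevant source features; the L2 penalty and weight below are illustrative assumptions, with `relevant_src` standing in for TRED's disentangled representation.
```python
import torch
import torch.nn as nn

def tred_style_loss(logits, y, feats, relevant_src, lam=0.1):
    """Task loss plus a penalty keeping target features near the disentangled,
    target-relevant source features (the L2 form and lam are assumptions)."""
    task = nn.functional.cross_entropy(logits, y)
    reg = (feats - relevant_src.detach()).pow(2).mean()
    return task + lam * reg

# e.g., logits: (B, C); feats and relevant_src: (B, D)
loss = tred_style_loss(torch.randn(8, 5), torch.randint(0, 5, (8,)),
                       torch.randn(8, 64), torch.randn(8, 64))
```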
arXiv Detail & Related papers (2020-10-16T17:45:08Z)