Optimal Event Monitoring through Internet Mashup over Multivariate Time
Series
- URL: http://arxiv.org/abs/2210.09992v1
- Date: Tue, 18 Oct 2022 16:56:17 GMT
- Title: Optimal Event Monitoring through Internet Mashup over Multivariate Time
Series
- Authors: Chun-Kit Ngan, Alexander Brodsky
- Abstract summary: This framework supports the services of model definitions, querying, parameter learning, model evaluations, data monitoring, decision recommendations, and web portals.
We further extend the MTSA data model and query language to support this class of problems for the services of learning, monitoring, and recommendation.
- Score: 77.34726150561087
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: We propose a Web-Mashup Application Service Framework for Multivariate Time
Series Analytics (MTSA) that supports the services of model definitions,
querying, parameter learning, model evaluations, data monitoring, decision
recommendations, and web portals. This framework maintains the advantage of
combining the strengths of both the domain-knowledge-based and the
formal-learning-based approaches and is designed for a more general class of
problems over multivariate time series. More specifically, we identify a
general-hybrid-based model, MTSA-Parameter Estimation, to solve this class of
problems, in which the objective function is maximized or minimized over the
decision parameters rather than at particular time points. This model
also allows domain experts to include multiple types of constraints, e.g.,
global constraints and monitoring constraints. We further extend the MTSA data
model and query language to support this class of problems for the services of
learning, monitoring, and recommendation. Finally, we conduct an
experimental case study for a university campus microgrid as a practical
example to demonstrate our proposed framework, models, and language.
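The core idea of MTSA-Parameter Estimation, choosing decision parameters that optimize an objective over an entire multivariate series subject to global and monitoring constraints, can be illustrated with a toy sketch. All names, the objective, and both constraints below are illustrative assumptions for a microgrid-like setting, not the paper's actual model or services:

```python
# Hypothetical sketch of MTSA-style parameter estimation: pick decision
# parameters that maximize an objective computed over a whole multivariate
# time series, subject to a global constraint and a monitoring constraint.
# The series, objective, and constraints are toy assumptions.

from itertools import product

# Toy multivariate series: (solar_output_kw, demand_kw) per hour.
series = [(5.0, 8.0), (7.5, 6.0), (9.0, 9.5), (4.0, 7.0)]

def objective(buy_threshold, sell_threshold):
    """Net benefit over the whole series under a toy buy/sell policy."""
    benefit = 0.0
    for solar, demand in series:
        surplus = solar - demand
        if surplus > sell_threshold:       # sell excess power
            benefit += surplus
        elif surplus < -buy_threshold:     # buy shortfall at a 1.5x penalty
            benefit += 1.5 * surplus
    return benefit

def global_constraint(buy_t, sell_t):
    # Global constraint: keep a dead band of at least 1 kW between actions.
    return sell_t + buy_t >= 1.0

def monitoring_constraint(buy_t, sell_t):
    # Monitoring constraint: per-hour shortfall must stay within buy_t + 4 kW.
    return all(demand - solar <= buy_t + 4.0 for solar, demand in series)

# Exhaustive search over a coarse parameter grid, standing in for the
# framework's learning service.
grid = [round(0.5 * k, 1) for k in range(9)]   # 0.0, 0.5, ..., 4.0
best = max(
    ((b, s) for b, s in product(grid, grid)
     if global_constraint(b, s) and monitoring_constraint(b, s)),
    key=lambda p: objective(*p),
)
print("best (buy_threshold, sell_threshold):", best)
```

Note that the objective is evaluated over the full series rather than per time point, which mirrors the paper's "regardless of particular time points" formulation; a real deployment would replace the grid search with a proper optimizer.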
Related papers
- Empowering Large Language Models in Wireless Communication: A Novel Dataset and Fine-Tuning Framework [81.29965270493238]
We develop a specialized dataset aimed at enhancing the evaluation and fine-tuning of large language models (LLMs) for wireless communication applications.
The dataset includes a diverse set of multi-hop questions, including true/false and multiple-choice types, spanning varying difficulty levels from easy to hard.
We introduce a Pointwise V-Information (PVI) based fine-tuning method, providing a detailed theoretical analysis and justification for its use in quantifying the information content of training data.
arXiv Detail & Related papers (2025-01-16T16:19:53Z)
- Unified Parameter-Efficient Unlearning for LLMs [25.195126838721492]
Large Language Models (LLMs) have revolutionized natural language processing, enabling advanced understanding and reasoning capabilities across a variety of tasks.
This raises significant privacy and security concerns, as models may inadvertently retain and disseminate sensitive or undesirable information.
We introduce a novel instance-wise unlearning framework, LLMEraser, which systematically categorizes unlearning tasks and applies precise adjustments using influence functions.
arXiv Detail & Related papers (2024-11-30T07:21:02Z)
- Advancing Enterprise Spatio-Temporal Forecasting Applications: Data Mining Meets Instruction Tuning of Language Models For Multi-modal Time Series Analysis in Low-Resource Settings [0.0]
Spatio-temporal forecasting is crucial in transportation, logistics, and supply chain management.
We propose a dynamic, multi-modal approach that integrates the strengths of traditional forecasting methods and instruction tuning of small language models.
Our framework enables on-premises customization with reduced computational and memory demands, while maintaining inference speed and data privacy/security.
arXiv Detail & Related papers (2024-08-24T16:32:58Z)
- UniTime: A Language-Empowered Unified Model for Cross-Domain Time Series Forecasting [59.11817101030137]
This research advocates for a unified model paradigm that transcends domain boundaries.
Learning an effective cross-domain model presents the following challenges.
We propose UniTime for effective cross-domain time series learning.
arXiv Detail & Related papers (2023-10-15T06:30:22Z)
- MinT: Boosting Generalization in Mathematical Reasoning via Multi-View Fine-Tuning [53.90744622542961]
Reasoning in mathematical domains remains a significant challenge for small language models (LMs).
We introduce a new method that exploits existing mathematical problem datasets with diverse annotation styles.
Experimental results show that our strategy enables a LLaMA-7B model to outperform prior approaches.
arXiv Detail & Related papers (2023-07-16T05:41:53Z)
- Multi-Task Learning with Summary Statistics [4.871473117968554]
We propose a flexible multi-task learning framework utilizing summary statistics from various sources.
We also present an adaptive parameter selection approach based on a variant of Lepski's method.
This work offers a more flexible tool for training related models across various domains, with practical implications in genetic risk prediction.
arXiv Detail & Related papers (2023-07-05T15:55:23Z)
- Model-Based Deep Learning: On the Intersection of Deep Learning and Optimization [101.32332941117271]
Decision making algorithms are used in a multitude of different applications.
Deep learning approaches that use highly parametric architectures tuned from data without relying on mathematical models are becoming increasingly popular.
Model-based optimization and data-centric deep learning are often considered to be distinct disciplines.
arXiv Detail & Related papers (2022-05-05T13:40:08Z)
- Conditional Generative Modeling via Learning the Latent Space [54.620761775441046]
We propose a novel framework for conditional generation in multimodal spaces.
It uses latent variables to model generalizable learning patterns.
At inference, the latent variables are optimized to find optimal solutions corresponding to multiple output modes.
arXiv Detail & Related papers (2020-10-07T03:11:34Z)
- Counterfactual Explanations for Machine Learning on Multivariate Time Series Data [0.9274371635733836]
This paper proposes a novel explainability technique for providing counterfactual explanations for supervised machine learning frameworks.
The proposed method outperforms state-of-the-art explainability methods on several different ML frameworks and data sets in metrics such as faithfulness and robustness.
arXiv Detail & Related papers (2020-08-25T02:04:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.