Data Privacy in IoT Equipped Future Smart Homes
- URL: http://arxiv.org/abs/2008.04979v1
- Date: Tue, 11 Aug 2020 19:26:51 GMT
- Title: Data Privacy in IoT Equipped Future Smart Homes
- Authors: Athar Khodabakhsh, Sule Yildirim Yayilgan
- Abstract summary: In this paper data privacy requirements in a smart home environment equipped with "Internet of Things" are described.
Privacy challenges for data and models are addressed.
- Score: 0.9442139459221784
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Smart devices are becoming inseparable from daily life and are improving rapidly to provide intelligent services as well as remote monitoring and control. To offer personalized and customized services, more personal data collection is required. Consequently, intelligent services are becoming intensely personal, and they raise concerns regarding data privacy and security. In this paper, data privacy requirements in a smart home environment equipped with the "Internet of Things" are described, and privacy challenges for data and models are addressed.
Related papers
- Preserving Privacy in Large Language Models: A Survey on Current Threats and Solutions [12.451936012379319]
Large Language Models (LLMs) represent a significant advancement in artificial intelligence, finding applications across various domains.
Their reliance on massive internet-sourced datasets for training raises notable privacy concerns.
Certain application-specific scenarios may require fine-tuning these models on private data.
arXiv Detail & Related papers (2024-08-10T05:41:19Z)
- PrivacyCube: Data Physicalization for Enhancing Privacy Awareness in IoT [1.2564343689544843]
We describe PrivacyCube, a novel data physicalization designed to increase privacy awareness within smart home environments.
PrivacyCube visualizes IoT data consumption by displaying privacy-related notices.
Our results show that PrivacyCube helps home occupants better understand IoT data practices, with significantly increased privacy awareness.
arXiv Detail & Related papers (2024-06-08T12:20:42Z)
- Towards Privacy-Aware and Personalised Assistive Robots: A User-Centred Approach [55.5769013369398]
This research pioneers user-centric, privacy-aware technologies such as Federated Learning (FL).
FL enables collaborative learning without sharing sensitive data, addressing privacy and scalability issues.
This work includes developing solutions for smart wheelchair assistance, enhancing user independence and well-being.
arXiv Detail & Related papers (2024-05-23T13:14:08Z)
- AI-Driven Anonymization: Protecting Personal Data Privacy While Leveraging Machine Learning [5.015409508372732]
This paper focuses on personal data privacy protection and the promotion of anonymity as its core research objectives.
It achieves personal data privacy protection and detection through differential privacy mechanisms applied in machine learning; a minimal sketch of the Laplace mechanism appears after this list.
The paper also addresses existing challenges in machine learning related to privacy and personal data protection, offers improvement suggestions, and analyzes factors impacting datasets to enable timely personal data privacy detection and protection.
arXiv Detail & Related papers (2024-02-27T04:12:25Z)
- Data privacy for Mobility as a Service [3.6474839708864497]
Mobility as a Service (MaaS) is revolutionizing the transportation industry by offering convenient, efficient and integrated transportation solutions.
The extensive use of user data as well as the integration of multiple service providers raises significant privacy concerns.
arXiv Detail & Related papers (2023-09-18T21:58:35Z)
- A Survey on Privacy in Graph Neural Networks: Attacks, Preservation, and Applications [76.88662943995641]
Graph Neural Networks (GNNs) have gained significant attention owing to their ability to handle graph-structured data.
To address the privacy risks that arise when graph data contains sensitive information, researchers have started to develop privacy-preserving GNNs.
Despite this progress, there is a lack of a comprehensive overview of the attacks and the techniques for preserving privacy in the graph domain.
arXiv Detail & Related papers (2023-08-31T00:31:08Z)
- Privacy Explanations - A Means to End-User Trust [64.7066037969487]
We looked into how explainability might help to address end users' privacy concerns.
We created privacy explanations that aim to clarify for end users why and for what purposes specific data is required.
Our findings reveal that privacy explanations can be an important step towards increasing trust in software systems.
arXiv Detail & Related papers (2022-10-18T09:30:37Z)
- Smart Home, security concerns of IoT [91.3755431537592]
The IoT (Internet of Things) has become widely popular in domestic environments.
People are converting their homes into smart homes; however, the privacy concerns of owning many Internet-connected devices with always-on environmental sensors remain insufficiently addressed.
Default and weak passwords, cheap materials and hardware, and unencrypted communication are identified as the principal threats and vulnerabilities of IoT devices.
arXiv Detail & Related papers (2020-07-06T10:36:11Z)
- A vision for global privacy bridges: Technical and legal measures for international data markets [77.34726150561087]
Despite data protection laws and an acknowledged right to privacy, trading personal information has become a business equated with "trading oil".
An open conflict is arising between business demands for data and a desire for privacy.
We propose and test a vision of a personal information market with privacy.
arXiv Detail & Related papers (2020-05-13T13:55:50Z)
- Beyond privacy regulations: an ethical approach to data usage in transportation [64.86110095869176]
We describe how Federated Machine Learning can be applied to the transportation sector.
We see Federated Learning as a method that enables us to process privacy-sensitive data while respecting customers' privacy; a minimal federated-averaging sketch appears after this list.
arXiv Detail & Related papers (2020-04-01T15:10:12Z)
- PDS: Deduce Elder Privacy from Smart Homes [0.0]
This paper shows that elders' privacy could be substantially exposed from smart homes due to non-fully protected network communication.
We develop a Privacy Deduction Scheme (PDS) that eavesdrops on sensor traffic from a smart home to identify elders' movement activities and infers sensor locations in the home through a series of deductions from the viewpoint of an attacker.
arXiv Detail & Related papers (2020-01-21T13:55:40Z)
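
The AI-Driven Anonymization entry above protects personal data with differential privacy. Below is a minimal, hypothetical sketch of the Laplace mechanism for a single counting query; the epsilon value, the query, and the records are assumptions made for illustration and are not drawn from that paper.

```python
# Minimal sketch of the Laplace mechanism for differential privacy:
# answer a counting query over personal records by adding noise calibrated
# to the query's sensitivity. Epsilon, the query, and the data below are
# illustrative assumptions, not taken from the cited paper.
import numpy as np

def dp_count(records, predicate, epsilon, rng):
    """Counting queries have sensitivity 1, so the Laplace noise scale is 1/epsilon."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

rng = np.random.default_rng(42)
ages = [23, 35, 41, 29, 52, 61, 38, 45]   # hypothetical personal data
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5, rng=rng)
print(f"noisy count of records with age >= 40: {noisy:.2f}")
```

Smaller epsilon values add more noise and give stronger privacy at the cost of accuracy; the trade-off is the kind of challenge that entry discusses.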
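The assistive-robots and transportation entries above both rely on Federated Learning to process privacy-sensitive data without centralizing it. The following sketch shows federated averaging under assumed conditions (a toy linear model, synthetic client data, illustrative hyperparameters); it illustrates the general technique, not the setup used in those papers.

```python
# Minimal federated averaging (FedAvg) sketch: each client trains on its own
# data and shares only model weights with the server, never the raw
# (privacy-sensitive) records. Model, data, and hyperparameters are
# illustrative assumptions, not the method of any cited paper.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training of a linear model by gradient descent."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)   # mean-squared-error gradient
        w -= lr * grad
    return w

def federated_average(client_weights, client_sizes):
    """Server-side aggregation, weighted by each client's dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
# Four clients, each with private local data that never leaves the device.
clients = []
for _ in range(4):
    X = rng.normal(size=(20, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=20)
    clients.append((X, y))

global_w = np.zeros(3)
for _ in range(10):                                # communication rounds
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = federated_average(updates, [len(y) for _, y in clients])

print("aggregated weights:", np.round(global_w, 3))
```

The privacy-relevant design choice is that only weight vectors cross the network; raw sensor or user records stay on each client.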
This list is automatically generated from the titles and abstracts of the papers on this site.