Towards Usable Parental Control for Voice Assistants
- URL: http://arxiv.org/abs/2303.04957v2
- Date: Fri, 24 Mar 2023 04:15:35 GMT
- Title: Towards Usable Parental Control for Voice Assistants
- Authors: Peiyi Yang, Jie Fan, Zice Wei, Haoqian Li, Tu Le, and Yuan Tian
- Abstract summary: We conduct a parent survey to find out what they like and dislike about the current parental control features.
We find that parents need more visuals about their children's activity, easier access to security features for their children, and a better user interface.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Voice Personal Assistants (VPA) have become a common household appliance. As
one of the leading platforms for VPA technology, Amazon created Alexa and
designed Amazon Kids for children to safely enjoy the rich functionalities of
VPA and for parents to monitor their kids' activities through the Parent
Dashboard. Although this ecosystem is in place, the Parent Dashboard is not
yet widely used among parents. In this paper, we conduct a parent survey
to find out what they like and dislike about the current parental control
features. We find that parents need more visuals about their children's
activity, easier access to security features for their children, and a better
user interface. Based on the insights from our survey, we present a new design
for the Parent Dashboard considering the parents' expectations.
Related papers
- Designing a Dashboard for Transparency and Control of Conversational AI [39.01999161106776]
We present an end-to-end prototype connecting interpretability techniques with user experience design.
Our results suggest that users appreciate seeing internal states, which helped them expose biased behavior and increased their sense of control.
arXiv Detail & Related papers (2024-06-12T05:20:16Z)
- Privacy Management and Interface Design for a Smart House [0.0]
This study highlights the role of security and interface design in controlling a smart house.
The study underscores the importance of providing an interface that can be used easily by any person to manage data and live activities.
arXiv Detail & Related papers (2024-02-29T09:26:41Z)
- Exploring Parent's Needs for Children-Centered AI to Support Preschoolers' Storytelling and Reading Activities [54.8155184348616]
New advances in artificial intelligence have sparked a surge of AI-based storytelling technologies.
This paper investigates how they function in practical storytelling scenarios and how parents, the most critical stakeholders, experience and perceive them.
Our findings suggest that even though AI-based storytelling technologies provide more immersive and engaging interaction, they still cannot meet parents' expectations.
arXiv Detail & Related papers (2024-01-24T20:55:40Z)
- What Do End-Users Really Want? Investigation of Human-Centered XAI for Mobile Health Apps [69.53730499849023]
We present a user-centered persona concept to evaluate explainable AI (XAI).
Results show that users' demographics and personality, as well as the type of explanation, impact explanation preferences.
Our insights bring an interactive, human-centered XAI closer to practical application.
arXiv Detail & Related papers (2022-10-07T12:51:27Z)
- In Alexa, We Trust. Or Do We?: An analysis of People's Perception of Privacy Policies [0.0]
Amazon Alexa is a voice-controlled application that is rapidly gaining popularity.
This paper explores the extent to which people are aware of the privacy policies pertaining to Amazon Alexa devices.
arXiv Detail & Related papers (2022-08-31T19:44:58Z)
- Imagining new futures beyond predictive systems in child welfare: A qualitative study with impacted stakeholders [89.6319385008397]
We conducted a set of seven design workshops with 35 stakeholders who have been impacted by the child welfare system.
We found that participants worried that current predictive risk models (PRMs) perpetuate or exacerbate existing problems in child welfare.
Participants suggested new ways to use data and data-driven tools to better support impacted communities.
arXiv Detail & Related papers (2022-05-18T13:49:55Z)
- StoryBuddy: A Human-AI Collaborative Chatbot for Parent-Child Interactive Storytelling with Flexible Parental Involvement [61.47157418485633]
We developed StoryBuddy, an AI-enabled system for parents to create interactive storytelling experiences.
A user study validated StoryBuddy's usability and suggested design insights for future parent-AI collaboration systems.
arXiv Detail & Related papers (2022-02-13T04:53:28Z)
- SkillBot: Identifying Risky Content for Children in Alexa Skills [4.465104643266321]
Children benefit from the rich functionalities of VPAs but are also exposed to new risks in the VPA ecosystem.
We build a Natural Language Processing-based system to automatically interact with VPA apps.
We identify 28 child-directed apps with risky content and maintain a growing dataset of 31,966 non-overlapping app behaviors.
arXiv Detail & Related papers (2021-02-05T19:07:39Z)
- Betrayed by the Guardian: Security and Privacy Risks of Parental Control Solutions [0.0]
We present an experimental framework for systematically evaluating security and privacy issues in parental control software and hardware solutions.
Our analysis uncovers pervasive security and privacy issues that can lead to leakage of private information, and/or allow an adversary to fully control the parental control solution.
arXiv Detail & Related papers (2020-12-11T17:06:00Z)
- Stop Bugging Me! Evading Modern-Day Wiretapping Using Adversarial Perturbations [47.32228513808444]
Mass surveillance systems for voice over IP (VoIP) conversations pose a great risk to privacy.
We present an adversarial-learning-based framework for privacy protection for VoIP conversations.
arXiv Detail & Related papers (2020-10-24T06:56:35Z)
- I-ViSE: Interactive Video Surveillance as an Edge Service using Unsupervised Feature Queries [70.69741666849046]
This paper proposes an Interactive Video Surveillance as an Edge service (I-ViSE) based on unsupervised feature queries.
An I-ViSE prototype was built following the edge-fog computing paradigm, and experimental results verify that the I-ViSE scheme meets its design goal of scene recognition in less than two seconds.
arXiv Detail & Related papers (2020-03-09T14:26:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.