How Could AI Support Design Education? A Study Across Fields Fuels Situating Analytics
- URL: http://arxiv.org/abs/2404.17390v1
- Date: Fri, 26 Apr 2024 13:06:52 GMT
- Title: How Could AI Support Design Education? A Study Across Fields Fuels Situating Analytics
- Authors: Ajit Jain, Andruid Kerne, Hannah Fowler, Jinsil Seo, Galen Newman, Nic Lupfer, Aaron Perrine,
- Abstract summary: We use the process and findings from a case study of design educators' practices of assessment and feedback to fuel theorizing.
We theorize a methodology, which we call situating analytics, because making AI support living human activity depends on aligning what analytics measure with situated practices.
- Score: 3.362956277221427
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We use the process and findings from a case study of design educators' practices of assessment and feedback to fuel theorizing about how to make AI useful in service of human experience. We build on Suchman's theory of situated actions. We perform a qualitative study of 11 educators in 5 fields, who teach design processes situated in project-based learning contexts. Through qualitative data gathering and analysis, we derive codes: design process; assessment and feedback challenges; and computational support. We twice invoke creative cognition's family resemblance principle. First, to explain how design instructors already use assessment rubrics and second, to explain the analogous role for design creativity analytics: no particular trait is necessary or sufficient; each only tends to indicate good design work. Human teachers remain essential. We develop a set of situated design creativity analytics--Fluency, Flexibility, Visual Consistency, Multiscale Organization, and Legible Contrast--to support instructors' efforts, by providing on-demand, learning objectives-based assessment and feedback to students. We theorize a methodology, which we call situating analytics, firstly because making AI support living human activity depends on aligning what analytics measure with situated practices. Further, we realize that analytics can become most significant to users by situating them through interfaces that integrate them into the material contexts of their use. Here, this means situating design creativity analytics into actual design environments. Through the case study, we identify situating analytics as a methodology for explaining analytics to users, because the iterative process of alignment with practice has the potential to enable data scientists to derive analytics that make sense as part of and support situated human experiences.
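Two of the analytics named above, Fluency and Flexibility, have standard definitions in the creative cognition literature: the count of distinct elements produced and the number of categories those elements span. The sketch below illustrates those two measures only; the data model and field names are invented for illustration and are not from the paper.

```python
# Hypothetical sketch of two of the situated design creativity analytics
# named in the abstract. Fluency and Flexibility follow their standard
# creative-cognition definitions; the element/category data model here
# is an assumption, not the paper's actual representation.

def fluency(elements):
    """Fluency: the count of design elements produced."""
    return len(elements)

def flexibility(elements):
    """Flexibility: the number of distinct categories spanned."""
    return len({e["category"] for e in elements})

design = [
    {"id": 1, "category": "image"},
    {"id": 2, "category": "image"},
    {"id": 3, "category": "text"},
    {"id": 4, "category": "sketch"},
]

print(fluency(design))      # 4 elements
print(flexibility(design))  # 3 categories
```

The other analytics (Visual Consistency, Multiscale Organization, Legible Contrast) depend on visual features of the design environment and are not reducible to counts like these.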
Related papers
- Data Analysis in the Era of Generative AI [56.44807642944589]
This paper explores the potential of AI-powered tools to reshape data analysis, focusing on design considerations and challenges.
We explore how the emergence of large language and multimodal models offers new opportunities to enhance various stages of data analysis workflow.
We then examine human-centered design principles that facilitate intuitive interactions, build user trust, and streamline the AI-assisted analysis workflow across multiple apps.
arXiv Detail & Related papers (2024-09-27T06:31:03Z)
- Indexing Analytics to Instances: How Integrating a Dashboard can Support Design Education [14.45375751032367]
We develop a research artifact integrating a design analytics dashboard with design instances, and the design environment that students use to create them.
We develop research implications addressing how AI-based design analytics can support instructors' assessment and feedback experiences in situated course contexts.
arXiv Detail & Related papers (2024-04-08T11:33:58Z)
- Evaluating and Optimizing Educational Content with Large Language Model Judgments [52.33701672559594]
We use Language Models (LMs) as educational experts to assess the impact of various instructions on learning outcomes.
We introduce an instruction optimization approach in which one LM generates instructional materials using the judgments of another LM as a reward function.
Human teachers' evaluations of these LM-generated worksheets show a significant alignment between the LM judgments and human teacher preferences.
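The optimization loop this entry describes can be sketched abstractly: one model proposes instructional materials, and a second model's judgment serves as the reward for selecting among them. In this minimal sketch both models are stubbed as plain functions; the generator, the length-based judge, and the greedy selection are all illustrative assumptions, not the paper's method.

```python
# Minimal sketch of an LM-as-judge instruction-optimization loop.
# Both "models" are stubs: a real system would query language models here.

def generate_materials(instruction):
    # Stub generator: returns candidate worksheets for an instruction.
    return [f"{instruction} (variant {i})" for i in range(3)]

def judge_score(worksheet):
    # Stub judge: longer worksheets score higher, purely for illustration.
    # A real judge model would rate pedagogical quality instead.
    return len(worksheet)

def optimize(instruction, rounds=2):
    # Greedily keep the candidate the judge scores highest each round.
    best, best_score = instruction, judge_score(instruction)
    for _ in range(rounds):
        for candidate in generate_materials(best):
            score = judge_score(candidate)
            if score > best_score:
                best, best_score = candidate, score
    return best

print(optimize("Explain fractions"))
```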
arXiv Detail & Related papers (2024-03-05T09:09:15Z)
- Evaluating the Utility of Model Explanations for Model Development [54.23538543168767]
We evaluate whether explanations can improve human decision-making in practical scenarios of machine learning model development.
To our surprise, we did not find evidence of significant improvement on tasks when users were provided with any of the saliency maps.
These findings suggest caution regarding the usefulness of saliency-based explanations and their potential for misunderstanding.
arXiv Detail & Related papers (2023-12-10T23:13:23Z)
- The LAVA Model: Learning Analytics Meets Visual Analytics [0.0]
Human-centered learning analytics (HCLA) emphasizes human factors in learning analytics so that analytics truly meet user needs.
Visual analytics is a multidisciplinary data science research field that follows a human-centered approach.
This paper explores the benefits of incorporating visual analytics concepts into the learning analytics process.
arXiv Detail & Related papers (2023-03-22T08:57:42Z)
- Introducing Practicable Learning Analytics [0.0]
This book introduces the concept of practicable learning analytics to illuminate what learning analytics may look like from the perspective of practice.
We use the concept of Information Systems Artifact (ISA), which comprises three interrelated subsystems: the informational, the social, and the technological artifacts.
The ten chapters in this book are presented and reflected upon from the ISA perspective, clarifying that detailed attention to the social artifact is critical to the design of practicable learning analytics.
arXiv Detail & Related papers (2023-01-26T21:20:08Z)
- An Interdisciplinary Perspective on Evaluation and Experimental Design for Visual Text Analytics: Position Paper [24.586485898038312]
In this paper, we focus on the issues of evaluating visual text analytics approaches.
We identify four key groups of challenges for evaluating visual text analytics approaches.
arXiv Detail & Related papers (2022-09-23T11:47:37Z)
- Distributed intelligence on the Edge-to-Cloud Continuum: A systematic literature review [62.997667081978825]
This review aims at providing a comprehensive vision of the main state-of-the-art libraries and frameworks for machine learning and data analytics available today.
The main simulation, emulation, deployment systems, and testbeds for experimental research on the Edge-to-Cloud Continuum available today are also surveyed.
arXiv Detail & Related papers (2022-04-29T08:06:05Z)
- A Field Guide to Federated Optimization [161.3779046812383]
Federated learning and analytics are distributed approaches for collaboratively learning models (or statistics) from decentralized data.
This paper provides recommendations and guidelines on formulating, designing, evaluating and analyzing federated optimization algorithms.
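The canonical federated optimization algorithm such guides analyze is federated averaging (FedAvg): each client takes local training steps on its own data, and the server averages the resulting models, weighting by client dataset size. The sketch below shows the structure on a toy scalar model with synthetic clients; it is an illustration of the general algorithm, not the paper's specific recommendations.

```python
# Compact sketch of federated averaging (FedAvg) on a toy problem:
# the "model" is a single scalar fit to each client's data by squared error.

def local_step(weight, data, lr=0.5):
    # One local gradient step on squared error toward the client's data mean.
    grad = sum(weight - x for x in data) / len(data)
    return weight - lr * grad

def fedavg(client_data, rounds=20):
    w = 0.0
    for _ in range(rounds):
        # Each client trains locally on its private data...
        updates = [local_step(w, d) for d in client_data]
        # ...and the server averages updates, weighted by dataset size.
        sizes = [len(d) for d in client_data]
        w = sum(u * n for u, n in zip(updates, sizes)) / sum(sizes)
    return w

clients = [[1.0, 2.0, 3.0], [5.0], [2.0, 4.0]]
print(round(fedavg(clients), 2))  # approaches the global mean 17/6 ≈ 2.83
```

With this weighting, the fixed point is the mean of all data pooled across clients, which is why weighting by dataset size matters when clients hold unequal amounts of data.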
arXiv Detail & Related papers (2021-07-14T18:09:08Z)
- AR-LSAT: Investigating Analytical Reasoning of Text [57.1542673852013]
We study the challenge of analytical reasoning of text and introduce a new dataset consisting of questions from the Law School Admission Test from 1991 to 2016.
We analyze what knowledge understanding and reasoning abilities are required to do well on this task.
arXiv Detail & Related papers (2021-04-14T02:53:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.