Conference Paper

Trends and Trajectories for Explainable, Accountable and Intelligible Systems: An HCI Research Agenda

Association for Computing Machinery (ACM), 2018

DOI: 10.1145/3173574.3174156, Dimensions: pub.1103568851

Affiliations

Organisations

  1. National University of Singapore (grid.4280.e)
  2. Aarhus University (grid.7048.b, AU)

Countries

Singapore

Denmark

Continents

Asia

Europe

Description

Advances in artificial intelligence, sensors and big data management have far-reaching societal impacts. As these systems augment our everyday lives, it becomes increasingly important for people to understand them and remain in control. We investigate how HCI researchers can help to develop accountable systems by performing a literature analysis of 289 core papers on explanations and explainable systems, as well as 12,412 citing papers. Using topic modeling, co-occurrence and network analysis, we mapped the research space from diverse domains, such as algorithmic accountability, interpretable machine learning, context-awareness, cognitive psychology, and software learnability. We reveal fading and burgeoning trends in explainable systems, and identify domains that are closely connected or mostly isolated. The time is ripe for the HCI community to ensure that the powerful new autonomous systems have intelligible interfaces built-in. From our results, we propose several implications and directions for future research towards this goal.
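To give a concrete sense of the kind of topic-modeling step the abstract mentions, below is a minimal, illustrative sketch of fitting LDA over a handful of paper abstracts with scikit-learn. The sample texts, topic count, and parameters are hypothetical and not taken from the paper; the authors' actual pipeline also included co-occurrence and network analysis, which this sketch does not cover.

```python
# Illustrative sketch: LDA topic modeling over a tiny corpus of paper
# abstracts, roughly in the spirit of the literature analysis described
# above. Corpus contents and parameter values are hypothetical.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "explanations improve trust in intelligent systems",
    "interpretable machine learning models for accountability",
    "context-aware applications and intelligibility in ubiquitous computing",
    "software learnability and user-facing explanations",
]

# Bag-of-words representation of the abstracts.
vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(abstracts)

# Fit a small LDA model; n_components=2 is an arbitrary choice here.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term)

# Print the highest-weighted terms per topic.
terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {topic_idx}: {', '.join(top_terms)}")
```

On a real corpus of hundreds of core papers and thousands of citing papers, the resulting topic-term distributions would feed into the co-occurrence and network analyses used to map connected and isolated research domains.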

Links & Metrics

NORA University Profiles

Aarhus University

Dimensions Citation Indicators

Times Cited: 78

Field Citation Ratio (FCR): 64.89