Cognitive Dissonance as a Measure of Reactions to Human-Robot Interaction

Dan Levin, Caroline Harriott, Natalie A. Paul, Tao Zheng, Julie A. Adams


When people interact with intelligent agents, they likely draw upon a wide range of existing knowledge about machines, minds, and intelligence. This knowledge not only guides these interactions; it can also be challenged, and potentially changed, by interaction experiences. We hypothesized that a key factor mediating conceptual change in response to human-machine interactions is cognitive conflict, or dissonance. In this experiment, we evaluated whether interactions with a robot partner during a realistic medical triage scenario caused increased levels of cognitive dissonance relative to a control condition in which the same task was performed with a human partner. In addition, we evaluated whether heightened levels of dissonance affected concepts about agents. We observed increased cognitive dissonance after the human-robot interaction and found that this dissonance was correlated with a significantly less intentional (i.e., less human-like) view of the intelligence inherent to computers.




This work is licensed under a Creative Commons Attribution 3.0 License.