Religion-Related Values Differently Influence Moral Attitude for Robots in the United States and Japan

Shogo Ikari, Kosuke Sato, Emily Burdett, Hiroshi Ishiguro, Jonathan Jong, Yo Nakawake

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)
94 Downloads (Pure)

Abstract

Increasing evidence suggests that people show moral concern for robots among other nonhuman entities. Furthermore, people’s attitudes toward new automated technologies such as robots and artificial intelligence (AI) are influenced by their social backgrounds, including religion. Two specific religion-related values, namely animism and anthropocentrism, have been recognized to influence preference for and familiarity with robots. However, how these values affect moral care for robots under different religious traditions has not been studied. Here, we empirically examined how moral care for robots is influenced by religiosity (i.e., religious beliefs and religious attendance) and religion-related values (i.e., animism and anthropocentrism) in U.S. and Japanese samples, cultures that are grounded in Abrahamic and Shinto-Buddhist traditions, respectively (N = 3,781). Overall, moral care for robots was higher in Japan than in the United States, matching previous findings. Moral care for robots was negatively associated with religiosity in the United States and positively associated in Japan, although its variance was better explained by religion-related values than by religiosity. Furthermore, moral care for robots had a negative association with anthropocentrism in the United States and a positive association with animism in Japan. The findings demonstrate how religious tradition may influence moral attitudes toward robots, highlighting the role of cultural traditions in the realm of moral consideration.
Original language: English
Pages (from-to): 742-759
Number of pages: 18
Journal: Journal of Cross-Cultural Psychology
Volume: 54
Issue number: 6-7
Early online date: 23 Aug 2023
DOIs
Publication status: Published - 1 Sept 2023

Bibliographical note

Copyright © and Moral Rights are retained by the author(s) and/or other copyright owners. A copy can be downloaded for personal non-commercial research or study, without prior permission or charge. This item cannot be reproduced or quoted extensively without first obtaining permission in writing from the copyright holder(s). The content must not be changed in any way or sold commercially in any format or medium without the formal permission of the copyright holders.

This document is the author’s post-print version, incorporating any revisions agreed during the peer-review process. Some differences between the published version and this version may remain and you are advised to consult the published version if you wish to cite from it.

Funder

The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by a research grant from the Templeton World Charity Foundation (grant no. TWCF0164), the Japan Society for the Promotion of Science (grant no. 22K18150), the Toshiba International Foundation, and the Toyota Foundation (grant no. D21-ST-0012).

Keywords

  • human-robot interaction
  • religion
  • social psychology
  • moral foundations theory
  • care
  • cross-cultural study

