Abstract
In this study, we investigated the impact of a match in personality between a chatbot and the user. Previous research has proposed that personality can offer a stable pattern to how chatbots are perceived and add consistency to the user experience. These assumptions were investigated by measuring the effects of two chatbot agents, with two levels of personality, on the user experience. This study found that personality has a significant positive effect on the user experience of chatbot interfaces, but that this effect depends on the context, the job the chatbot performs, and its user group.
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Smestad, T.L., Volden, F. (2019). Chatbot Personalities Matters. In: Bodrunova, S., et al. Internet Science. INSCI 2018. Lecture Notes in Computer Science(), vol 11551. Springer, Cham. https://doi.org/10.1007/978-3-030-17705-8_15
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-17704-1
Online ISBN: 978-3-030-17705-8
eBook Packages: Computer Science, Computer Science (R0)