In this policy brief, Abel Wajnerman Paz discusses “bonding chatbots,” a type of social, conversational artificial intelligence designed to embody a personal relationship — a friend, a sibling or a romantic partner. These artificial bonds are available 24/7 to attend to their users’ emotional needs and reduce their loneliness, but the personal and societal damage that this technology can cause is already tangible. Wajnerman Paz writes that “unlike therapist chatbots, which are presented as medical technologies backed by professional psychologists, bonding chatbots are presented as well-being technologies for everyday use. However, some of their users may be psychologically vulnerable people seeking therapeutic solutions for their mental health issues, and therefore at significant risk if exposed to emotional manipulation and deception. Treating bonding chatbots as a medical technology could help mitigate these risks.”