Dr. Sheryl Brahnam

The Economist article "Call and response" (Computing: Nobody enjoys telephoning a call centre. Could "chatbot" technology make the experience less painful?) said:
There is more to handling call-centre queries than simply understanding language and looking things up in databases. Sheryl Brahnam, a researcher at Missouri State University in Springfield, suggests that it will also be necessary to program chatbots to deal with verbal abuse. In some cases, she says, companies that have used chatbots to handle online queries have found that, when confronted with verbal abuse or sexual innuendo, the chatbots had been programmed, inappropriately, to respond in kind with insults of their own.
Dr Brahnam has also found that the appearance of the chatbot’s on-screen persona, or avatar, has a significant impact on how much abuse is levelled at it. “My study showed that you get more abuse and sexual comments with a white female compared with a white male,” she says. Black female avatars were the most abused of all. This leads Dr Brahnam to question how effective IBM’s electronic-elocution lessons will prove to be. Even if two operators are using the same script, she says, some callers may respond differently (or even abusively) depending on the operator’s gender or accent.
Never mind the philosophical question of whether it is wrong to insult a machine. To neutralise such situations, chatbots must be able to handle verbal abuse constructively, says Dr Brahnam. She is now devising ways to program chatbots with the sorts of rules that human operators use. There are two broad approaches. The first is a “three strikes and you’re out” approach in which the chatbot repeatedly warns the customer to stop being abusive, and eventually hangs up or passes the call over to a human manager. The second approach is more psychological. Giving some ground to customers and acknowledging that they have been wronged, and that their frustration is legitimate and understandable, can help to restore calm and allow the call to proceed.
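The two policies described above can be sketched in code. This is a hypothetical illustration, not Dr Brahnam's actual system: the keyword-based abuse detector, the strike limit, and all response strings are invented placeholders standing in for whatever rules a real deployment would use.

```python
# Toy abuse detector -- a real system would use a trained classifier,
# not a keyword list. The words here are illustrative only.
ABUSIVE_WORDS = {"stupid", "idiot", "useless"}

def is_abusive(utterance: str) -> bool:
    return any(word in utterance.lower() for word in ABUSIVE_WORDS)

class ChatbotSession:
    """Combines the two strategies: acknowledge frustration first
    (the psychological approach), then warn, then escalate after
    three strikes (the "three strikes and you're out" approach)."""

    MAX_STRIKES = 3

    def __init__(self):
        self.strikes = 0

    def respond(self, utterance: str) -> str:
        if not is_abusive(utterance):
            return "How can I help you?"
        self.strikes += 1
        if self.strikes >= self.MAX_STRIKES:
            # Strike three: end the exchange or hand off to a human.
            return "Transferring you to a human representative."
        if self.strikes == 1:
            # First strike: give some ground and acknowledge the
            # caller's frustration to try to restore calm.
            return ("I'm sorry you've had a frustrating experience. "
                    "Let's see if we can put it right.")
        # Second strike: an explicit warning.
        return "Please keep the conversation civil."
```

In this sketch the empathetic acknowledgement is simply the first rung of the escalation ladder; a production system might instead run the two strategies as separate, configurable policies.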
Her research interests include:
- Decision support systems: medical DSS, especially DSS using facial expression detection (e.g., neonatal facial pain detection systems)
- Virtual humans and embodied conversational agents: especially as they function as virtual sales agents, business representatives, and navigational aids
- Artificial intelligence and computer vision: face recognition (support vector machines, neural networks, and classifier ensembles)
- Modeling and simulation: smart embodiment (facial attribution modeling and synthesis)
- Electronic face enhancement: small screen facial correction and attribution adjustment systems
- Social, cultural, ethical, and educational aspects of technology: conflict management among IT staff; gender and IT; self-directed learning and the use of content management systems, message boards, and other community-building technologies; and the abuse, misuse, and creative misuse of interactive technologies
Sheryl earned a Master of Fine Arts in Intermedia (1991) and a Master of Science in Computer Science (1997), both from the City College of New York, followed by an M.Phil. (2002) and a Ph.D. (2002) in Computer Science from the Graduate Center of The City University of New York.
Read about her research on medical face detection in the MIT Technology Review article "Assessing pain in infants: New software could help medical staff know when newborn patients are in pain."