Do you furrow your brows when you’re concentrating? Or touch your hand to your chin when you’re deep in thought? In the not-so-distant future, classroom-ready learning software may be able to pick up on those kinds of movements and deliver content most suited to a student’s emotional state.
When it comes to emotion-sensing artificial intelligence software, academics across the country are contributing to a growing body of research and experimental technology. But, this week, researchers at North Carolina State University said they were able to show that automated facial expression recognition could be nearly as accurate as human recognition in analyzing a wider range of student movements and gestures. The work, they said, moves them closer to developing an artificial intelligence program for teaching computer science, called JavaTutor, that adapts not only to a student’s skill level but also to her emotions, including frustration, engagement or confusion.
“We want systems that are more responsive and adaptive to the student – and systems that are based on empirical findings – to build motivation and keep confidence going,” said Joseph Grafsgaard, a Ph.D. student at NCSU and lead author of the paper.
He and Kristy Boyer, an assistant professor of computer science at NCSU, used the Computer Expression Recognition Toolbox (CERT), a sophisticated software package for analyzing facial expressions developed at the University of California San Diego, to evaluate the expressions of 65 college students participating in online tutoring sessions. They determined that the program was able to identify facial movements associated with learning-related emotions like frustration and concentration and that it was consistent with expert human evaluators 85 percent of the time.
As part of the study, they also tested students before and after each tutoring session to measure knowledge gains and asked them to report how effective they felt the session was.
Positive feedback can backfire if students don’t need it
Combining the CERT data with the students’ self-assessments, the researchers built models that could predict how effective a session was based on what the students’ facial expressions indicated about their emotions.
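In broad strokes, a model of that kind could resemble a regression from facial-expression features to a session-effectiveness score. The sketch below is purely illustrative, not the researchers’ actual method: the feature values, effectiveness scores, and the choice of a simple linear model are all invented here for demonstration. (CERT does report intensities for facial action units, such as brow lowering, which is the kind of feature assumed below.)

```python
import numpy as np

# Illustrative only: each row holds made-up per-session averages of a few
# facial action unit intensities (e.g., brow lowering, lip tightening).
au_features = np.array([
    [0.8, 0.2, 0.1],
    [0.1, 0.9, 0.3],
    [0.2, 0.1, 0.7],
    [0.6, 0.4, 0.2],
])

# Hypothetical self-reported effectiveness scores for the same sessions.
effectiveness = np.array([4.5, 2.0, 3.8, 4.0])

# Fit a least-squares linear model: effectiveness ~ X @ weights,
# with a constant column appended for the intercept.
X = np.hstack([au_features, np.ones((len(au_features), 1))])
weights, *_ = np.linalg.lstsq(X, effectiveness, rcond=None)

# Predict effectiveness for a new (also made-up) session.
new_session = np.array([0.5, 0.3, 0.2, 1.0])
prediction = float(new_session @ weights)
```

In practice the researchers’ models were built from 65 students’ sessions and validated against expert human judgments, so any real implementation would involve far more careful feature selection and evaluation than this toy fit.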
All of the research will be folded into the next version of JavaTutor to ensure that the program provides the most effective emotion-relevant feedback. For example, said Grafsgaard, if a student completes a coding assignment that generates an error and he appears frustrated, the program could offer up a motivational message, like: “Keep at it – the more mistakes you make, the more you learn.” But if he appears confident and focused, it might provide minimal feedback and merely say “Good job.”
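The decision logic Grafsgaard describes amounts to conditioning feedback on a detected emotional state. A minimal sketch of that idea might look like the function below; the emotion labels, function name, and messages are assumptions for illustration, not JavaTutor’s actual code:

```python
def choose_feedback(error_occurred: bool, emotion: str) -> str:
    """Pick a tutoring message based on a detected emotional state.

    Illustrative sketch only: the labels and messages here are invented;
    JavaTutor's real feedback logic is not published in this form.
    """
    if error_occurred and emotion == "frustrated":
        # Motivational message for a frustrated student hitting errors.
        return "Keep at it - the more mistakes you make, the more you learn."
    if emotion == "focused":
        # Minimal feedback for a confident, focused student.
        return "Good job."
    # No intervention when the emotional signal is ambiguous.
    return ""
```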
Interestingly, while it may seem harmless to provide encouraging feedback to students who are not frustrated, Boyer said research indicates that it can actually backfire if students aren’t in the right emotional mindset. An otherwise focused student could feel patronized or annoyed by ill-timed positive feedback, she said.
“We do think that if we give motivational feedback at the wrong time it could be harmful to learning,” she said. “And the fact these messages are coming from a computer-based tutor could magnify the effect.”
More technology that monitors our emotions
Boyer and Grafsgaard’s work is part of a budding movement, in academia and outside of it, to build technology that is more responsive to our emotions. As a recent New York Times article pointed out, emotionally aware software isn’t without ethical and privacy questions, but it opens the door to technology that’s even more engaging and that fits more seamlessly into our lives. Startup Affectiva creates software that reads expressions to gauge feedback to ads and media. Beyond Verbal, another startup, can recognize emotion in human speech. Those types of technologies could be used to generate more personalized digital experiences or even influence person-to-person interactions.
In education, emotion-sensing technology could build on the already booming field of adaptive learning software that assesses students’ mastery and delivers content appropriate to their skill level. The AutoTutor program, developed out of the University of Memphis, and Wayang Outpost, created by researchers at the University of Massachusetts Amherst, also provide automated instruction guided by a student’s emotional state. But Grafsgaard said those programs are trained to process a more limited set of student gestures and expressions.
While emotion-aware educational technology could improve a student’s learning experience, some may worry that it could undermine the role of an in-person teacher. I also wondered whether the ways in which people express emotions are uniform across cultures. But Grafsgaard said the point of the software is not to replace traditional instruction, just to complement it. He also said that while more data is needed, the research suggests that facial expressions (although not necessarily other gestures) are indeed similar across different cultures.
Boyer said they chose to focus on computer science learning because of the national need for more professionals in the STEM (science, technology, engineering and math) fields.
“In the fairly short term, in less than 10 years, we could deploy a system that can substantially improve students’ experiences in computing classes,” she said. “We hope so, in particular, for students who are at risk for giving classes a try and then feeling overwhelmed and dropping out.”