Brain Sensors for Better Learning

A new artificial intelligence system designed at Tufts makes it quicker and easier to learn the piano—is it the future of education?

In a fourth-floor Tufts lab, a computer program was in the process of convincing a student that she was actually interacting with a human. It was spring 2015, and the student had come to the lab for a study involving a new way of teaching people to play the piano.

The beginning of the session had been fairly unremarkable. The researchers—Beste Yuksel, E16, then a Ph.D. candidate in computer science, and Kurt Oleson, A15, a brain science major with a minor in music engineering—put a headband-like contraption on the student’s head and sat her down at a piano keyboard. In front of the keyboard was a computer screen that displayed the soprano line of a Bach piano chorale that she was supposed to play.

As a beginner pianist, she was seeing the piece for the first time, so it was optimistic to think that she’d be able to play it well. Still, the student concentrated hard on the notes, her hand hovering tentatively over the keyboard as she tried to move the appropriate finger to the appropriate note at the appropriate time. It was difficult, but eventually she managed to get the hang of it.

And that’s when things began to get weird. At the precise moment that the student got comfortable with the soprano line, it was joined on the screen by the bass line. Then, as soon as she got the hang of playing those two lines together, a third one, the alto line, materialized. And finally, once she was used to playing those three at the same time, the tenor line appeared on the screen. As if by magic, a new challenge always seemed to show up just when she was ready for it, until she found herself playing all the parts of the piece together.

After she was done, the student sat down with Yuksel and Oleson, eager to compliment them on how intuitive they had been, how exquisitely attuned to her needs as a learner. Their timing was great, she told them. That’s when she learned that the researchers hadn’t been deciding when to present her with the next line at all: the decisions had been made by a computer program using machine learning, a type of artificial intelligence.

Neuroscience and Interactive Digital Learning

Yuksel and Oleson call their AI system Brain Automated Chorales, or BACh. It’s the first AI system to collect brain data and use that information to adapt a task for learners in real time. “It’s a huge deal,” said H. Chad Lane, an educational psychologist at the University of Illinois at Urbana-Champaign who studies intelligent technologies for learning. “No one has really successfully integrated neuroscience into interactive digital learning very well yet.”

BACh employs a technology known as functional near-infrared spectroscopy, or fNIRS. Near-infrared light is shone through the scalp and skull, and the contraption worn by the subject—a pair of rubbery sensor pads secured to the forehead with a navy blue Tufts athletic headband and connected by wires to a white, microwave-sized box—measures fluctuating blood-oxygen levels up to three and a half centimeters deep in the brain. That reach includes part of the prefrontal cortex, a key area for tasks like note reading that require planning, attention, problem solving and working memory.

BACh uses sensors like this one. Photo: Courtesy of Beste Yuksel
The data on oxygen levels in the prefrontal cortex allows BACh to calculate a person’s “cognitive load” from moment to moment—the higher that load, the harder the brain is working. When cognitive load drops below a certain threshold, it signals that the person is ready for something more difficult, which is when the system introduces an additional line of the chorale.
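
How that decision might get made can be pictured in a few lines of Python (the language Oleson used elsewhere in the project). This is only an illustrative sketch: the window size, threshold, and function names are assumptions, not details from the published study.

```python
from collections import deque

# Illustrative sketch only; BACh's actual algorithm is not spelled out
# in the article. Window size and threshold are invented for clarity.

WINDOW = 20           # number of recent fNIRS readings to average
THRESHOLD = 0.4       # normalized load below which the learner is "ready"
STABLE_READINGS = 15  # the low load must persist, not just dip briefly

def next_line_signals(oxy_stream):
    """Yield True whenever the learner seems ready for another line.

    `oxy_stream` is an iterable of normalized prefrontal oxygenation
    readings (0.0 = resting baseline, 1.0 = maximum observed load).
    """
    window = deque(maxlen=WINDOW)
    calm_count = 0
    for reading in oxy_stream:
        window.append(reading)
        load = sum(window) / len(window)  # smoothed cognitive load
        calm_count = calm_count + 1 if load < THRESHOLD else 0
        if calm_count >= STABLE_READINGS:
            calm_count = 0
            yield True   # caller adds the next voice to the display
        else:
            yield False
```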

The idea is that meting out material in this way helps make learning more efficient—and the evidence suggests that it does. Each of the 16 beginner piano players in Yuksel and Oleson’s study learned one Bach chorale with help from BACh and a second chorale without it.

The sensors are secured to the forehead with an athletic headband. Photo: Courtesy of Beste Yuksel
With BACh’s help, they learned to play the piece at faster tempos, played more correct notes and fewer incorrect ones, and missed fewer notes overall. The subjects told Yuksel and Oleson that playing seemed easier with BACh, which gave them a better sense of mastery. Those with less experience benefited more, and Yuksel thinks the approach could be applied to any field, such as math or engineering, in which the material to be mastered can be clearly divided into increasing levels of difficulty.

Bringing Up Bach

BACh grew out of Oleson’s own experience as a volunteer subject. In the spring of 2014, when he was a junior, he participated in an earlier experiment Yuksel had designed. That one also used fNIRS to measure a pianist’s cognitive load. “Seeing that technology, I just knew that it had some great potential,” Oleson said. The next fall, he emailed Yuksel to see if they could work together for his senior thesis. Yuksel, who has master’s degrees in both neuroscience and computer science, agreed, and the two set up shop in the lab of their advisor, Robert Jacob, a professor of computer science who has pioneered similar studies to measure cognitive load. (See the story and video “A Load Off Your Mind.”)

Yuksel built the computer algorithm that used the fNIRS information to calculate a subject’s cognitive load and determine when to give a new line of music. Meanwhile, it was Oleson who suggested using Bach chorales for the experiment.

A piano player since the age of 5, when a babysitter taught him Christmas carols on a keyboard, he’d learned during a music theory course at Tufts that the chorales follow strict rules about which notes can be played at the same time and which notes can follow others. That keeps the chorales at a similar level of difficulty, making them ideal for a controlled experiment. After each subject participated in the experiment, Oleson would listen for missed and wrong notes in their playing. He also wrote a Python script to analyze the recordings for problems with tempo and timing.
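
Oleson’s script itself isn’t reproduced in the article, but the gist of such a timing check is easy to sketch in Python. Everything here, from the data format to the tolerance and the function names, is a hypothetical reconstruction, not his code.

```python
# Hypothetical reconstruction of a tempo/timing check; Oleson's actual
# script and its data format were not published with the article.

TOLERANCE = 0.08  # seconds of deviation allowed before a note is flagged

def timing_errors(expected_onsets, played_onsets, tolerance=TOLERANCE):
    """Pair each expected note onset with the played one and flag slips.

    Both arguments are lists of onset times in seconds, one entry per
    note, in score order.
    """
    errors = []
    for i, (want, got) in enumerate(zip(expected_onsets, played_onsets)):
        drift = got - want
        if abs(drift) > tolerance:
            errors.append((i, drift))  # note index and how early/late
    return errors

def tempo_ratio(expected_onsets, played_onsets):
    """Rough tempo check: total played duration over total score duration."""
    return (played_onsets[-1] - played_onsets[0]) / (
        expected_onsets[-1] - expected_onsets[0])
```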

Last May, Yuksel presented their research at the ACM CHI Conference on Human Factors in Computing Systems, where the report won a best paper award. Oleson, meanwhile, won Tufts’ DeFlorez Prize in Human Engineering, awarded to only one student a year, for the portion of the work that comprised his senior thesis.

What sets BACh apart is its portability. Much of cognitive neuroscience research relies on functional magnetic resonance imaging (fMRI), which means that subjects have to lie completely still inside the large, clanking metal tube of a scanner. With a wearable system like BACh, by contrast, it becomes possible to envision brain-based AI tutoring systems that students could use in daily life—while doing homework, for example.

BACh is just one of several projects across the country exploring how AI can improve learning. Ruth Wylie, the assistant director of the Center for Science and the Imagination at Arizona State University, pointed out that AI can work with data from sensors that measure a variety of physiological responses, not just brain activity.

Take skin conductance—that is, the skin’s ability to conduct electricity. It turns out that stress causes skin conductance to increase, possibly indicating when the demands on a student have become overwhelming. And video footage can show revealing facial expressions and even pulse rates. All of that information could be incorporated into AI systems that calculate when a learner is ready for new information.
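
None of the systems in this story necessarily combine signals this way, but the fusion idea is simple enough to sketch. In this hypothetical Python example, the signal names, weights, and thresholds are all invented for illustration.

```python
# A sketch of multi-sensor fusion, not any particular system's method.
# Signal names, weights, and thresholds are illustrative assumptions.

WEIGHTS = {
    "cognitive_load": 0.5,    # from fNIRS, normalized to 0-1
    "skin_conductance": 0.3,  # rises with stress, normalized to 0-1
    "pulse_rate": 0.2,        # from video or a wristband, normalized to 0-1
}

def readiness(signals, overwhelmed=0.7):
    """Combine normalized physiological signals into one strain score
    and decide whether to add, hold, or reduce material."""
    strain = sum(WEIGHTS[name] * signals[name] for name in WEIGHTS)
    if strain >= overwhelmed:
        return "reduce"   # learner looks overwhelmed: pull material back
    if strain <= 1 - overwhelmed:
        return "advance"  # plenty of spare capacity: add the next step
    return "hold"         # in between: keep the current material
```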

Another tack has been taken by Arthur Graesser, co-director of the Institute for Intelligent Systems at the University of Memphis, and Sidney D’Mello, an associate professor of psychology and computer science at the University of Notre Dame. Their AI system, called AutoTutor, measures student engagement via pressure-sensing pads that are draped over the seat and back of an office chair. When boredom causes a student to slump or shift around, the system picks up on it and adjusts the pace and difficulty of the lesson.
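
Again, the article doesn’t say how AutoTutor turns pressure readings into an engagement estimate, but a toy version of the idea might look like the Python below; the measures and limits are guesses made up for illustration.

```python
import statistics

# Illustrative guess at posture-based engagement sensing; AutoTutor's
# actual signal processing is not described in the article.

def posture_features(back_pressure, seat_pressure):
    """Summarize recent chair-pad readings.

    Each argument is a list of recent pressure samples. A sustained drop
    in back-pad pressure suggests slumping; high variance in the seat
    pad suggests restless shifting.
    """
    slump = back_pressure[0] - back_pressure[-1]
    fidget = statistics.pvariance(seat_pressure)
    return slump, fidget

def adjust_lesson(slump, fidget, slump_limit=5.0, fidget_limit=2.0):
    """Crude rule: signs of boredom mean the lesson should change pace."""
    if slump > slump_limit or fidget > fidget_limit:
        return "vary the pace and difficulty"  # boredom suspected
    return "keep the current pace"
```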

Meanwhile, Matthew Berland, a computer scientist at the University of Wisconsin–Madison, is using AI to improve collaborative rather than individual learning. In one of his projects, an AI system called Amoeba analyzed computer code written by junior high and high school students, calculating which of them would learn well together by looking for patterns in their work and in the ways they solved problems. Amoeba then assigned the students to groups of four, each tasked with writing code to control a robot soccer team. Sure enough, Berland said, the groups assembled by Amoeba built programs with more complexity, depth and interacting elements than those created without help from the system.
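
The article doesn’t describe Amoeba’s algorithm, but one plausible reading is a similarity-based grouping: fingerprint each student’s code, then group students whose fingerprints are close. The Python below is a made-up sketch of that idea, with deliberately crude features.

```python
# Hypothetical sketch of similarity-based grouping; Amoeba's actual
# algorithm is not described in the article.

def features(code):
    """Crude stylistic fingerprint of a student's program."""
    lines = code.splitlines()
    return (len(lines),
            sum(1 for l in lines if "def " in l),   # functions defined
            sum(1 for l in lines if "for " in l))   # loops used

def distance(code_a, code_b):
    return sum(abs(x - y) for x, y in zip(features(code_a), features(code_b)))

def make_groups(programs, size=4):
    """Greedily group students whose code looks most alike.

    `programs` maps each student's name to their source code.
    """
    remaining = dict(programs)
    groups = []
    while len(remaining) >= size:
        # Seed with an arbitrary student, then pull in nearest neighbors.
        seed = next(iter(remaining))
        others = sorted((s for s in remaining if s != seed),
                        key=lambda s: distance(remaining[seed], remaining[s]))
        group = [seed] + others[:size - 1]
        for s in group:
            del remaining[s]
        groups.append(group)
    return groups
```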

Gentle on My Mind

As exciting as their potential may be, AI systems used in a classroom setting do raise privacy issues. “When a system is tracking a student as they learn a skill, it’s often visible to other students,” said Lane, the educational psychologist. He said that schools using such a system should require parental consent before sharing educational data. “As a parent myself, I’d prefer to see a permission slip,” he said.

Then again, Lane said that if his own child brought home one of those permission slips, he would sign it right away. “The potential is really great to get more and more accurate assessments of a student,” he said. “The exciting thing is what the combined power of all these techniques will mean for the future of learning.”

Yuksel completed her Ph.D. research at Tufts in June and is now an assistant professor of computer science at the University of San Francisco. At a lab there, she continues to work on BACh. The next step, she said, will be to add emotion-sensing capabilities that use facial recognition technology from the software startup Affectiva, and a wristband from Empatica, a company that develops wearable devices to monitor responses like heart rate, movement, stress and skin temperature. “Learning is such an emotional process,” Yuksel said. “We use emotion all the time in our rational thinking.”

The ability to sense emotions may allow BACh to respond more effectively to users, she said, by recognizing not just when they need more information but also when they need less. Relying on cognitive-load readings to make that determination is tricky, she said, because a high load could indicate that learners are overwhelmed—or that they are in a productive learning zone. Assessing a user’s emotional state, perhaps by looking for signs of escalating stress or anxiety, could be a better approach.

For example, Yuksel said, “research shows that when people feel frustrated, that often precedes just giving up. So if we could measure someone’s emotional state, we could adapt and maybe decrease information” in those moments. BACh could also recommend good times to take a short break, she said.
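
Put together, load and emotion could drive a simple decision rule. The following hypothetical Python sketch shows the shape of such a rule; the thresholds and the idea of a single “frustration” score are assumptions, since Yuksel’s emotion-sensing version of BACh is still in progress.

```python
# Illustrative decision rule only; the emotion-aware version of BACh
# described by Yuksel is still being built, so all details are invented.

def adapt(cognitive_load, frustration, high_load=0.7, frustrated=0.6):
    """Use emotional state to disambiguate a high cognitive-load reading.

    Both inputs are normalized to 0-1. High load plus rising frustration
    suggests overload; high load with low frustration suggests a
    productive learning zone.
    """
    if cognitive_load >= high_load and frustration >= frustrated:
        return "decrease material and suggest a short break"
    if cognitive_load >= high_load:
        return "hold steady: likely a productive struggle"
    return "ready for more material"
```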

Yuksel said that she has no immediate plans to turn BACh into a commercial venture, but looks forward to the day when it could become more widely accessible. Right now, checking in on someone’s brain with fNIRS requires a big, clunky machine, but she predicts that will change. “Technology is being developed to make fNIRS portable, so you could have a device that you could wear,” she said. When that happens, perhaps in 30 or 40 years, she wants systems like BACh to be ready. “We’re going to want computer systems that can respond intelligently,” she said. “So the way I see my research is, I’m starting to build those systems.”

Anna Nowogrodzki has written for National Geographic, NOVA, Nature, and MIT Technology Review.

This article first appeared in the Fall 2016 issue of Tufts Magazine.
