Tufts researchers help create an award-winning ASL database that allows users to look up a sign without having to guess at its English translation
Although American Sign Language, used by 250,000 people in the United States, is widely recognized as a rich, complex language, ASL learners and researchers have never enjoyed the kind of large, comprehensive database available for other languages—until now.
A new database of more than 1,000 ASL signs and their lexical and phonological properties, developed by students and faculty at Tufts University and the Laboratory for Language and Cognitive Neuroscience at San Diego State University, won first place late last month in the people’s choice interactive category of the National Science Foundation’s 2017 Vizzies: Visualization Challenge. The competition recognizes visual conceptualizations that help general audiences understand complex ideas in science and engineering.
Called ASL-LEX, the project is the largest and most thorough database of ASL signs and meanings to date. It is already being used by schools including the Learning Center for the Deaf in Framingham, Massachusetts, and the Horace Mann School for the Deaf in Boston, according to Ariel Goldberg, an associate professor of psychology who heads the Psycholinguistics and Linguistics Lab at Tufts.
“The database will have a broad appeal for psychologists, linguists and others who are working on the science of language and doing their own experiments,” said Goldberg, whose former doctoral student, Naomi Caselli, G15, began creating the database for her dissertation in cognitive science.
Caselli was studying ASL “rhymes,” or “neighbors”—signs that resemble one another in “sign form,” such as handshape or type of movement. For instance, the sign for apple and the sign for onion are “rhymes” because they are formed in the same way, except that apple is produced on the cheek and onion is produced at eye level.
Working with Goldberg and Tufts students fluent in ASL, Caselli developed a coding system that indicates various aspects of “sign form,” including where a sign is made on the body, whether it uses one or both hands, and how the fingers move. She also included information about how frequently each sign is used in everyday conversation, how much a sign resembles the object or action it represents, and what grammatical class the sign belongs to.
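To make that coding concrete, here is a minimal sketch in Python of what one record in such a database might look like. The field names, scales, and values are hypothetical illustrations, not the actual ASL-LEX schema.

```python
from dataclasses import dataclass

@dataclass
class SignEntry:
    """One hypothetical record in an ASL-LEX-style database."""
    gloss: str              # English label identifying the sign
    location: str           # where on the body the sign is produced
    handedness: int         # 1 = one-handed, 2 = two-handed
    movement: str           # type of hand or finger movement
    frequency: float        # how often the sign occurs in conversation (invented scale)
    iconicity: float        # how much the sign resembles its referent (invented scale)
    grammatical_class: str  # e.g., "noun" or "verb"

# Two signs that are "rhymes" in the article's sense: identical in form
# except for where they are produced. (All values are made up.)
apple = SignEntry("APPLE", "cheek", 1, "twist", 5.9, 2.1, "noun")
onion = SignEntry("ONION", "eye", 1, "twist", 4.2, 1.8, "noun")
```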
Caselli discovered that signs with many neighbors took longer to recognize than those with fewer.
“When you’re seeing a sign, you have to go through certain stages to translate the signal, map it to long-term memory, and retrieve the meaning,” Goldberg said. “If a sign has many neighbors, which we call high neighborhood density, it’s slower to process—to understand—that sign.”
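Continuing the hypothetical sketch above, one simple way to operationalize neighbors and neighborhood density is to count signs whose coded forms differ in exactly one property. This is only an illustration; ASL-LEX’s actual neighborhood measures are defined over its full phonological coding.

```python
def form_features(sign: SignEntry) -> tuple:
    """The form properties compared when looking for neighbors (illustrative)."""
    return (sign.location, sign.handedness, sign.movement)

def are_neighbors(a: SignEntry, b: SignEntry) -> bool:
    """Two signs are neighbors if their forms differ in exactly one property."""
    differences = sum(x != y for x, y in zip(form_features(a), form_features(b)))
    return differences == 1

def neighborhood_density(sign: SignEntry, lexicon: list) -> int:
    """Count a sign's neighbors in the lexicon: the higher the density,
    the slower the recognition, per the finding described above."""
    return sum(are_neighbors(sign, other)
               for other in lexicon
               if other.gloss != sign.gloss)

# APPLE and ONION differ only in location, so each is a neighbor of the other.
assert are_neighbors(apple, onion)
assert neighborhood_density(apple, [apple, onion]) == 1
```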
Karen Emmorey and Zed Sevcikova Sehyr from the Laboratory for Language and Cognitive Neuroscience at San Diego State University also contributed to the database, which Caselli made publicly available to help other researchers with similar questions.
“Researchers have uncovered information about small sets of signs, but there hasn’t been a bird’s-eye view of the ASL lexicon until now,” said Caselli, now an assistant professor at Boston University’s School of Education.
Under Caselli’s direction and the supervision of David Grogan of Tufts Technology Services (TTS), Ben Tanen, E17, worked to make ASL-LEX accessible to those who want to look up ASL signs but lack a database background, including people learning or teaching ASL. TTS created a visual interface that groups signs by similarities in shape and location and lets users zoom in on different “neighborhoods.”
“Until now, there have been ASL dictionaries, but they are organized alphabetically based on English translations for each sign. That means you have to know or guess an English translation to look up the sign,” Caselli said. “ASL-LEX allows users to look up signs using the sign form directly.”
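A form-based lookup of this kind amounts to filtering the lexicon on whichever form properties the user can identify. The helper below, again using the hypothetical SignEntry fields from the sketch above, illustrates the idea; it is not the actual ASL-LEX interface, which presents this visually.

```python
def lookup_by_form(lexicon, *, location=None, handedness=None, movement=None):
    """Return every sign matching all of the form properties the user specifies."""
    matches = []
    for sign in lexicon:
        if location is not None and sign.location != location:
            continue
        if handedness is not None and sign.handedness != handedness:
            continue
        if movement is not None and sign.movement != movement:
            continue
        matches.append(sign)
    return matches

# A learner who saw a one-handed twisting sign at the cheek, but has no
# idea of its English translation, could still find it:
found = lookup_by_form([apple, onion], location="cheek", handedness=1)
print([s.gloss for s in found])  # ['APPLE']
```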
Another section of the database contains estimates of how often signs are used on a daily basis. By offering information about how easy signs are to retrieve from memory and how likely they are to be retained, ASL-LEX can help educators determine which signs to teach first, and assist teachers and others in assessing students’ proficiency in acquiring the language.
Caselli, Goldberg and Emmorey won an NSF grant that will enable them to add 1,500 more signs and many more sign properties to the database, along with information about how quickly people produce and identify each sign. They also intend to make the website more user-friendly, particularly for ASL teachers and learners, so that users can organize the signs as they choose, by part of speech or by handshape, for example.
For his part, Goldberg is working on another project inspired by ASL-LEX—a system for automated gesture recognition, which could allow cyclists to talk to drivers, drivers to talk to cars, scuba divers to talk to each other, and airport runway workers and crane operators to signal information—all without saying a word.
“I’m fundamentally amazed by language,” he said. “It’s something we use all the time every day without thinking about it. Thanks to remarkable advances in linguistics, we now know that although languages look very different from each other, they actually have a lot of commonalities.”
Monica Jimenez can be reached at monica.jimenez@tufts.edu.