Can NLP Soon Include Signed Languages?

Natural language processing (NLP) is one of the most important subfields of artificial intelligence (AI). It enables computers to analyze and understand human language, and it powers translation software, chatbots, voice-controlled assistants, and tools across many industries. However, the field has largely focused on spoken languages, leaving out the 200-plus signed languages that nearly 70 million people rely on across the globe.

This could soon change as new research addresses the issue. Kayo Yin, a master's student in Carnegie Mellon University's Language Technologies Institute, is co-author of a new paper examining NLP for signed languages.

Including Signed Languages in NLP

The paper titled “Including Signed Languages in Natural Language Processing” received the award for Best Theme Paper at the 59th Annual Meeting of the Association for Computational Linguistics. The other co-authors included Amit Moryossef of Bar-Ilan University in Israel; Julie Hochgesang of Gallaudet University; Yoav Goldberg of Bar-Ilan University and the Allen Institute for AI; and Malihe Alikhani of the University of Pittsburgh's School of Computing and Information.

“Signed languages, even though they are a significant part of the languages used in the world, aren't included,” Yin said. “There is a demand and an importance in having technology that can handle signed languages.”

According to the authors, communities that use sign language have been fighting for decades to learn and use the languages and to have them be recognized as legitimate.

The Suppression of Sign Language

“However, in a predominantly oral society, deaf people are constantly encouraged to use spoken languages through lipreading or text-based communication,” the paper stated. “The exclusion of signed languages from modern language technologies further suppresses signing in favor of spoken languages.”

Yin’s interest in sign language began while she was doing outreach work at a homeless shelter as an undergraduate at École Polytechnique in Paris. Recognizing the difficulty deaf individuals faced in establishing connections with others, she began learning French Sign Language and made sign language translation a focus of her undergraduate research.

Sign languages, which rely mostly on hand gestures, facial expressions, and head and body movements, make it possible to convey multiple words at once, something spoken languages cannot do. Signed languages also use shortcuts that function much like pronouns in spoken languages. NLP tools are far more effective at modeling this kind of linguistic complexity than computer vision models alone.

“We need researchers in both fields to work hand in hand,” Yin said. “We can't fully understand signed language if we only look at the visuals.”
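That collaboration is easier to picture with a sketch. Below is a minimal, illustrative example in PyTorch, not taken from the paper, of how such a hybrid pipeline might be wired together: computer vision supplies per-frame pose keypoints (hands, face, body), and an NLP-style encoder-decoder treats the signing video as a source-language sequence to be translated. All module names and dimensions here are assumptions for illustration.

import torch
import torch.nn as nn

class SignTranslationSketch(nn.Module):
    """Hypothetical sign-to-text model: pose features in, text tokens out."""

    def __init__(self, pose_dim=150, d_model=256, vocab_size=8000):
        super().__init__()
        # Vision side: project per-frame pose keypoints into the model dimension.
        self.pose_proj = nn.Linear(pose_dim, d_model)
        # NLP side: a standard encoder-decoder transformer over the frame sequence.
        # (Positional encodings are omitted to keep the sketch short.)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            batch_first=True,
        )
        self.token_emb = nn.Embedding(vocab_size, d_model)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, pose_frames, target_tokens):
        # pose_frames: (batch, frames, pose_dim); target_tokens: (batch, words)
        src = self.pose_proj(pose_frames)
        tgt = self.token_emb(target_tokens)
        return self.out(self.transformer(src, tgt))  # (batch, words, vocab_size)

# Toy usage: two clips of 60 frames, each frame holding 150 keypoint coordinates.
model = SignTranslationSketch()
poses = torch.randn(2, 60, 150)
tokens = torch.randint(0, 8000, (2, 10))
logits = model(poses, tokens)  # shape: (2, 10, 8000)

The structure is the point: once the video is reduced to a sequence of pose features, the translation step looks like ordinary machine translation, which is why researchers from both fields have so much to contribute.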

According to Hochgesang, a deaf linguist who focuses on signed languages, they were barely present in the literature, linguistics classes, and research she encountered while earning her degree. In her program, "language" meant speech alone.

“On a personal scale, this hurt. It completely ignored my way of being,” Hochgesang said. “When I was a student, I didn't see myself in the data being described and that made it really hard for me to connect. That it still hasn't improved much these days is unfortunate. The only way this kind of thing will change is if we are included more.”

Yin says the paper is a big step forward in motivating individuals and bringing communities together. 

“It's really exciting to see a paper that I wrote motivate people, and I hope it can make a change in these communities,” Yin said.


Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.