Brigham Young Researchers Develop Google Glass System To Assist Deaf Students

Researchers at Brigham Young University (BYU) have developed a system that projects sign language interpreters onto Google Glass and similar smart glasses.

The "Signglasses" project was developed to improve the planetarium experience for deaf students. Typically, when deaf students visit the planetarium, they can't see the sign language interpreter and the overhead projections at the same time because the lights have to be on to see the interpreter and off to see the projection. With Signglasses, deaf students can watch the planetarium projection at the same time as they watch the interpreter projected onto their glasses.

The research team has field tested the system with students from Jean Massieu School for the Deaf. The researchers were surprised to discover that students preferred the interpreter to be projected in the center of one lens, so they could look straight through the signer when focusing on the planetarium show. The team had assumed students would prefer to see the projection at the top of the lens, as Google Glass normally does.

The Signglasses project is led by Michael Jones, assistant professor of computer science at BYU, and several of the student researchers working with him are deaf. "Having a group of students who are fluent in sign language here at the university has been huge," said Jones in a prepared statement. "We got connected into that community of fluent sign language students and that opened a lot of doors for us."

The team is also working with researchers at Georgia Tech to explore the potential of Signglasses as a literacy tool. With the technology, when deaf students encounter new words in books, they could push a button, and a video dictionary would project a definition of the word in sign language.

The full results of the Signglasses research project will be published in June at the Interaction Design and Children conference.

Further information about the project can be viewed in a YouTube video.

About the Author

Leila Meyer is a technology writer based in British Columbia. She can be reached at leilameyer@gmail.com.
