Carnegie Mellon and IBM Introduce Open Platform To Help Blind Navigate

A research project at Carnegie Mellon University is helping blind people navigate with an iPhone app that directs them based on information picked up from nearby sensors. NavCog, which is expected to be made available in the iTunes Store soon, draws on Bluetooth beacons and cognitive technologies to inform users on campus about their surroundings by talking to them through the phone or by vibrating. The user enters a destination, turns on voice navigation, follows the directions and is notified upon arrival.

A separate function under development performs facial recognition to help the user identify who is approaching, whether friends or strangers, and what expressions they are showing. The researchers are also investigating the use of computer vision to characterize the activities of people in the vicinity, and of ultrasonic technology to identify locations more accurately both inside and outside buildings.

The app is built on the Human-Scale Localization Platform (HULOP), an open source initiative introduced jointly by Carnegie Mellon and IBM as an environment upon which to build applications specifically for people who are visually impaired.

"To gain further independence and help improve the quality of life, ubiquitous connectivity across indoor and outdoor environments is necessary," said IBM Fellow Chieko Asakawa, a visiting faculty member at Carnegie Mellon who is blind herself. "I'm excited that this open platform will help accelerate the advancement of cognitive assistance research by giving developers opportunities to build various accessibility applications and test non-traditional technologies such as ultrasonic and advanced inertial sensors to assist navigation."

Asakawa's research is supported with an award from IBM's Open Collaborative Research program.

A video demonstrating the technology is available on YouTube.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
