Amazon Alexa Fellowships to Fund AI Research

Amazon Echo Dot

Photo: Mahathir Mohd Yasin / Shutterstock.com

Amazon has chosen recipients from higher education institutions around the world to receive Alexa Fellowships. The goal: to support researchers dedicated to exploring speech and language technologies.

The program was launched last year, when four researchers won awards. This year's awards are going to two kinds of recipients: students and faculty.

Ten Ph.D. and post-doctoral students are receiving "graduate fellowships" to foster their study of "conversational AI." The fellowships include enough funding for tuition and a "competitive stipend," as well as mentoring from an Alexa scientist. The schools where the students are enrolled will also receive Alexa devices and access to the Alexa Skills Kit (ASK) and the Alexa Voice Service (AVS) to use in coursework.

Those who received the graduate fellowships, according to Amazon, were selected based on their research interests, planned coursework and existing conversational AI curriculum. The institutions include six in the United States, two in the United Kingdom, one in Canada and one in India.

Among those recognized: Jessica Van Brummelen, a student at MIT, who wants to develop conversational artificial intelligence tools that let anyone create their own intelligent systems; Hao Fang, at the University of Washington, whose interests include social chatbots and natural language processing; and James Thorne at the University of Cambridge, who's researching new ways in which AI can be used to verify the truthfulness of information.

The new Alexa innovation fellowships are going to faculty who serve as "expert resources" on voice interfaces at their campuses. Instructors will receive funding, Alexa devices, hardware kits and regular training, as well as introductions to successful Alexa Fund-backed entrepreneurs. All of the recipients work at American institutions. Among them are Alice Liu, director of the University of Southern California Startup Garage, and Alexander Fred-Ojala, research director of the Data Lab and a co-founder of the Blockchain Lab at the Center for Entrepreneurship & Technology in the University of California, Berkeley's College of Engineering.

Two universities, Carnegie Mellon and USC, have received both kinds of fellowships.

"Voice is the most natural, convenient interface and we believe it can change the way humans interact with technology," wrote Kevin Crews, a senior product manager for the Amazon Alexa Fellowship, in a blog post. That's why, he noted, "it is important for us to support the academic community that continues to tackle the hardest of challenges that can advance voice technology."

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
