Joint University Team To Lay Out Vision of New Internet

Four decades after it hosted the first message ever sent over the Internet in 1969, the University of California, Los Angeles will manage a joint institutional research project to lay out the framework for a new Internet. This time, however, the computer scientists and networking experts will be joined by theater arts and film specialists. The project is funded by a $7.9 million research grant from the National Science Foundation's Directorate for Computer and Information Science and Engineering.

Besides UCLA, the participating institutions include Colorado State University in Fort Collins; the University of Arizona in Tucson; the University of Illinois at Urbana-Champaign; UC Irvine; the University of Memphis in Tennessee; UC San Diego; Washington University in St. Louis, MO; and Yale University in New Haven, CT.

Van Jacobson of the Palo Alto Research Center, a Xerox company, will serve as the network architect for the research initiative. He'll be working with UCLA principal investigators Lixia Zhang, a professor of computer science; Deborah Estrin, director of the UCLA Center for Embedded Networked Sensing (CENS) and also a professor of computer science; and Jeff Burke, a professor at the UCLA School of Theater, Film and Television and executive director of the UCLA Center for Research in Engineering, Media and Performance (REMAP).

"Our vision is conceptually simple," said Zhang. "We plan to explore a new Internet architecture, Named Data Networking (NDN), in which we'll replace the 'where'--addresses and hosts--with 'what,' the content that users and applications care about. By naming data instead of locations, [which allows computers to find and communicate with one another], the new architecture transforms data into a first-class entity."

One area of exploration for the researchers will be Internet security. Current plans call for the data's name to provide the context for security: NDN will be able to tell whether the data on a Web page was actually produced and signed by the owner of the site, much as a bank transaction is verified.
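To make that concrete, here is a small sketch of data-centric verification (again, not the NDN project's code; the packet fields, the example names, and the use of the third-party `cryptography` package are assumptions): each data packet carries its name, its content, and the producer's signature over both, so a consumer can check who produced the data no matter which server or cache delivered it.

```python
# Illustrative sketch of signed, named data (not NDN project code).
# The producer signs each named data packet; the consumer verifies the signature
# against the producer's public key, regardless of where the packet was fetched.
# Requires the third-party 'cryptography' package (pip install cryptography).
from dataclasses import dataclass
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

@dataclass
class DataPacket:
    name: str         # hierarchical data name, e.g. "/examplebank/alice/balance"
    content: bytes
    signature: bytes  # producer's signature over the name plus the content

def sign_packet(name, content, producer_key: Ed25519PrivateKey) -> DataPacket:
    """The producer binds the name and content together with its signature."""
    return DataPacket(name, content, producer_key.sign(name.encode() + content))

def verify_packet(pkt: DataPacket, producer_pub: Ed25519PublicKey) -> bool:
    """The consumer checks that the named data really came from the claimed producer."""
    try:
        producer_pub.verify(pkt.signature, pkt.name.encode() + pkt.content)
        return True
    except InvalidSignature:
        return False

if __name__ == "__main__":
    bank_key = Ed25519PrivateKey.generate()
    pkt = sign_packet("/examplebank/alice/balance", b"1,234.56", bank_key)
    print(verify_packet(pkt, bank_key.public_key()))   # True: authentic data
    pkt.content = b"9,999,999.99"                      # tampered with in transit or in a cache
    print(verify_packet(pkt, bank_key.public_key()))   # False: signature no longer matches
```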

"Technical challenges will be addressed to validate NDN as a future Internet architecture," Zhang added. "Routing scalability, fast forwarding, trust models, network security, content protection and privacy, and fundamental communication theory will all need to be considered."

REMAP's role is of particular interest. "I am excited that REMAP's participation will bring storytellers into a significant and productive collaboration with engineers and computer scientists," said Teri Schwartz, dean of the UCLA School of Theater, Film and Television. "The NDN project exemplifies the extraordinary potential for interdisciplinary research at UCLA and demonstrates the proactive role that great film and theater institutions should play in developing fundamental next-generation technology."

REMAP's Burke will be leading work on developing and deploying prototype applications to test out the architecture. "REMAP has been exploring the use of named data at the application networking level since 2002, when we realized it would help us organize and develop distributed applications that incorporated sensors, media, and automation of the physical environment," he said. "The NDN project is very exciting, as it makes a more sophisticated and comprehensive version of this idea fundamental to the network itself."

The application prototypes to test NDN are expected to focus on three areas: streaming content distribution; media-rich, instrumented environments (such as "smart buildings"); and participatory sensing on mobile phones.

"REMAP's research integrates cultural, social and engineering objectives; we are interested in applications that are both expressive and functional," Burke said. "This is crucial to understanding future network applications, as the Internet has become integral to not just commerce but our social and creative lives."

The research project will also include education and outreach components. Curriculum will be developed; graduate students will be involved in core research and thesis work; and a summer internship program for traditionally underrepresented undergraduate students will be created.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
