MIT Researchers Aim to Bring Neural Nets to Smartphones

Neural networks have been behind many advancements in AI in recent years, underpinning systems designed to recognize speech or individual faces, among other tasks. Neural nets are also large and power-hungry, however, making them poor candidates to run on personal devices such as smartphones and forcing apps that rely on them to upload data to a server for processing. But researchers at MIT are working on a new kind of computer chip that might change that.

The new chip improves the speed of neural-network computation by three to seven times and reduces energy consumption by 94 to 95 percent, according to the research team.

Processors generally keep memory in one part of the chip and processing in another, and as computations occur, data is shuttled back and forth between the two.

"Since these machine learning algorithms need so many computations, this transferring back and forth of data is the dominant portion of the energy consumption," said Avishek Biswas, an MIT graduate student in electrical engineering and computer science and leader of the team developing the chip, in a prepared statement. "But the computation these algorithms do can be simplified to one specific operation, called the dot product. Our approach was, can we implement this dot-product functionality inside the memory so that you don't need to transfer this data back and forth?"

Most neural nets are arranged in layers of nodes, with nodes in one layer accepting input from the nodes in the layer below and passing their output up to the nodes in the layer above. Each connection between nodes also has a weight assigned to it, which determines how important the data traveling over that connection will be in the next round of computations.

"A node receiving data from multiple nodes in the layer below will multiply each input by the weight of the corresponding connection and sum the results," according to an MIT news release. "That operation — the summation of multiplications — is the definition of a dot product. If the dot product exceeds some threshold value, the node will transmit it to nodes in the next layer, over connections with their own weights."

In the new chip, input values are converted into voltages, multiplied by the corresponding weights, and added together, and only then is the result converted back into digital data for storage and further processing. This allows the chip to compute the dot products for multiple nodes (16 at a time in the prototype) in a single step, eliminating the need to shovel data back and forth repeatedly.
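
In conventional digital terms, computing the dot products for many nodes at once amounts to a single matrix-vector multiplication rather than a per-node loop. The Python sketch below illustrates that batching idea; it is only a digital analogy for what the chip does in the analog voltage domain, and the input and weight values are made up.

```python
import numpy as np

num_nodes = 16     # the prototype handles 16 nodes at a time
num_inputs = 8     # hypothetical number of inputs feeding each node

inputs = np.random.rand(num_inputs)              # one vector of input activations
weights = np.random.rand(num_nodes, num_inputs)  # one row of weights per node

# One matrix-vector product yields all 16 dot products in a single step,
# analogous to the chip summing weighted voltages for all nodes at once.
dot_products = weights @ inputs                  # shape: (16,)
```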

"This is a promising real-world demonstration of SRAM-based in-memory analog computing for deep-learning applications," said Dario Gil, vice president of artificial intelligence at IBM, according to information released by MIT. "The results show impressive specifications for the energy-efficient implementation of convolution operations with memory arrays. It certainly will open the possibility to employ more complex convolutional neural networks for image and video classifications in IoT [the internet of things] in the future."

About the Author

Joshua Bolkan is contributing editor for Campus Technology, THE Journal and STEAM Universe. He can be reached at [email protected].
