New Algorithm from MIT, Yale, and USC Tackles Max Flow Challenge

Researchers from MIT, Yale University, and the University of Southern California have developed what they describe as the fastest known algorithm for solving the "maximum flow" problem.

The max flow problem, as it's called, asks for the maximum rate at which something can be moved from a given starting point to a given end point without exceeding the network's capacity. That something could be trucks on roads, water in pipes, or packets on a cable. The challenge first surfaced in 1954 in an attempt to model the behavior of Soviet railway traffic, and the first algorithm to solve it came the following year. Since then, numerous algorithms have emerged to compute max flow more efficiently.

Typically, the network is represented as a graph: a series of nodes, or "vertices," connected by lines called "edges." Each edge has a given maximum capacity, and the algorithms set out to route as much flow as possible through the graph without exceeding any edge's constraint, as in the sketch below.
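To ground the terminology, here is a minimal sketch in Python of a capacitated graph and of the classical augmenting-path method (Edmonds-Karp) that newer algorithms improve on. The node names and capacities are invented for illustration; this is the textbook baseline, not the researchers' technique.

    from collections import defaultdict, deque

    def max_flow(capacity, source, sink):
        # Classical Edmonds-Karp: repeatedly find a shortest augmenting
        # path with breadth-first search and push flow along it.
        residual = defaultdict(lambda: defaultdict(int))
        for (u, v), cap in capacity.items():
            residual[u][v] += cap

        total = 0
        while True:
            # BFS from source toward sink over edges with leftover capacity
            parent = {source: None}
            queue = deque([source])
            while queue and sink not in parent:
                u = queue.popleft()
                for v, cap in residual[u].items():
                    if cap > 0 and v not in parent:
                        parent[v] = u
                        queue.append(v)
            if sink not in parent:
                return total  # no augmenting path left; the flow is maximal

            # Walk back from the sink to recover the path, find its
            # bottleneck capacity, and push that much flow along it
            path, v = [], sink
            while parent[v] is not None:
                path.append((parent[v], v))
                v = parent[v]
            push = min(residual[u][v] for u, v in path)
            for u, v in path:
                residual[u][v] -= push
                residual[v][u] += push  # reverse edge lets flow be undone later
            total += push

    # Hypothetical capacities: say, trucks per hour between four depots
    capacity = {("A", "B"): 3, ("A", "C"): 2, ("B", "C"): 1,
                ("B", "D"): 2, ("C", "D"): 3}
    print(max_flow(capacity, "A", "D"))  # prints 5

Each pass of the search finds a single path and pushes flow along it; that one-path-at-a-time behavior is exactly what the researchers' approach, described below, sets out to avoid.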

As the network grows, however, the use of traditional algorithms may become too time-consuming to be practical. "There has recently been an explosion in the sizes of graphs being studied," explained Jonathan Kelner, an associate professor of applied mathematics at MIT and a member of MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL). "For example, if you wanted to route traffic on the Internet, study all the connections on Facebook, or analyze genomic data, you could easily end up with graphs with millions, billions or even trillions of edges."

The algorithm developed by Kelner and his four colleagues instead solves the max-flow problem by analyzing all potential paths at the same time. According to university coverage of the technique, the researchers viewed the graph as a collection of electrical resistors, then imagined connecting a battery to node A and a ground to node B and allowing the current to flow through the network. "Electrical current doesn't pick just one path, it will send a little bit of current over every resistor on the network," Kelner said. "So it probes the whole graph globally, studying many paths at the same time."
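A rough sense of the electrical analogy can be had in a few lines of Python. The sketch below uses invented numbers and simply reuses each edge's capacity as its conductance (an assumption made for illustration, not the weighting scheme the paper uses): it builds the graph's Laplacian matrix, injects a unit of current at node A, draws it out at node D, solves for the node voltages, and reads off the current on every edge at once via Ohm's law.

    import numpy as np

    nodes = ["A", "B", "C", "D"]
    idx = {n: i for i, n in enumerate(nodes)}
    # (u, v, conductance); conductance equals capacity here, purely for illustration
    edges = [("A", "B", 3.0), ("A", "C", 2.0), ("B", "C", 1.0),
             ("B", "D", 2.0), ("C", "D", 3.0)]

    # Graph Laplacian: weighted degrees on the diagonal, minus the adjacency
    L = np.zeros((len(nodes), len(nodes)))
    for u, v, c in edges:
        i, j = idx[u], idx[v]
        L[i, i] += c
        L[j, j] += c
        L[i, j] -= c
        L[j, i] -= c

    # Battery at A, ground at D: inject one unit of current, extract one unit
    b = np.zeros(len(nodes))
    b[idx["A"]], b[idx["D"]] = 1.0, -1.0

    # L is singular (shifting all voltages equally changes nothing), so use
    # the pseudo-inverse; specialized Laplacian solvers do this step quickly
    volts = np.linalg.pinv(L) @ b

    # Ohm's law: current = conductance * voltage difference. Every edge
    # carries some current, so the whole graph is probed at once.
    for u, v, c in edges:
        print(f"{u}->{v}: {c * (volts[idx[u]] - volts[idx[v]]):+.3f}")

The punchline is the final loop: nonzero current shows up on every edge, matching Kelner's description of probing the whole graph globally rather than one path at a time.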

This approach alone introduced real efficiencies, but the team added a further twist: the algorithm also identifies the routes in the network that act as bottlenecks, as sketched below. "Our algorithm figures out which parts of the graph can easily route what they need to and which parts are the bottlenecks. This allows you to focus on the problem areas and the high-level structure instead of spending a lot of time making unimportant decisions, which means you can use your time a lot more efficiently," Kelner noted.
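Continuing the resistor sketch above (it reuses the edges, idx, and volts variables defined there), one illustrative way to surface bottlenecks is to check what fraction of each edge's capacity its electrical current consumes. This congestion ratio is a heuristic stand-in of our own, not the paper's precise criterion.

    # Reuses edges, idx, and volts from the resistor sketch above.
    # Flag edges whose current eats a large share of their capacity;
    # an illustrative heuristic, not the paper's actual bottleneck test.
    congestion = []
    for u, v, cap in edges:
        current = abs(cap * (volts[idx[u]] - volts[idx[v]]))
        congestion.append((current / cap, u, v))
    for ratio, u, v in sorted(congestion, reverse=True):
        print(f"{u}->{v}: using {ratio:.2f} of capacity")

Edges near the top of this list are the candidates to concentrate on; the rest of the graph can be treated more coarsely.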

As the number of nodes increases, so does the running time required to solve the problem. According to the paper, however, the computation scales "near-linearly" with the size of the graph: roughly speaking, doubling the number of edges only about doubles the work, rather than multiplying it many times over.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
