Urban congestion is a pressing challenge, driving up emissions and compromising transport efficiency. Advances in big-data collection and processing now enable adaptive traffic signals, offering a promising strategy for congestion mitigation. In our study of China’s 100 most congested cities, big-data-empowered adaptive traffic signals reduced peak-hour trip times by 11% and off-peak trip times by 8%, yielding an estimated annual CO₂ reduction of 31.73 million tonnes (Mt). Despite an annual implementation cost of US$1.48 billion, the societal benefits, including CO₂ reduction, time savings, and fuel efficiency, amount to US$31.82 billion per year. Widespread adoption will require enhanced data collection and processing systems, underscoring the need for policy and technological development. Our findings highlight the transformative potential of big-data-driven adaptive systems to alleviate congestion and promote urban sustainability.
I’m extremely sceptical about local data being enough to properly guide traffic…
The problem is that intersections are connected.
One intersection influences the others down the line, whether that is by holding back too much traffic, thereby unnecessarily restricting flow, or by letting too much traffic through, thus creating blockages downstream.
You need a big-picture approach, and you need historical data to estimate flow on any given day.
Neither can be done with local data.
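To make the coupling concrete, here's a deliberately crude toy simulation (my own sketch, nothing from the paper, all numbers made up): a short chain of intersections where every signal greedily serves whichever of its own queues is longer. The point is only that a locally sensible green phase can be wasted when the downstream block is already full, which no single intersection can see on its own.

```python
import random

random.seed(0)

N = 3                  # intersections along one corridor
CAPACITY = 10          # cars a block between two intersections can store
SATURATION = 4         # cars released per green step

main_q = [0] * N       # main-road queue waiting at each intersection
side_q = [0] * N       # cross-street queue at each intersection
wasted_greens = 0

for t in range(300):
    main_q[0] += random.choice([1, 2, 3])      # heavy corridor demand entering upstream
    side_q[N - 1] += random.choice([2, 3])     # busy cross street at the last light
    for i in range(N - 1):
        side_q[i] += random.choice([0, 1])

    for i in range(N):
        if main_q[i] >= side_q[i]:             # purely local rule: serve my longer queue
            room = CAPACITY - main_q[i + 1] if i + 1 < N else SATURATION
            moved = min(SATURATION, main_q[i], max(room, 0))
            if moved == 0 and main_q[i] > 0:
                wasted_greens += 1             # green given, but the next block is full
            main_q[i] -= moved
            if i + 1 < N:
                main_q[i + 1] += moved
        else:
            side_q[i] -= min(SATURATION, side_q[i])

print("main-road queues after 300 steps:", main_q)
print("greens wasted by downstream spillback:", wasted_greens)
```

A coordinated controller could meter the upstream lights instead of letting them dump cars into a full block, but that decision needs information from more than one node.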
Could you (slightly) improve traffic by using local traffic flow to time signals? Probably, sure.
But in large systems, at metropolitan scale, that will inevitably lead to unforeseen consequences that will probably prove impossible to solve with local fixes, or will need to be handled by hard-coded rules (think something like “on Friday this light needs to be green for 30 sec and red for 15 sec, from 8-17h, except on holidays”), which just introduces insane amounts of maintenance…
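By hard-coded rules I mean something like the following (purely hypothetical names and numbers), where every holiday, roadworks project, and seasonal quirk becomes another entry somebody has to remember to update by hand:

```python
from datetime import datetime

HOLIDAYS = {"2024-12-25", "2025-01-01"}   # and every year, someone must extend this list

OVERRIDES = [
    # (weekday, start_hour, end_hour, green_s, red_s)
    ("friday", 8, 17, 30, 15),
    ("monday", 7, 9, 25, 20),
    # ...one more line for every intersection, season, event, roadwork site, ...
]

def signal_timing(now: datetime):
    """Return (green_s, red_s) for one specific light, rule by rule."""
    if now.strftime("%Y-%m-%d") in HOLIDAYS:
        return (20, 20)                    # fall back to some default
    day = now.strftime("%A").lower()
    for weekday, start, end, green, red in OVERRIDES:
        if day == weekday and start <= now.hour < end:
            return (green, red)
    return (20, 20)

print(signal_timing(datetime(2025, 1, 3, 9, 0)))   # a Friday morning -> (30, 15)
```

Multiply that by thousands of intersections and you can see why nobody wants to maintain it.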
Source: I used to do analysis on factory shop-floor planning, which involves simulating mathematically identical problems.
Things like assembly of parts that depend on other parts, all of which have different assembly speeds and locations, and thus travel times, throughout the process. It gets incredibly complex, incredibly quickly, but it’s a lot of fun to solve, despite being math-heavy! One exercise we did at uni was re-creating my professor’s master’s thesis, which was about finding the optimal locations for snow-plow depots containing road salt for an entire province. So, yeah, traffic analysis is largely the same thing math-wise, with a bit of added complexity due to human behavior.
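For anyone curious what I mean by “the same thing math-wise”: the depot problem is a facility-location (p-median style) optimization. A trivially small, made-up version looks roughly like this brute-force sketch; the real thesis obviously used proper data and solvers.

```python
from itertools import combinations

# made-up towns (coordinates) and demand weights, e.g. km of road to salt
towns = {"A": (0, 0), "B": (4, 0), "C": (4, 3), "D": (9, 1), "E": (10, 5)}
demand = {"A": 5, "B": 2, "C": 7, "D": 3, "E": 4}
p = 2                                  # number of depots to place

def dist(a, b):
    (x1, y1), (x2, y2) = towns[a], towns[b]
    return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

def total_cost(depots):
    # demand-weighted distance from every town to its nearest depot
    return sum(demand[t] * min(dist(t, d) for d in depots) for t in towns)

best = min(combinations(list(towns), p), key=total_cost)
print("best depot pair:", best, "cost:", round(total_cost(best), 1))
```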
I can say, with certainty, that data on just the local situation at any given node is not sufficient to optimize the entire system.
You are right about real-time data being important to account for things like construction. That is a real problem, but it has little to do with the local-data approach you suggested, and it can’t be solved by that approach either… it’s actually (probably) easier to solve with the big-data approach!