• Artyom@lemm.ee
    9 days ago

    I mean, this is also an area where neural networks will improve things. Neural networks are excellent at optimizing problems with an extremely large number of input variables, as is the case here. You don’t need language models, and you don’t need to steal all the content on the internet for training. You have analysis tools that will easily validate any solution, so you’re not going to deal with mystery hallucinations.

    • barsoap@lemm.ee
      9 days ago

      It’s not an extremely large amount of data at all: you can get perfect efficiency by having lights act on completely local, real-time sensor data, as in “how many cars are in which direction”. AI is useful for recognising who wants to use the light, but that’s the end of it. You don’t need to predict traffic patterns, because you don’t need predictions to see the state of the streets right now; worse, such predictions are a source of BS. Patterns with no precedent happen all the time as construction sites shift, sportsball games get cancelled or not, whatnot.
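
      A minimal sketch of what that kind of purely local, queue-actuated controller could look like (all names and thresholds here are invented for illustration, not taken from any real deployment):

      ```python
      # Hypothetical sketch of a purely local, queue-actuated signal controller:
      # the only inputs are the current sensor counts per approach -- no predictions,
      # no historical data, no knowledge of neighbouring intersections.

      from dataclasses import dataclass

      @dataclass
      class Phase:
          name: str          # e.g. "north-south" or "east-west"
          min_green_s: int   # minimum green time, so the light doesn't flip-flop
          max_green_s: int   # cap so a busy direction can't starve the cross street

      def pick_phase(queues: dict[str, int], current: Phase, elapsed_s: int,
                     phases: dict[str, Phase]) -> Phase:
          """Give green to whichever approach has the longest queue right now,
          respecting the current phase's minimum and maximum green times."""
          if elapsed_s < current.min_green_s:
              return current
          busiest = max(queues, key=queues.get)
          if busiest == current.name and elapsed_s >= current.max_green_s:
              # max green reached: rotate to the busiest of the *other* approaches
              others = {k: v for k, v in queues.items() if k != current.name}
              busiest = max(others, key=others.get)
          return phases[busiest]

      phases = {
          "north-south": Phase("north-south", min_green_s=10, max_green_s=60),
          "east-west": Phase("east-west", min_green_s=10, max_green_s=60),
      }
      # 7 cars queued north-south vs 2 east-west, east-west green for 15 s so far:
      print(pick_phase({"north-south": 7, "east-west": 2},
                       phases["east-west"], 15, phases).name)  # -> north-south
      ```

      The min/max green bounds are just the standard trick from actuated signal control to stop one direction from starving the other; everything else is “look at the loop detectors right now”.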

      • 9bananas@lemmy.world
        9 days ago

        I’m extremely sceptical about local data being enough to properly guide traffic…

        the problem is that intersections are connected.

        one intersection influences others down the line, whether by holding back too much traffic, thereby unnecessarily restricting flow, or by letting too much traffic through, thus creating blockages downstream.

        you need a big-picture approach, and you need historical data to estimate flow on any given day.

        neither can be done with local data.

        could you (slightly) improve traffic by using local traffic flow to determine signals? probably, sure.

        but in large systems, on metropolitan scales, that will inevitably lead to unforeseen consequences that will probably prove impossible to solve with local solutions, or will need to be handled by hard-coded rules (think something like “on friday this light needs to be green for 30 sec and red for 15 sec, from 8-17h, except on holidays”), which just introduces insane amounts of maintenance…
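
        just to make the maintenance point concrete: a hand-maintained override table ends up looking something like the sketch below (the rule format, numbers and street names are entirely made up):

        ```python
        # Hypothetical sketch of hand-maintained override rules for one intersection;
        # every entry is something a human has to remember to update whenever a
        # construction site moves or an event schedule changes.

        from datetime import datetime

        OVERRIDES = [
            # (weekday, start_hour, end_hour, green_s, red_s, skip_on_holidays)
            ("friday", 8, 17, 30, 15, True),
            ("saturday", 14, 18, 45, 20, True),   # matchday crowds near the stadium
            ("monday", 6, 9, 25, 25, False),      # detour while Elm St is dug up
        ]

        HOLIDAYS = {"2024-12-25", "2024-12-26"}   # also maintained by hand

        def active_override(now: datetime):
            """Return the first matching (green_s, red_s) override, or None to fall
            back to the default signal plan."""
            weekday = now.strftime("%A").lower()
            is_holiday = now.strftime("%Y-%m-%d") in HOLIDAYS
            for day, start_h, end_h, green_s, red_s, skip_on_holidays in OVERRIDES:
                if day == weekday and start_h <= now.hour < end_h:
                    if skip_on_holidays and is_holiday:
                        continue
                    return green_s, red_s
            return None

        print(active_override(datetime(2024, 12, 20, 9, 30)))  # a Friday -> (30, 15)
        ```

        multiply that by a few hundred intersections and a constantly shifting set of construction sites and events, and the maintenance load becomes exactly the nightmare described above.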

        source: I used to do analysis for factory shop-floor planning, which involves simulating mathematically identical problems.

        things like assembly of parts that are dependent on other parts, all of which have different assembly speeds and locations, and thus travel times, throughout the process. it gets incredibly complex, incredibly quickly, but it’s a lot of fun to solve, despite being math-heavy! one exercise we did at uni was re-creating my professor’s master’s thesis, which was about finding the optimal locations for snow plow depots containing road salt for an entire province. so, yeah, traffic analysis is largely the same thing math-wise, with a bit of added complexity due to human behavior.
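
        for a feel of the math: the depot problem is essentially facility location (p-median style). a toy sketch, with made-up coordinates and a plain greedy heuristic rather than whatever method the actual thesis used:

        ```python
        # Toy facility-location (p-median-style) sketch: pick k depot sites so the
        # total distance from every road segment to its nearest depot is minimised.
        # Coordinates, candidate sites and the greedy heuristic are illustrative only.

        from itertools import product
        from math import dist

        road_segments = [(x, y) for x, y in product(range(0, 100, 10), repeat=2)]
        candidate_sites = [(5, 5), (25, 75), (50, 50), (80, 20), (90, 90), (10, 60)]
        k = 3

        def total_cost(chosen):
            """Sum over all segments of the distance to the nearest chosen depot."""
            return sum(min(dist(seg, depot) for depot in chosen) for seg in road_segments)

        # Greedy: repeatedly add the candidate that reduces the total cost the most.
        chosen = []
        while len(chosen) < k:
            best = min((c for c in candidate_sites if c not in chosen),
                       key=lambda c: total_cost(chosen + [c]))
            chosen.append(best)

        print("depots:", chosen, "cost:", round(total_cost(chosen), 1))
        ```

        greedy isn’t optimal for p-median in general, but it shows the structure: the objective couples every demand point to every candidate site, which is the same “everything depends on everything” shape a traffic network has.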

        I can say, with certainty, that data on just the local situation at any given node is not sufficient to optimize the entire system.

        you are right about real-time data being important to account for things like construction. that is actually a problem, but it has little to do with the local-data approach you suggested and can’t be solved by it either… it’s actually (probably) easier to solve with the big-data approach!
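
        a rough sketch of what “easier with the big-data approach” means here: compare live counts per road segment against the historical baseline for the same weekday and hour, and flag segments that deviate hard (all numbers and thresholds below are invented):

        ```python
        # Sketch: flag road segments whose live flow deviates sharply from the
        # historical baseline for the same weekday/hour -- e.g. a fresh construction
        # site or a cancelled event. Data and thresholds are invented for illustration.

        from statistics import mean, stdev

        # historical vehicle counts per segment for "Friday, 08:00-09:00"
        historical = {
            "elm_st_north": [410, 395, 430, 402, 418, 407],
            "elm_st_south": [380, 372, 390, 401, 385, 377],
            "stadium_ring": [120, 115, 980, 130, 118, 125],  # one matchday outlier
        }

        live = {"elm_st_north": 95, "elm_st_south": 388, "stadium_ring": 122}

        def anomalies(live_counts, history, z_threshold=3.0):
            """Return segments whose live count sits more than z_threshold standard
            deviations away from their historical mean for this time slot."""
            flagged = {}
            for seg, counts in history.items():
                mu, sigma = mean(counts), stdev(counts)
                z = (live_counts[seg] - mu) / sigma if sigma else 0.0
                if abs(z) > z_threshold:
                    flagged[seg] = round(z, 1)
            return flagged

        print(anomalies(live, historical))  # elm_st_north flagged -> likely road works
        ```

        the point being: a local controller at that corner only sees “fewer cars than usual”, while the network-level view is what suggests a whole corridor may need re-timing.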