• ☆ Yσɠƚԋσʂ ☆@lemmygrad.mlOP
    5 days ago

    the key bit

    This represents a potentially significant shift in AI deployment. While traditional AI infrastructure typically relies on multiple Nvidia GPUs consuming several kilowatts of power, the Mac Studio draws less than 200 watts during inference. This efficiency gap suggests the AI industry may need to rethink assumptions about infrastructure requirements for top-tier model performance.

    • loathsome dongeater@lemmygrad.ml
      4 days ago

      The takeaway from all this is that Western tech companies have gone full parasite and are inflating costs at every turn to funnel more and more VC and public money into their incestuous cesspool. Judging from what DeepSeek first demonstrated, it seems possible that OpenAI could cut its costs by a significant factor (by half, or maybe a lot more). That could let them turn a profit for the first time in their existence, but the thought does not even seem to cross their minds.

      • Philo_and_sophy@lemmygrad.ml
        4 days ago

        More than turning a profit, they could bend the arc of AI’s climate crisis by competing on efficiency.

        32 GPUs -> 1 GPU loosely translates to a 10x reduction in the water and energy needed for inference, on top of the 100x reduction in training requirements.

        And these are just the improvements within 4 months…
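        The rough order-of-magnitude claim above can be sanity-checked with a quick sketch. All wattage figures here are assumptions read off the thread (the article's "several kilowatts" for a multi-GPU cluster and "<200 W" for the Mac Studio), not measurements:

        ```python
        # Back-of-envelope check of the power claims in this thread.
        # Both numbers are assumptions taken from the comments, not benchmarks.
        CLUSTER_WATTS = 2500      # "several kilowatts" for a multi-GPU setup
        MAC_STUDIO_WATTS = 200    # "less than 200 watts during inference"

        reduction = CLUSTER_WATTS / MAC_STUDIO_WATTS

        print(f"cluster: {CLUSTER_WATTS} W, Mac Studio: {MAC_STUDIO_WATTS} W")
        print(f"~{reduction:.1f}x less power for the same inference workload")
        ```

        With those assumed figures the ratio comes out around 12x, which is consistent with the ~10x order of magnitude cited above; the exact factor obviously depends on which GPUs and how many you assume on the cluster side.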