DeepSeek's free 685B-parameter AI model runs at 20 tokens/second on Apple's Mac Studio, outperforming Claude Sonnet while using just 200 watts, challenging OpenAI's cloud-dependent business model.
The takeaway from all this is that Western tech companies have gone full parasite and are inflating costs at all costs to funnel more and more VC and public money into their incestuous cesspool. From what DeepSeek first demonstrated, it seems possible that OpenAI could cut their costs by a significant factor (by half, or maybe a lot more). That could let them turn a profit for the first time in their existence, but the thought does not even cross their minds.
More than turning a profit, they could bend the arc of AI's climate crisis by competing on efficiency.
32 GPUs -> 1 GPU loosely translates to a ~10x reduction in the water and energy needed for inference, on top of the ~100x reduction in training requirements.
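The rough arithmetic behind that efficiency claim can be sketched as a back-of-envelope calculation. The only measured inputs are the headline's figures (20 tokens/second at 200 watts); the multi-GPU baseline is simply the commenter's claimed ~10x factor applied to that number, not a measurement:

```python
# Back-of-envelope energy-per-token estimate from the headline figures.
# Inputs: 20 tokens/s at 200 W on a Mac Studio (from the article summary).
mac_watts = 200.0
tokens_per_sec = 20.0

# 200 W / 20 tok/s = 10 joules per generated token
energy_per_token_j = mac_watts / tokens_per_sec
print(f"Mac Studio inference: {energy_per_token_j:.0f} J/token")

# Applying the comment's claimed ~10x inference reduction to get the
# implied multi-GPU baseline (an assumption, not a measured figure):
cluster_energy_per_token_j = energy_per_token_j * 10
print(f"Implied 32-GPU baseline: {cluster_energy_per_token_j:.0f} J/token")
```

At 10 J per token, generating a 1,000-token response costs about 10 kJ, roughly the energy of running a 100 W bulb for 100 seconds.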
And these are just the improvements within 4 months…
And that’s the problem with profit motive in a nutshell.