• GreatAlbatross@feddit.uk · 10 days ago

        Or from the sounds of it, doing things more efficiently.
        Fewer cycles required, less hardware required.

Maybe this was an inevitability: if you cut off access to the fast hardware, you create a natural advantage for more efficient systems.

        • sugar_in_your_tea@sh.itjust.works · 10 days ago

          That’s generally how tech goes though. You throw hardware at the problem until it works, and then you optimize it to run on laptops and eventually phones. Usually hardware improvements and software optimizations meet somewhere in the middle.

          Look at photo and video editing, you used to need a workstation for that, and now you can get most of it on your phone. Surely AI is destined to follow the same path, with local models getting more and more robust until eventually the beefy cloud services are no longer required.

          • jmcs@discuss.tchncs.de · 10 days ago

            The problem for American tech companies is that they didn’t even try to move to stage 2.

            OpenAI is hemorrhaging money even on their most expensive subscription, and their entire business plan was to hemorrhage money even faster, to the point where they would use entire power stations to power their data centers. Their plan makes about as much sense as digging yourself out of a hole by trying to dig through to the other side of the globe.

            • sugar_in_your_tea@sh.itjust.works · 10 days ago

              Hey, my friends and I would’ve made it to China if recess was a bit longer.

              Seriously though, the goal for something like OpenAI shouldn’t be to sell products to end customers, but to license models to companies that sell “solutions.” I see these direct to consumer devices similarly to how GPU manufacturers see reference cards or how Valve sees the Steam Deck: they’re a proof of concept for others to follow.

              OpenAI should be looking to be more like ARM and less like Apple. If they do that, they might just grow into their valuation.

      • theunknownmuncher@lemmy.world · 9 days ago

        China really has nothing to do with it, it could have been anyone. It’s a reaction to realizing that GPT4-equivalent AI models are dramatically cheaper to train than previously thought.

        It being China is a notable detail because it really drives the nail into the coffin for NVIDIA, since China has been fenced off from access to NVIDIA’s most expensive AI GPUs, which were thought to be required to pull this off.

        It also makes the US government look extremely foolish for making major foreign policy and relationship sacrifices to try to delay China by a few years. It’s January and China has already caught up: those sacrifices did not pay off. In fact, they backfired, benefiting China and letting them accelerate while hurting US tech/AI companies.

      • golli@lemm.ee · 10 days ago

        It’s a reaction to thinking China has better AI

        I don’t think this is the primary reason behind Nvidia’s drop. As long as they have a massive technological lead, it doesn’t matter as much to them who has the best model, as long as those companies use Nvidia GPUs to train them.

        The real change is that the compute resources (which are Nvidia’s product) needed to create a great model suddenly fell off a cliff, whereas until now the name of the game was that more is better and scale is everything.
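        To put rough numbers on why that matters, a common rule of thumb from the scaling-law literature is that training compute ≈ 6 × parameters × tokens. A quick sketch, with every figure illustrative (none of these are any lab’s actual numbers):

```python
# Rule of thumb from the scaling-law literature: training compute in FLOPs
# is roughly 6 * (parameter count) * (training tokens). All figures below
# are illustrative, not any lab's actual numbers.

def training_flops(params: float, tokens: float) -> float:
    """Approximate total training compute in FLOPs."""
    return 6 * params * tokens

def gpu_hours(flops: float, gpu_flops_per_sec: float, utilization: float = 0.4) -> float:
    """Convert a FLOP budget into GPU-hours at a given sustained utilization."""
    return flops / (gpu_flops_per_sec * utilization) / 3600

# Hypothetical 70B-parameter model trained on 10T tokens,
# on accelerators with ~1 PFLOP/s peak each:
budget = training_flops(70e9, 10e12)
print(f"{budget:.1e} FLOPs, ~{gpu_hours(budget, 1e15) / 1e6:.1f}M GPU-hours")
```

        The comment’s point lives in this formula’s inputs: if algorithmic tricks shrink the effective compute budget, the GPU bill, Nvidia’s product, shrinks with it.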

        China vs the West (or upstart vs big players) matters to those who are investing in creating those models. So for example Meta, who presumably spends a ton of money on high paying engineers and data centers, and somehow got upstaged by someone else with a fraction of their resources.

          • golli@lemm.ee · 9 days ago

            Looking at the market cap of Nvidia vs their competitors, the market believes it is: they just lost more than AMD, Intel, and the likes are worth combined, and are still valued at around $2.9 trillion.

            And by technology I mean both the performance of their hardware and the software stack they’ve created, which is a big part of their dominance.

            • mapumbaa@lemmy.zip · 9 days ago

              Yeah. I don’t believe market value is a great indicator in this case. In general, I would say that capital markets are rational at a macro level, but not micro. This is all speculation/gambling.

              My guess is that AMD and Intel are at most 1 year behind Nvidia when it comes to tech stack. “China”, maybe 2 years, probably less.

              However, if you can make chips with 80% of the performance at 10% of the price, it’s a win. People can keep telling themselves that big tech will always buy the latest and greatest whatever the cost. That does not make it true; it hasn’t been true for a really long time. Google, Meta and Amazon already make their own chips. That’s probably true for DeepSeek as well.

              • golli@lemm.ee · 8 days ago

                Yeah. I don’t believe market value is a great indicator in this case. In general, I would say that capital markets are rational at a macro level, but not micro. This is all speculation/gambling.

                I have to concede that point to some degree, since I guess I hold similar views on Tesla’s value vs the rest of the automotive industry. But I still think the basic hierarchy holds true, with Nvidia significantly ahead of the pack.

                My guess is that AMD and Intel are at most 1 year behind Nvidia when it comes to tech stack. “China”, maybe 2 years, probably less.

                Imo you are too optimistic with those estimates, particularly regarding Intel and China, although I am not an expert in the field.

                As I see it, AMD seems to have quite a decent product with their Instinct cards in the server market on the hardware side, but they wish they had something even close to CUDA and its mindshare, which would take years to replicate. Intel wishes they were only a year behind Nvidia. And I’d like to comment on China, but tbh I have little to no knowledge of the state of their GPU development. If they are “2 years, probably less” behind as you say, then they should have something like the RTX 4090, which was released at the end of 2022. But do they have anything that even rivals the 2000 or 3000 series cards?

                However, if you can make chips with 80% performance at 10% price, its a win. People can continue to tell themselves that big tech always will buy the latest and greatest whatever the cost. It does not make it true.

                But the issue is they all make their chips at the same manufacturer, TSMC, even Intel in the case of their GPUs. So they can’t really differentiate much on manufacturing costs and are also competing for the same limited supply. So no one can offer 80% of the performance at 10% of the price, or even close to it. Additionally, everything around the GPU (datacenters, rack space, power usage during operation, etc.) also costs money, so the GPU is only part of the overall package cost, and you also want to optimize for your limited space. As I understand it, datacenter construction and power delivery are actually another limiting factor for the hyperscalers right now.
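                To illustrate why a cheap chip doesn’t translate into a cheap deployment, here’s a toy total-cost-of-ownership sketch. Every number in it is made up for illustration; real power draw, electricity prices, and fixed overheads vary widely:

```python
# Toy total-cost-of-ownership model for one accelerator slot. All numbers are
# invented for illustration: the fixed overhead stands in for rack space,
# networking, and amortized datacenter build-out.

def slot_tco(gpu_price: float, power_kw: float, years: float = 4,
             usd_per_kwh: float = 0.10, fixed_overhead: float = 30_000) -> float:
    """Lifetime cost of one GPU slot: hardware + energy + fixed overhead."""
    energy_cost = power_kw * 24 * 365 * years * usd_per_kwh
    return gpu_price + energy_cost + fixed_overhead

incumbent = slot_tco(gpu_price=30_000, power_kw=0.7)   # flagship card
challenger = slot_tco(gpu_price=3_000, power_kw=0.7)   # "10% price" card
print(f"incumbent ${incumbent:,.0f}, challenger ${challenger:,.0f} "
      f"({challenger / incumbent:.0%} of the cost)")
```

                Under these made-up numbers, a 10× cheaper GPU only roughly halves the slot cost, which is exactly the point about the GPU being just one part of the overall package.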

                Google, Meta and Amazon already make their own chips. That’s probably true for DeepSeek as well.

                Google yes, with their TPUs, but the others all use Nvidia or AMD chips for training. Amazon has their Graviton CPUs, which are quite competitive, but I don’t think they have anything on the GPU side. DeepSeek is way too small and new for custom chips; they evolved out of a hedge fund and just use Nvidia GPUs like more or less everyone else.

      • nieceandtows@lemmy.world · 10 days ago

        From what I understand, it’s more that training your own LLMs with the same capabilities costs a lot less with this one than paying to license one of the expensive models. Somebody correct me if I’m wrong.

        • CheeseNoodle@lemmy.world · 9 days ago

          I wouldn’t be surprised if China spent more on AI development than the West did. Sure, here we spent tens of billions while China only invested a few million, but that few million was actually spent on the development, while of the tens of billions, all but $5 went to bonuses and yachts.

      • bobalot@lemmy.world · 9 days ago

        Does it still need people spending huge amounts of time to train models?

        After doing neural networks, fuzzy logic, etc. in university, I really question the whole usability of what is called “AI” outside niche use cases.

      • tburkhol@lemmy.world · 10 days ago

        Exactly. Galaxy brains on Wall Street realizing that nvidia’s monopoly pricing power is coming to an end. This was inevitable - China has 4x as many workers as the US, trained in the best labs and best universities in the world, interns at the best companies, then, because of racism, sent back to China. Blocking sales of nvidia chips to China drives them to develop their own hardware, rather than getting them hooked on Western hardware. China’s AI may not be as efficient or as good as the West right now, but it will be cheaper, and it will get better.

    • givesomefucks@lemmy.world · 10 days ago

      It’s coming, Pelosi sold her shares like a month ago.

      It’s going to crash. Even if not for the reasons she sold, as more and more people hear that she sold, they’re going to sell too, because they’ll assume she has insider knowledge due to her office.

      Which is why politicians (and spouses) shouldn’t be able to directly invest into individual companies.

      Even if they aren’t doing anything wrong, people will follow them and do what they do. Only a truly ignorant person would believe it doesn’t have an effect on other people.

    • SuiXi3D@fedia.io · 10 days ago

      I just hope it means I can get a high end GPU for less than a grand one day.

      • NuXCOM_90Percent@lemmy.zip · 9 days ago

        Prices rarely, if ever, go down and there is a push across the board to offload things “to the cloud” for a range of reasons.

        That said: If your focus is on gaming, AMD is REAL good these days and, if you can get past their completely nonsensical naming scheme, you can often get a really good GPU using “last year’s” technology for 500-800 USD (discounted to 400-600 or so).

      • manicdave@feddit.uk · 9 days ago

        I’m using an RX 6700 XT, which you can get for about £300, and it works fine.

        Edit: try using ollama on your PC. If your CPU is capable, that software should work out the rest.
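        If you want to sanity-check whether a model will fit on your machine before pulling it, a rough back-of-envelope helps. The sketch below assumes 4-bit quantization (common for local builds); the 1.2× overhead factor for the KV cache and runtime is my own guess, not a published figure:

```python
# Back-of-envelope memory estimate for running a quantized LLM locally.
# A model's weights need about (parameters * bits_per_weight / 8) bytes;
# the 1.2x factor is a rough guess for KV cache and runtime overhead.

def model_ram_gb(params_billions: float, bits_per_weight: int = 4,
                 overhead: float = 1.2) -> float:
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

for size in (7, 13, 70):
    print(f"{size}B model @ 4-bit: ~{model_ram_gb(size):.1f} GB")
```

        So a 7B model at 4-bit lands in the ballpark of what a normal laptop can hold in RAM, which is why CPU-only inference is viable at that size.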

    • FooBarrington@lemmy.world · 10 days ago

      If anything, this will accelerate the AI hype, as big leaps forward have been made without increased resource usage.

      • Alphane Moon@lemmy.world · 10 days ago

        Something’s got to give. You can’t spend ~$200 billion annually on capex and get a mere $2-3 billion return on that investment.

        I understand that they are searching for a radical breakthrough “that will change everything”, but there are also reasons to be skeptical of that (e.g. documents revealing that Microsoft and OpenAI defined AGI as something that can earn them $100 billion in annual revenue, as opposed to some specific set of capabilities).
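        The mismatch is stark even as one line of arithmetic, taking the rough figures above at face value (midpoint of the $2-3B range):

```python
# Using the rough figures from the comment above: ~$200B/year of AI capex
# against ~$2-3B/year of return on it (midpoint taken as $2.5B).
capex_per_year = 200e9
revenue_per_year = 2.5e9

coverage = revenue_per_year / capex_per_year
print(f"revenue covers {coverage:.2%} of annual capex")
```

        Even granting these are order-of-magnitude figures, that’s nowhere near breakeven without revenue growing ~100× or capex collapsing.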

  • barsoap@lemm.ee · 10 days ago

    Shovel vendors scrambling for solid ground as prospectors start to understand geology.

    …that is, this isn’t yet the end of the AI bubble. It’s just the end of overvaluing hardware because efficiency increased on the software side; there’s still a whole software-side bubble to contend with.

    • theunknownmuncher@lemmy.world · 10 days ago

      there’s still a whole software-side bubble to contend with

      They’re ultimately linked together in some ways (not all). OpenAI was already losing money on every GPT subscription, even while charging a premium because they had the best product; now that premium must evaporate, because there are equivalent AI products on the market that are much cheaper. This will shake things up on the software side too. They probably need more hype to stay afloat.

    • jj4211@lemmy.world · 9 days ago

      The software side bubble should take a hit here because:

      • The trained model was made available for download and offline execution, versus being locked behind subscription-friendly, cloud-only access. Not the first to do this, but the most famous.

      • It came from an unexpected organization, which throws a wrench in the assumption that one of the few known entities would “win it”.

    • UnderpantsWeevil@lemmy.world · 9 days ago

      …that is, this isn’t yet the end of the AI bubble.

      The “bubble” in AI is predicated on proprietary software that’s been oversold and underdelivered.

      If I can outrun OpenAI’s super secret algorithm with 1/100th the physical resources, the $13B Microsoft handed Sam Altman’s company starts looking like burned capital.

      And the way this blows up the reputation of AI hype-artists makes it harder for investors to be induced to send US firms money. Why not contract with Hangzhou DeepSeek Artificial Intelligence directly, rather than ask OpenAI to adopt a model that’s better than anything they’ve produced to date?

    • meliante@lemmy.world · 9 days ago

      I really think GenAI is comparable to the internet in terms of what it will allow mankind in a couple of decades.

      Lots of people thought the internet was a fad and saw no future for it …

      • barsoap@lemm.ee · 9 days ago

        Lots of techies loved the internet, built it, and were all early adopters. Lots of normies didn’t see the point.

        With AI it’s pretty much the other way around: CEOs saying “we don’t need programmers, any more”, while people who understand the tech roll their eyes.

        • oldfart@lemm.ee · 8 days ago

          Back then the CEOs were babbling about information superhighways while tech rolled their eyes.

        • meliante@lemmy.world · 9 days ago

          I believe programming languages will become obsolete. You’ll still need professionals who are experts in leading the machines, but not nearly as hands-on as at present. The same goes for a lot of professions that currently exist.

          I like to compare GenAI to the assembly line when it was created, but instead of repetitive menial tasks, it’s repetitive mental tasks that it improves/performs.

          • barsoap@lemm.ee · 9 days ago

            Oh great, you’re one of them. Look, I can’t magically infuse tech literacy into you; you’ll have to learn to program and, crucially, understand how much of programming is not about giving computers instructions.

            • meliante@lemmy.world · 9 days ago

              Let’s talk in five years. There’s no point in discussing this right now. You’re set on what you believe you know and I’m set on what I believe I know.

              And, piece of advice, don’t assume others lack tech literacy because they don’t agree with you; it just makes you look like a brat who can’t discuss things maturely and invites the other party to be a prick as well.

              Especially because programming is quite fucking literally giving computers instructions, despite what you believe keyboard monkeys do. You wanker!

              What? You think “developers” are some kind of mythical beings that possess the mystical ability of speaking to the machines in cryptic tongues?

              They’re a dime a dozen, the large majority of “developers” are just cannon fodder that are not worth what they think they are.

              Ironically, the real good ones probably brought about their demise.

              • barsoap@lemm.ee · 9 days ago

                Especially because programming is quite fucking literally giving computers instructions, despite what you believe keyboard monkeys do. You wanker!

                What? You think “developers” are some kind of mythical beings that possess the mystical ability of speaking to the machines in cryptic tongues?

                First off, you’re contradicting yourself: Is programming about “giving instructions in cryptic languages”, or not?

                Then, no: Developers are mythical beings who possess the magical ability of turning vague gesturing full of internal contradictions, wishful thinking, up to right-out psychotic nonsense dreamt up by some random coke-head in a suit, into hard specifications suitable to then go into algorithm selection and finally into code. Typing shit in a cryptic language is the easy part, also, it’s not cryptic, it’s precise.

                • meliante@lemmy.world · 9 days ago

                  You must be a programmer. Can’t understand shit of what you’re told to do and then blame the client for “not knowing how it works”. Typical. Stereotypical even!

                  Read it again moron, or should I use an LLM to make it simpler for your keyboard monkey brain?

          • Strider@lemmy.world · 9 days ago

            That’s not the way it works. And I’m not even against that.

            It still won’t work this way a few years from now.

            • meliante@lemmy.world · 9 days ago

              I’m not talking about this being a snap transition. It will take several years but I do think this tech will evolve in that direction.

              I’ve been working with LLMs since month 1 and in these short 24 months things have progressed in a way that is mind boggling.

              I’ve produced more and better than ever and we’re developing a product that improves and makes some repetitive “sweat shop” tasks regarding documentation a thing of the past for people. It really is cool.

              • Strider@lemmy.world · 9 days ago

                In part we agree. However there are two things to consider.

                For one, the LLMs are pretty much plateauing now, so they are dependent on more quality input, which is basically what they replace. So looking ahead, imo the learning will not keep this up. (In other fields, like nature etc., there’s comparatively endless input for training, so it will keep on working there.)

                The other thing is, as we likely both agree, this is not intelligence. It has its uses. But you said it would replace programming, which in my opinion will never work: we’re missing the critical intelligence element. It might be there at some point. Maybe LLMs will help there, maybe not; we might see. But for now we don’t have that piece of the puzzle, and it will not be able to replace human work with (new) thought put into it.

      • Auli@lemmy.ca · 9 days ago

        Sure, but the .com bubble happened and the internet was still useful. Same with AI: being in a big bubble right now doesn’t mean it won’t be useful.

        • meliante@lemmy.world · 9 days ago

          Oh yes, there definitely is a bubble, but I don’t believe that means the tech is worthless, not even close to worthless.

      • Trainguyrom@reddthat.com · 9 days ago

        I don’t know. In a lot of use cases AI is kinda crap, but there are certain use cases where it’s really good. Honestly, I don’t think people are giving enough thought to its utility in the early-middle stages of creative works, where an img2img model can take the basic composition from the artist and render it, then the artist can go in and modify and perfect it for the final product.

        Also, video games that use generative AI are going to be insane in about 10-15 years. Imagine an open-world game where it generates building interiors and NPCs as you interact with them, even tying the stuff the NPCs say into the buildings they’re in: like an old sailor living in a house with lots of pictures of boats and boat models, or a warrior with tons of books about battle and decorative weapons everywhere, all in throwaway structures that would previously have been closed set dressing. Maybe they’ll even find sane ways to create quests on the fly that don’t feel overly cookie-cutter? Life-changing? Of course not, but definitely a cool technology with a lot of potential.

        Also, realistically, I don’t think there’s going to be long-term use for AI models that need a quarter of a datacenter just to run; they’ll all get tuned down to what can run directly on a phone efficiently. Maybe we’ll see some new accelerators become commonplace, maybe we won’t.

  • Blackmist@feddit.uk · 9 days ago

    Good. That shit is way overvalued.

    There is no way that Nvidia are worth 3 times as much as TSMC, the company that makes all their shit and more besides.

    I’m sure some of my market tracker funds will lose value, and they should, because they should never have been worth this much to start with.

    • CleoTheWizard@lemmy.world · 9 days ago

      It’s because Nvidia is an American company and also because they make final stage products. American companies right now are all overinflated and almost none of the stocks are worth what they’re at because of foreign trading influence.

      As much as people whine about inflation here, the US didn’t get hit as bad as many other countries and we recovered quickly which means that there is a lot of incentive for other countries to invest here. They pick our top movers, they invest in those. What you’re seeing is people bandwagoning onto certain stocks because the consistent gains create more consistent gains for them.

      The other part is that yes, companies who make products at the end stage tend to be worth a lot more than people trading more fundamental resources or parts. This is true of almost every industry except oil.

      • bobalot@lemmy.world · 9 days ago

        It is also because the USA issues the world’s reserve currency and has open capital markets.

        Savers of the world (including countries like Germany and China who have excess savings due to constrained consumer demand) dump their savings into US assets such as stocks.

        This leads to asset bubbles and an uncompetitively high US dollar.

        • Freefall@lemmy.world · 9 days ago

          The current administration is working real hard on removing trust and value of anything American.

          • bobalot@lemmy.world · 8 days ago

            The root problem they are trying to fix is real (systemic trade imbalances), but the way they are trying to fix it is terrible and won’t work.

            1. Only a universally applied tariff would work in theory but would require other countries not to retaliate (there will 100% be retaliation).

            2. It doesn’t really solve the root cause, capital inflows into the USA rather than purchasing US goods and services.

            3. Trump wants to maintain being the reserve currency which is a big part of the problem (the strength of currency may not align with domestic conditions, i.e. high when it needs to be low).

      • CheeseNoodle@lemmy.world · 9 days ago

        The US is also a regulations haven compared to other developed economies, corporations get away with shit in most places but America is on a whole other level of regulatory capture.

  • drascus@sh.itjust.works · 9 days ago

    Okay, seriously, this technology still baffles me. Like, it’s cool, but why invest so much in an unknown like AI’s future? We could invest in people and education and end up with really smart people. For the cost of an education we could end up with smart people who contribute to the economy and society. Instead we are dumping billions into this shit.

    • vga@sopuli.xyz · 9 days ago

      For the cost of an education we could end up with smart people who contribute to the economy and society. Instead we are dumping billions into this shit.

      Those are different "we"s.

    • sudo42@lemmy.world · 9 days ago

      Tech/Wall St constantly needs something to hype in order to bring in “investor” money. The “new technology -> product development -> product -> IPO” pipeline is now “straight to pump-and-dump” (for example, see cryptocurrency).

      The excitement of the previous hype train (self-driving cars) is no longer bringing in starry-eyed “investors” willing to quickly part ways with OPM. “AI” made a big splash and Tech/Wall St is going to milk it for all they can lest they fall into the same bad economy as that one company that didn’t jam the letters “AI” into their investor summary.

      Tech has laid off a lot of employees, which means they are aware there is nothing else exciting in the near horizon. They also know they have to flog “AI” like crazy before people figure out there’s no “there” there.

      That “investors” scattered like frightened birds at the mere mention of a cheaper version means that they also know this is a bubble. Everyone wants the quick money. More importantly they don’t want to be the suckers left holding the bag.

        • sudo42@lemmy.world · 8 days ago

          I follow EV battery tech a little. You’re not wrong that there is a lot of “oh, it’s just around the bend” in battery development, and in tech development in general. I blame marketing for 80% of that.

          But battery technology is changing drastically. The giant cell phone market is pushing battery tech relentlessly. Add in EV and grid storage demand growth and the potential for some companies to land on top of a money printing machine is definitely there.

          We’re in a golden age of battery research. Exciting for our future, but it will be a while before we consumers will have clear best options.

    • AppleTea@lemmy.zip · 9 days ago

      It’s easier to sell people on the idea of a new technology or system that doesn’t have any historical precedent. All you have to do is list the potential upsides.

      Something like a school or a workplace training programme, those are known quantities. There’s a whole bunch of historical and currently-existing projects anyone can look at to gauge the cost. Your pitch has to be somewhat realistic compared to those, or it’s gonna sound really suspect.

    • _chris@lemmy.world · 9 days ago

      Education doesn’t make a tech CEO ridiculously wealthy, so there’s no draw for said CEOs to promote the shit out of education.

      Plus educated people tend to ask for more salary. Can’t do that and become a billionaire!

    • surph_ninja@lemmy.world · 9 days ago

      And you could pay people to use an abacus instead of a calculator. But the advanced tech improves productivity for everyone, and helps their output.

      If you don’t get the tech, you should play with it more.

      • TheFriar@lemm.ee
        link
        fedilink
        English
        arrow-up
        14
        arrow-down
        2
        ·
        8 days ago

        “Improves productivity for everyone”

        Famously, only one class benefits from productivity, while the other generates it. Can you explain what you mean, if you don’t mean capitalist productivity?

        • surph_ninja@lemmy.world
          link
          fedilink
          English
          arrow-up
          5
          arrow-down
          1
          ·
          8 days ago

          I’m referring to output for amount of work put in.

          I’m a socialist. I care about increased output leading to increased comfort for the general public. That the gains are concentrated among the wealthy is not the fault of technology, but rather those who control it.

          Thank god for DeepSeek.

      • fuck_u_spez_in_particular@lemmy.world
        link
        fedilink
        English
        arrow-up
        13
        arrow-down
        2
        ·
        8 days ago

        I get the tech, and still agree with the previous poster. I’d even go so far as to say it probably makes a lot of things worse right now, as it’s generating a lot of bullshit that sounds great on the surface, but in reality is just regurgitated stuff that the AI has no clue about. For example, I’m tired of reading AI-generated text when a hand-written version would be much more precise and have some character at least…

        • surph_ninja@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          8
          ·
          8 days ago

          It’s one thing to be ignorant. It’s quite another to be confidently so in the face of overwhelming evidence that you’re wrong. Impressive.

          • fuck_u_spez_in_particular@lemmy.world
            link
            fedilink
            English
            arrow-up
            4
            arrow-down
            1
            ·
            8 days ago

            confidently so in the face of overwhelming evidence

            That I’d really like to see. And I mean more than the marketing bullshit that AI companies are doing…

            For the record, I was one of the first jumping on the AI hype train (as a programmer and computer scientist with a machine-learning background), following the development of GPT-1 through GPT-4, excited about having to write less boilerplate code, getting help with rough ideas, etc. GPT-4 came close to actually being a help (similar with o1 and Anthropic’s models). But these days I seldom use AI (and I’m observing the same with colleagues and other people I know), because it actually slows me down or gives wrong ideas, and I end up arguing with it just to watch it saturate at a local minimum yet again (i.e. it doesn’t get better, no matter what input I try). So I have to do it myself anyway… (which I should’ve done in the first place…).

            Same is true for the image-generative side (i.e. first with GANs now with diffusion-based models).

            I can get into more detail about transformer/attention-based models and their current plateau phase (i.e. more hardware doesn’t actually make things significantly better; it gets exponentially more expensive to make things slightly better) if you really want…

            I hope we do get a breakthrough, of course, and that a model actually, really learns reasoning, but I fear that will take time, and it might even mean we need a different type of hardware.

            • surph_ninja@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              arrow-down
              1
              ·
              8 days ago

              Any other AI company, and most of that would be legitimate criticism of the overhype used to generate more funding. But how does any of that apply to DeepSeek, and the code & paper they released?

              • fuck_u_spez_in_particular@lemmy.world
                link
                fedilink
                English
                arrow-up
                1
                ·
                8 days ago

                DeepSeek

                Yeah, it’ll be exciting to see where this goes, i.e. whether it really develops into a useful tool. Though I’m slightly cautious nonetheless. It’s not doing something significantly different (i.e. it’s still an LLM); it’s just a lot cheaper and more efficient to train, and open for everyone (which is great).

    • Redex@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      9 days ago

      Look at it in another way, people think this is the start of an actual AI revolution, as in full blown AGI or close to it or something very capable at least. Personally I don’t think we’re anywhere near something like that with the current technology, I think it’s a dead end, but if there’s even a small possibility of it being true, you want to invest early because the returns will be insane if it pans out. Full blown AGI would revolutionize everything, it would probably be the next industrial revolution after the internet.

    • lightnsfw@reddthat.com
      link
      fedilink
      English
      arrow-up
      3
      arrow-down
      1
      ·
      9 days ago

      How would the investors profit from paying for someone’s education? By giving them a loan? Don’t we have enough problems with the student loan system without involving these assholes more?

  • ChiefGyk3D@infosec.pub
    link
    fedilink
    English
    arrow-up
    48
    arrow-down
    1
    ·
    10 days ago

    My understanding is that DeepSeek still used Nvidia GPUs, just older models, and way more efficiently, which was remarkable. I hope to tinker with the open-source stuff, at least with a little Twitch chat bot for my streams that I was already planning to build with OpenAI. It will be even more remarkable if I can run it locally.

    However, this is embarrassing for the Western companies working on AI, especially after the $500B Stargate announcement, as it proves we don’t need such high-end infrastructure to achieve the same results.

    • sunzu2@thebrainbin.org
      link
      fedilink
      arrow-up
      30
      arrow-down
      4
      ·
      10 days ago

      500b of trust me Bros… To shake down US taxpayer for subsidies

      Read between the lines folks

    • Dkarma@lemmy.world
      link
      fedilink
      English
      arrow-up
      6
      arrow-down
      1
      ·
      9 days ago

      It’s really not. This is the AI equivalent of the Viet Cong repurposing US bombs that didn’t explode when dropped.

      Their model is the differentiator here, but they had to figure out something more efficient in order to overcome the hardware shortcomings.

      The US companies will soon outpace this by duplicating the model and running it on faster hardware.

      • Auli@lemmy.ca
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        1
        ·
        9 days ago

        Throw more hardware and power at it. Build more power plants so we can use AI.

    • Cocodapuf@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      arrow-down
      2
      ·
      9 days ago

      My understanding is that DeepSeek still used Nvidia just older models

      That’s the funniest part here: the sell-off makes no sense. So what if some companies are better at utilizing AI than others? It all runs on the same hardware. Why sell stock in the hardware company? (Besides the separate issue of it being totally overvalued at the moment.)

      This would be kind of like a study showing that American pilots were more skilled than European pilots, so investors sold stock in Airbus… Either way, the pilots still need planes to fly…

      • bobalot@lemmy.world
        link
        fedilink
        English
        arrow-up
        11
        ·
        edit-2
        9 days ago

        Perhaps the stocks were massively overvalued and any negative news was going to start this sell off regardless of its actual impact?

        That is my theory anyway.

      • hitmyspot@aussie.zone
        link
        fedilink
        English
        arrow-up
        3
        ·
        9 days ago

        Yes, but if they already have lots of planes, they don’t need to keep buying more planes. Especially if their current planes can now run for longer.

        AI is not going away but it will require less computing power and less capital investment. Not entirely unexpected as a trend, but this was a rapid jump that will catch some off guard. So capital will be reallocated.

  • gerryflap@feddit.nl
    link
    fedilink
    English
    arrow-up
    47
    arrow-down
    3
    ·
    9 days ago

    I’m so happy this happened. This is a real power move from China. The US was really riding the whole AI bubble. By “just” releasing a powerful open-source AI model, they’ve fucked the not-so-open US AI companies. I’m not sure whether this was planned by China or really just a small company doing it because they wanted to, but either way it really damages the Western economy. And it’s given Western consumers a free alternative. A few million dollars invested (if we are to believe the cost figures) for a major disruption.

    • surph_ninja@lemmy.world
      link
      fedilink
      English
      arrow-up
      6
      arrow-down
      20
      ·
      9 days ago

      Socialism/Communism will always outcompete the capitalists. And they know it, which is why the US invades, topples, or sanctions every country that moves towards worker control.

        • surph_ninja@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          5
          ·
          8 days ago

          That you had to qualify it with a date after it had been corrupted by the west, implies that you’re well aware of how well communism served for half a century before that.

          They went from a nation of dirt poor peasants, to a nuclear superpower driving the space race in just a couple of decades. All thanks to communism. And also why China is leaving us in the dust.

            • surph_ninja@lemmy.world
              link
              fedilink
              English
              arrow-up
              2
              arrow-down
              3
              ·
              8 days ago

              Any corrupt leaders are capable of committing genocide. The difference is capitalism requires genocide to continue functioning.

          • houstoneulers@lemmy.world
            link
            fedilink
            English
            arrow-up
            3
            arrow-down
            1
            ·
            edit-2
            8 days ago

            There are many instances of communism failing lmao

            There are also many current communist states that have less freedom than many capitalist states

            Also, you need to ask the Uyghurs how they’re feeling about their experience under the communist government you’re speaking so highly of at the moment.

            • surph_ninja@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              arrow-down
              3
              ·
              8 days ago

              How many of those instances failed due to external factors, such as illegal sanctions or a western coup or western military aggression?

              Which communist states would you say have less freedom than your country? Let’s compare.

              The Uyghur genocide was debunked. Even the US state department was forced to admit they didn’t have the evidence to support their claims. In reality, western intelligence agencies were trying to radicalize the Uyghurs to destabilize the region, but China has been rehabilitating them. The intel community doesn’t like their terrorist fronts to be shut down.

              • houstoneulers@lemmy.world
                link
                fedilink
                English
                arrow-up
                2
                arrow-down
                1
                ·
                8 days ago

                LMAO found the pro-Xi propagandist account

                Either you’re brainwashed, are only reading one-sided articles, or you’re an adolescent with little world experience given how confidently you speak in absolutes, which doesn’t reflect how nuanced the global stage is.

                I’m not saying capitalism is the best, but communism won’t ALWAYS beat out capitalism (as it hasn’t, regardless of external factors, because if those regimes were strong enough they would be able to handle or recover from external pressures), nor does it REQUIRE negatively affecting others, as your other comment says. You’re just delulu.

                Remember: while there may be instances where all versions of a certain class of anything are equal, in most cases they are not. So blanketly categorizing as you have done just reflects your lack of historical perspective.

                • surph_ninja@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  arrow-down
                  1
                  ·
                  8 days ago

                  You should really drop the overconfidence, and re-evaluate your biases and perspectives. Regurgitating western propaganda almost verbatim is not a good sign that you’re on the right path.

      • Bohurt@lemm.ee
        link
        fedilink
        English
        arrow-up
        3
        arrow-down
        1
        ·
        8 days ago

        You don’t even realise how strong capitalism is in China.

        • surph_ninja@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          1
          ·
          8 days ago

          It sounds like you don’t know what “capitalism” means. Market participation exists in other economy types, too. It’s how the means of production are controlled and the profits distributed that defines capitalism vs communism.

          And you don’t lift 800 million people out of poverty under capitalism. Or they’ve done a ridiculously bad job of concentrating profits into the hands of a very small few.

        • surph_ninja@lemmy.world
          link
          fedilink
          English
          arrow-up
          3
          ·
          8 days ago

          Absolutely. More direct democracy. The whole point of representative democracy is issues of time and distance. Now that we can communicate fast and across the globe, average citizens should play a much larger & more active role in directing the government.

  • Justin@lemmy.jlh.name
    link
    fedilink
    English
    arrow-up
    43
    ·
    10 days ago

    Bizarre story. China building better LLMs, and LLMs being cheaper to train, does not mean that Nvidia will sell fewer GPUs when people like Elon Musk and Donald Trump can’t shut up about how important “AI” is.

    I’m all for the collapse of the AI bubble, though. It’s cool and all that all the bankers know IT terms now, but the massive influx of money towards LLMs and the datacenters that run them has not been healthy to the industry or the broader economy.

    • theunknownmuncher@lemmy.world
      link
      fedilink
      English
      arrow-up
      21
      arrow-down
      2
      ·
      edit-2
      10 days ago

      It literally defeats NVIDIA’s entire business model of “I shit golden eggs and I’m the only one that does and I can charge any price I want for them because you need my golden eggs”

      Turns out no one actually even needs a golden egg anyway.

      And… the same goes for OpenAI, who were already losing money on every subscription. Now they’ve lost the ability to charge a premium for their service (anyone can train a GPT-4-equivalent model cheaply, or use DeepSeek’s existing open models), and subscription prices will need to come down, so they’ll be losing money even faster.

      • Justin@lemmy.jlh.name
        link
        fedilink
        English
        arrow-up
        11
        ·
        edit-2
        10 days ago

        Nvidia cards were the only GPUs used to train DeepSeek v3 and R1. So, that narrative still superficially holds. Other stocks like TSMC, ASML, and AMD are also down in pre-market.

          • Justin@lemmy.jlh.name
            link
            fedilink
            English
            arrow-up
            8
            ·
            10 days ago

            Ah, fair. I guess it makes sense that Wall Street is questioning the need for these expensive Blackwell GPUs when the Hopper GPUs are already so good?

            • legion02@lemmy.world
              link
              fedilink
              English
              arrow-up
              5
              ·
              10 days ago

              It’s more that the newer models are going to need less compute to train and run them.

              • frezik@midwest.social
                link
                fedilink
                English
                arrow-up
                8
                ·
                10 days ago

                Right. There’s indications of 10x to 100x less compute power needed to train the models to an equivalent level. Not a small thing at all.

                • NuXCOM_90Percent@lemmy.zip
                  link
                  fedilink
                  English
                  arrow-up
                  5
                  ·
                  edit-2
                  9 days ago

                  Not small but… smaller than you would expect.

                  Most companies aren’t, and shouldn’t be, training their own models. Especially with stuff like RAG where you can use the highly trained model with your proprietary offline data with only a minimal performance hit.
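                  To make the RAG point above concrete, here is a toy sketch of the retrieval step: pick the proprietary document most similar to the user’s question and prepend it to the prompt, so a pre-trained model can answer over local data without any retraining. The bag-of-words cosine similarity stands in for a real embedding model, and all names and documents are illustrative.

                  ```python
                  from collections import Counter
                  from math import sqrt

                  def embed(text: str) -> Counter:
                      """Crude stand-in 'embedding': word-frequency counts.
                      A real RAG system would use a trained embedding model."""
                      return Counter(text.lower().split())

                  def cosine(a: Counter, b: Counter) -> float:
                      """Cosine similarity between two sparse count vectors."""
                      dot = sum(a[w] * b[w] for w in a)
                      norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
                      return dot / norm if norm else 0.0

                  def retrieve(query: str, docs: list[str]) -> str:
                      """Return the document most similar to the query."""
                      q = embed(query)
                      return max(docs, key=lambda d: cosine(q, embed(d)))

                  # Hypothetical proprietary documents the base model never saw.
                  docs = [
                      "Q3 revenue grew 12 percent driven by cloud contracts.",
                      "The on-call rotation schedule is posted every Monday.",
                  ]
                  context = retrieve("what was revenue growth last quarter", docs)
                  # The retrieved context gets prepended to the user's question,
                  # and the combined prompt goes to an off-the-shelf model.
                  prompt = f"Answer using this context:\n{context}\n\nQuestion: what was revenue growth?"
                  ```

                  The heavy lifting stays in the frozen, highly trained model; only this cheap retrieval step touches the private data, which is why most companies don’t need to train anything themselves.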

                  What matters is inference and accuracy/validity. Inference is ridiculously cheap (the reason AI/ML got so popular in the first place), and the latter is a whole different can of worms that industry and researchers don’t want you to think about (in part because “correct” output might still be blatant lies, since it’s based on human data, which is often blatant lies itself…).

                  And for the companies that ARE going to train their own models? They make enough bank that ordering the latest Box from Jensen is a drop in the bucket.


                  That said, this DOES open the door back up for tiered training and the like where someone might use a cheaper commodity GPU to enhance an off the shelf model with local data or preferences. But it is unclear how much industry cares about that.

    • Redditsux@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      1
      ·
      edit-2
      9 days ago

      The US economy has been running on bubbles for decades, using them to fuel innovation and growth. It has survived the telecom bubble, the housing bubble, bubbles in the oil sector multiple times (how do you think fracking came to be?), etc. This is just the start of the AI bubble, because its innovations have yet to have a broad-based impact on the economy. Once AI becomes commonplace in aiding everything we do, that’s when valuations will look “normal”.

  • Kazumara@discuss.tchncs.de
    link
    fedilink
    English
    arrow-up
    36
    arrow-down
    1
    ·
    9 days ago

    Hm even with DeepSeek being more efficient, wouldn’t that just mean the rich corps throw the same amount of hardware at it to achieve a better result?

    In the end I’m not convinced this would even reduce hardware demand. It’s funny that this of all things deflates part of the bubble.

    • UnderpantsWeevil@lemmy.world
      link
      fedilink
      English
      arrow-up
      45
      ·
      edit-2
      9 days ago

      Hm even with DeepSeek being more efficient, wouldn’t that just mean the rich corps throw the same amount of hardware at it to achieve a better result?

      Only up to the point where the AI models yield value (which is already heavily speculative). If nothing else, DeepSeek makes Altman’s plan for $1T in new data-centers look like overkill.

      The revelation that you can get 100x gains by optimizing your code rather than throwing endless compute at your model means the value of graphics cards goes down relative to the value of PhD-tier developers. Why burn through a hundred warehouses full of cards to do what a university mathematics department can deliver in half the time?
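      A toy illustration of that point, under the assumption that the gains come from algorithmic improvements rather than hardware: naive recursive Fibonacci does exponential work, while the same function with a one-line cache does linear work. No AI-specific claim here, just the general shape of “optimize the code, not the cluster”.

      ```python
      from functools import lru_cache

      def fib_naive(n: int) -> int:
          """Exponential-time recursion: recomputes the same subproblems."""
          return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

      @lru_cache(maxsize=None)
      def fib_memo(n: int) -> int:
          """Same recurrence, but each value is computed only once."""
          return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

      # fib_naive(30) triggers ~2.7 million recursive calls;
      # fib_memo(30) computes only 31 distinct values.
      # A GPU 100x faster closes far less of that gap than the
      # one-line caching change does.
      ```

      The ratio between the two grows without bound as the problem gets bigger, which is why an algorithmic insight from a mathematics department can beat a warehouse of cards.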

      • AppleTea@lemmy.zip
        link
        fedilink
        English
        arrow-up
        8
        ·
        edit-2
        9 days ago

        you can get 100x gains by optimizing your code rather than throwing endless compute at your model

        woah, that sounds dangerously close to saying this is all just developing computer software. Don’t you know we’re trying to build God???

        • UnderpantsWeevil@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          9 days ago

          Altman insisting that once the model is good enough, it will program itself was the moment I wrote the whole thing off as a flop.

    • peereboominc@lemm.ee
      link
      fedilink
      English
      arrow-up
      13
      ·
      9 days ago

      Maybe, but it also means that if a company needs a datacenter with 1000 GPUs to meet its AI demand, it will now buy 500.

      Next year it might need more, but by then AMD could have better GPUs.

    • mapumbaa@lemmy.zip
      link
      fedilink
      English
      arrow-up
      6
      arrow-down
      1
      ·
      edit-2
      9 days ago

      It will probably not reduce demand. But it will for sure make it impossible to sell insanely overpriced hardware. Now I’m looking forward to buying a PC with a Chinese open-source RISC-V CPU and GPU. Bye bye Intel, AMD, ARM, and Nvidia.

  • index@sh.itjust.works
    link
    fedilink
    English
    arrow-up
    35
    arrow-down
    1
    ·
    9 days ago

    It still relies on Nvidia hardware, so why would it trigger a sell-off? Also, why is all the media picking up this news? I smell something fishy here…

    • Railcar8095@lemm.ee
      link
      fedilink
      English
      arrow-up
      31
      ·
      9 days ago

      The way I understood it, it’s much more efficient so it should require less hardware.

      Nvidia will sell that hardware, an obscene amount of it, and line will go up. But it will go up slower than nvidia expected because anything other than infinite and always accelerating growth means you’re not good at business.

      • rumba@lemmy.zip
        link
        fedilink
        English
        arrow-up
        5
        ·
        9 days ago

        Back in the day, that would tell me to buy green.

        Of course, that was also long enough ago that you could just swap money from green to red every new staggered product cycle.

    • PhAzE@lemmy.ca
      link
      fedilink
      English
      arrow-up
      18
      ·
      9 days ago

      It requires only 5% of the hardware that OpenAI needs to do the same thing. That can mean fewer top-end cards, and it can also run on less powerful cards (not top of the line).

      Should their models become standard or more commonly used, Nvidia’s sales will drop.

      • b34k@lemmy.world
        link
        fedilink
        English
        arrow-up
        7
        ·
        9 days ago

        Doesn’t this just mean that now we can make models 20x more complex using the same hardware? There’s many more problems that advanced Deep Learning models could potentially solve that are far more interesting and useful than a chat bot.

        I don’t see how this ends up bad for Nvidia in the long run.

        • Isthisreddit@lemmy.world
          link
          fedilink
          English
          arrow-up
          8
          ·
          9 days ago

          Honestly none of this means anything at the moment. This might be some sort of calculated trickery from China to give Nvidia the finger, or Biden the finger, or a finger to Trump’s AI infrastructure announcement a few days ago, or some other motive.

          Maybe this “selloff” is masterminded by the big wall street players (who work hand-in-hand with investor friendly media) to panic retail investors so they can snatch up shares at a discount.

          What I do know is that “AI” is a very fast moving tech and shit that was true a few months ago might not be true tomorrow - no one has a crystal ball so we all just gotta wait and see.

    • ArchRecord@lemm.ee
      link
      fedilink
      English
      arrow-up
      16
      ·
      9 days ago

      Here’s someone doing 200 tokens/s (for context, OpenAI doesn’t usually get above 100) on… A Raspberry Pi.

      Yes, the “$75-$120 micro computer the size of a credit card” Raspberry Pi.

      If all these AI models can be run directly on users devices, or on extremely low end hardware, who needs large quantities of top of the line GPUs?

      • aesthelete@lemmy.world
        link
        fedilink
        English
        arrow-up
        18
        arrow-down
        1
        ·
        edit-2
        9 days ago

        Thank the fucking sky fairies actually, because even if AI continues to mostly suck it’d be nice if it didn’t swallow up every potable lake in the process. When this shit is efficient that makes it only mildly annoying instead of a complete shitstorm of failure.

      • adoxographer@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        ·
        9 days ago

        While this is great, training is where the compute is spent. The news is also about R1 being trainable, still on an Nvidia cluster, but for $6M instead of $500M.

        • orange@communick.news
          link
          fedilink
          English
          arrow-up
          2
          ·
          9 days ago

          That’s becoming less true. The cost of inference has been rising with bigger models, and even more so with “reasoning models”.

          Regardless, at the scale of 100M users, big one-off costs start looking small.

        • vrighter@discuss.tchncs.de
          link
          fedilink
          English
          arrow-up
          2
          ·
          9 days ago

          If, on a modern gaming PC, you get “breakneck” speeds of 5 tokens per second, then inference is actually quite energy-intensive too. 5 per second of anything is very slow.
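          A quick back-of-envelope check on why 5 tokens/s feels slow: the time to stream one answer at different generation rates, using an assumed answer length of 500 tokens (a few paragraphs). The rates are illustrative, not benchmarks.

          ```python
          def seconds_to_generate(tokens: int, tokens_per_sec: float) -> float:
              """Wall-clock time to stream a response at a steady generation rate."""
              return tokens / tokens_per_sec

          ANSWER_TOKENS = 500  # assumed length of a typical multi-paragraph answer

          # Compare a slow local setup, a decent local setup, and the
          # ~200 tok/s figure mentioned upthread for the optimized model.
          for rate in (5, 50, 200):
              t = seconds_to_generate(ANSWER_TOKENS, rate)
              print(f"{rate:>3} tok/s -> {t:.1f} s per answer")
          ```

          At 5 tok/s that single answer takes 100 seconds of a GPU running flat out, which is why per-token efficiency, not just “can it run”, drives the energy cost of inference.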

    • teegus@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      10
      arrow-down
      1
      ·
      9 days ago

      A year ago the price was $62, now after the fall it is $118. Stocks are volatile, what else is new? Pretty much non-news if you ask me.

    • 𝓔𝓶𝓶𝓲𝓮@lemm.ee
      link
      fedilink
      English
      arrow-up
      2
      ·
      edit-2
      9 days ago

      And you should. We’re in the middle of an internet world war; it’s not one fishy thing but digital rotten eggs thrown around by the hundreds.

      The only way to remain sane is to ignore it and scroll on. There is no winning against geopolitical behemoths as a lone internet adventurer. It’s impossible to tell what’s real and what isn’t;
      the first casualty of war is truth.

    • atempuser23@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      9 days ago

      Well, there are also threats of huge tariffs on all TSMC chips entering the US. That seems like the actual fuel for the sell-off.

  • Mac@mander.xyz
    link
    fedilink
    English
    arrow-up
    29
    ·
    9 days ago

    What the fuck are markets when you can automate making money on them???

    I’ve been WTF about the stock market for a long time, but now it’s obviously a scam.

    • thistleboy@lemmy.world
      link
      fedilink
      English
      arrow-up
      39
      arrow-down
      1
      ·
      9 days ago

      The stock market is nothing more than a barometer for the relative peace of mind of rich people.

      • nomy@lemmy.zip
        link
        fedilink
        English
        arrow-up
        1
        ·
        8 days ago

        Economics is a social science not a hard science, it’s highly reactive to rumors and speculation. The stock market kind of does just run on vibes.

  • vga@sopuli.xyz
    link
    fedilink
    English
    arrow-up
    29
    ·
    edit-2
    9 days ago

    I should really start looking into shorting stocks. I was looking at the news and Nvidia’s stock and thought, “huh, the stock hasn’t reacted to this news at all yet, I should probably short it”.

    And then proceeded to do fuck all.

    I guess this is why some people are rich and others are like me.

    • Knock_Knock_Lemmy_In@lemmy.world
      link
      fedilink
      English
      arrow-up
      10
      ·
      9 days ago

      It’s pretty difficult to open a true short position. Providers like Robinhood offer contracts for difference, which are subject to their TOS.

    • peregrin5@lemm.ee
      link
      fedilink
      English
      arrow-up
      8
      ·
      edit-2
      8 days ago

      It’s been proven that people who do fuckall after throwing their money into mutual funds generally fare better than people actively monitoring and making stock moves.

      You’re probably fine.

      I never bought NVIDIA in the first place so this news doesn’t affect me.

      If anything now would be a good time to buy NVIDIA. But I probably won’t.

      • MutilationWave@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        8 days ago

        The vast majority of my invested money is in SPY. I had a lot of “money” wiped out yesterday. It’s already trending back up. I’m holding for now.

  • RxBrad@infosec.pub
    link
    fedilink
    English
    arrow-up
    26
    arrow-down
    1
    ·
    8 days ago

    Okay, cool…

    So, how much longer before Nvidia stops slapping a “$500-600 RTX XX70” label on a $300 RTX XX60 product with each new generation?

    The thinly-veiled 75-100% price increases aren’t fun for those of us not constantly-touching-themselves over AI.

  • Teknikal@eviltoast.org
    link
    fedilink
    English
    arrow-up
    26
    arrow-down
    1
    ·
    9 days ago

    Was watching BBC News interview some American guy about this, and wow, they were really pushing that it’s no big deal and DeepSeek is way behind and a bit of a joke. He claimed they weren’t under cyber attack, they just couldn’t handle the traffic, etc.

    Kinda making me root for China honestly.

    • atempuser23@lemmy.world
      link
      fedilink
      English
      arrow-up
      5
      ·
      9 days ago

      He’s likely not wrong. It’s too soon to know how well it lives up to the hype. It could be a case of “we had a $6 million budget” while ignoring the free data-center use that pre-computed the data. Startups make recklessly optimistic promises that can’t be delivered on all the time.