

Yeah, the vision is very unclear to me, at least from what they've announced publicly. I've seen people say the goal is achieving AGI, but I'm not sure what that even means.
From the internal docs leak, it seems the company is totally capitalist-brained and doesn't have a clear definition of AGI either.
According to leaked documents obtained by The Information, the two companies came to agree in 2023 that AGI will be achieved once OpenAI has developed an AI system that can generate at least $100 billion in profits.
https://gizmodo.com/leaked-documents-show-openai-has-a-very-clear-definition-of-agi-2000543339
Again, this is unclear. How is it going to generate profits?
If it's supposed to replace Google search, one avenue I can see is incorporating ads into the LLM answers, which just degrades the product. But I suppose if they hit a mass market, they could create a sort of walled garden that keeps people on it.
Another way is that it totally replaces workers. I don't think this is possible; it's AI idealism.
The other way is to make it a product that increases productivity. In that case you can still replace workers, because the average worker becomes more productive and a company can do more with less.
But I'm just speculating; any of these could be totally wrong.
I wish that OpenAI and their ilk would die already so I could begin to make sense of the world
Ignoring the incoming climate catastrophe, I think the only way for AI to die is for something else new and shiny to come along, so that VCs stop throwing their money into AI-buzzword bullshit startups and start throwing it into that new industry. I think all these VCs think the same way and want the next "internet." It's also why I think so much money was thrown into crypto: even though it was BS, it had a next-big-thing kind of vibe.
I’m hoping it doesn’t even get to that point and we can have some kinda revolution in the west because, fuck, we’re all dying so that these assholes can maintain their profits.
100% agree that education focuses way too much on test scores. I went through a lot of college realizing that "I don't have to actually learn this stuff, I just have to know what questions will be asked on the test and have good answers for those." When you approach it that way as a student, you don't actually go through the process of really learning the material and cultivating your mind.

After spending a lot of time out of college and reading a lot more, I realize that true education is having the space to acquire knowledge and wrestle with it until you really understand it, and socializing with others about the material. You kind of do this when studying for tests, which I guess is why tests have been around for so long.

In my time, though, there was the internet, which was just kind of like an extended library that was easier to search, and I realize it saved me the hassle of needing to socialize with people about the material, which I think is pretty important when learning. LLMs are probably going to result in worse outcomes, though maybe test scores will remain the same.
I had a friend in college who I thought was very intelligent, because he would take basically every assignment or exam and distill it down to what needed to be done to fit the grading system. I realize now just how harmful that approach is to truly educating yourself, yet it seemed right at the time because that is how the education system is designed (and it relates to capitalism, because people use education to get a job, and to get a good job you need good grades, etc.).
It's also pretty telling whenever I ask someone if they've ever studied for a test and then forgotten all the material right after. That contradicts the whole purpose of education.