“Blockchain engineer” isn’t a job.
It was “Blockchain” in 2017. “NFT” in 2020. “AI” in 2023. In a few years, there will be a new buzzword that companies throw a bunch of money at in hopes of being on the leading edge of the ‘next big thing.’
To me it looks like an overestimation of the tech's capabilities. Same kind of thinking that led to lawyers submitting fake cases as support in court. The current tech can be useful, but its output has to be verified and generally tweaked a bit to be good enough. It certainly has room for improvement in quality, and in just not lying. Real-world use also raises copyright questions about what the training data was. Applying it to anything creative is questionable and mostly feels like uninspired remixing.
Also the whole graphic is kinda suspect to me when “Blockchain engineers” is a job category and it’s produced by an org working on AI.
I love how specific the labor jobs on the left are and the right side is like… All mathematicians.
This image/report doesn’t make much sense – probably it was generated by chatGPT itself.
- “What makes your job exposed to GPT?” – OK I expect a list of possible answers:
- “Low wages”: OK, having a low wage makes my job exposed to GPT.
- “Manufacturing”: OK, manufacturing makes my job exposed to GPT. …No wait, what does that mean?? You mean if my job is about manufacturing, then it’s exposed to GPT? OK but then shouldn’t this be listed under the next question, “What jobs are exposed to GPT?”?
- …
- “Jobs requiring low formal education”: what?! The question was “what makes your job exposed to GPT?”. From this answer I get that “jobs requiring low formal education make my job exposed to GPT”. Or I get that who/whatever wrote this knows no syntax or semantics. OK, sorry, you meant “If your job requires low formal education, then it’s exposed to GPT”. But then shouldn’t this answer also be listed under the next question??
- “What jobs are exposed to GPT?”
- “Athletes”. Well, “athletes” semantically speaking is not a job; maybe “athletics” is a job. But who gives a shirt about semantics? There’s chatGPT today, after all.
- The same with the rest. “Stonemasonry” is a job, “stonemasons” are the people who do that job. At least the question could have been “Which job categories are exposed to GPT?”.
- “Pile driver operators”: this very specific job category is thankfully Low Exposure. “What if I’m a pavement operator instead?” – sorry, you’re out of luck then.
- “High exposure: Mathematicians”. Mmm… wait, wait. Didn’t you say that “Science skills” and “Critical thinking skills” were “Low Exposure”, in the previous question?
Icanhazcheezeburger? 🤣
(Just to be clear, I’m not making fun of people who do any of the specialized, difficult, and often risky jobs mentioned above. I’m making fun of the fact that the infographic is so randomly and inexplicably specific in some points.)
I’ve seen GPT struggling with pretty basic maths and “abstract” tasks such as making the letters add up in an anagram. Math requires insight that a language model cannot possess. I don’t really get why people like infographics so much. The format usually just distracts from the data presented, which is convenient given that the data is usually garbage too.
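Funny thing is, the anagram check it flubs is trivial to do deterministically. A minimal sketch (just comparing letter multisets, ignoring case and non-letters):

```python
from collections import Counter

def is_anagram(a: str, b: str) -> bool:
    """True if a and b use exactly the same letters, ignoring case and non-letters."""
    def letters(s: str) -> Counter:
        return Counter(c for c in s.lower() if c.isalpha())
    return letters(a) == letters(b)

print(is_anagram("listen", "silent"))                 # True
print(is_anagram("eleven plus two", "twelve plus one"))  # True
print(is_anagram("apple", "pear"))                    # False
```

Twenty lines of code gets it right every time; a language model predicting tokens has no such guarantee.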
> “What makes your job exposed to GPT?” – OK I expect a list of possible answers:

It’s not bad.
There’s one thing that people tend to neglect that I like to remember: it’s going to be a while yet before an AI can walk up to your door, knock, come in, find the specific nature of a plumbing/electrical/HVAC or whatever problem, and then diagnose and fix it. And then get safely home without getting hit by a truck or vandalized by bored teenagers, or both.
That’s such a complex suite of different “problems” that we’re going to need nothing less than a general AI to navigate them all. Thus, one of the last jobs that’ll be replaced is various kinds of repair professionals that do house calls. The constant novelty of the career, where every call is its own unique situation, is a nightmare for a current-method AI.
An orchestrator AI could determine which context it’s in, with specialized AIs running for those tasks.
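The routing pattern being suggested looks roughly like this. Purely a toy sketch: the keyword classifier and specialist handlers are made-up stand-ins for what would really be separate models.

```python
def classify(request: str) -> str:
    """Stand-in for an orchestrator model that labels the problem domain."""
    domains = {
        "plumbing": ("leak", "pipe", "drain"),
        "electrical": ("breaker", "outlet", "wiring"),
        "hvac": ("furnace", "thermostat", "vent"),
    }
    for domain, keywords in domains.items():
        if any(k in request.lower() for k in keywords):
            return domain
    return "unknown"

# Stand-ins for specialized models, one per domain.
SPECIALISTS = {
    "plumbing": lambda r: f"[plumbing AI] diagnosing: {r}",
    "electrical": lambda r: f"[electrical AI] diagnosing: {r}",
    "hvac": lambda r: f"[hvac AI] diagnosing: {r}",
    "unknown": lambda r: "[orchestrator] escalate to a human",
}

def handle(request: str) -> str:
    """Orchestrator: classify the context, then dispatch to a specialist."""
    return SPECIALISTS[classify(request)](request)

print(handle("There's a leak under my kitchen sink"))
```

Which is exactly where the next comment's objection bites: every misclassification at the routing step, and every situation no specialist covers, falls through to the "unknown" bucket.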
You’re going to have an unacceptably high failure rate as you attempt to trial-and-error your way through all the lower-probability problems. Meanwhile, independent research paths aiming at general AI, which absolutely could handle all these problems, are racing you.