I had hoped that, since most younger adults today grew up with computers, the average person would have a pretty good understanding of how they work. I never expected everyone to be a programmer or sysadmin, of course, but to have a general sense of things like whether data is stored on their device or remotely, how to tell whether an app install is risky, and whether a prompt requesting permissions, a password, etc. is reasonable.
For the most part, I don’t think that has happened. The average person doesn’t know how to use a computer and isn’t going to learn.
All that happened by design. The trend for decades has been to remove the user from the internal workings of the computer. This paved the way for expensive support packages, geek squads, and genius bars.
If we look at cars as an example, the future of computing looks grim. Who’s to say that there won’t be leased laptops with built in features behind paywalls in the next 10 years?
I don’t think we need a sinister plan to explain how we got where we are.
Most people are interested in some outcome, and want the easiest process to achieve it, not to learn about the process. They want to play a computer game, not learn about graphics drivers. They want to take a picture and send it to their friends, not learn about communications protocols or camera settings.
It’s not just tech. They want to cut their food, not learn to sharpen knives. They want to drive to their destinations, not maintain their cars. Maintenance-free tends to outsell serviceable in most product categories.
Geek Squad didn’t come about because people didn’t have the ability to access the inner workings of their computers, but because they didn’t want to put in the effort to learn. Getting the defaults right so most people don’t have to change settings before your product is useful is good design even when your product offers lots of access to the inner workings.
I do, however, see the trend of software requiring remote attestation of the OS it’s running on as sinister. Google even recently tried to bring that to the web.
I’m afraid peak computer literacy and hygiene is behind us now. Younger folks are so used to everything just working that the vast majority don’t care or aren’t willing to find out how things work. (Don’t get me wrong, the vast majority of boomers, gen-X, and millennials aren’t much better, but they tend to have more of a healthy suspicion because of their analog youths.)
I work at a major university. Everything has become a black box, and now when there is no output, students born circa 2002-2006, who are otherwise very bright, don’t know how to troubleshoot it.
Is it possible this is because of Apple, though? It feels like a whole generation is coming of age having been told they were too dumb to figure out settings and should just let papa Apple take care of all that nerd shit.
Nope, that would make it a mostly US thing, but it’s not