So, in other words: which of your core beliefs do you think has the highest likelihood of being wrong? And by wrong, I don’t necessarily mean the exact opposite - just that the truth is significantly different from what you currently believe it to be.
That people can change through conversations. It’s tough to accept, but most people only change when forced to.
People or beliefs?
I’ve changed my mind many times based on online discussions.
Beliefs. I’ve changed my mind too, but it seems to be the exception.
I’ve noticed two types on this: stick-in-the-muds and peak-hunters.
Stick-in-the-muds latch on to the first version of a belief they properly encounter. They will stubbornly hang on to that for as long as possible.
Peak-hunters are the opposite: they will rapidly change beliefs to maximise results and find truth.
Interestingly, after some time, the two groups look almost identical. The peak-hunters tend to find the ‘best’ version of their belief, based on their existing memeplex. To budge them, you need to show that a different belief is better on their rankings (not yours). This is hard when they have already maximised it. Without knowing how they are weighing things, they can look like stick-in-the-muds.
The biggest tell is to ask why they believe what they do. If they have a reasonably comprehensive answer, they are likely peak-hunters. Stick-in-the-muds generally can’t articulate why their belief is better, beyond common sound bites.
That people are not wilfully stupid. The last 10 years have proved that people will act against their own interests if TV tells them to.
I can’t think of any that I’d be particularly surprised by at this point.
That people are fundamentally benevolent to one another. Obviously it can be trained out of you by circumstance, overcome by self-interest, and mental illness is a thing, but I think people innately care for one another. It’s why dehumanization is the first step to committing atrocities.
But if someone offered proof that I’m wrong that might be the least surprising thing that happened all week. And if I’m wrong, the evil-doers are sub-human and should be culled without mercy until I am right.
My deepest core belief is that there is a non-zero likelihood (which may be quite high) that everything I think I know about the world is wrong.
If it were proven to me beyond a doubt that something I believe is correct, I would probably think there was still a possibility that the proof was wrong, and go on with my day.
Would be interested in a list of past facts that turned out to be wrong.
That all living things are worthy of my compassion. If the millions of conservatives out there somehow prove me wrong… then all attempts at civilization are doomed to collapse and we’re reverting back to feudal times.
For me it would be that lies are in many cases morally justifiable.
My current belief is that lying is never right unless you’re literally using it as a form of self-defence, as an alternative to physical violence. However, I also tend to believe that absolute beliefs are virtually always wrong, and these two beliefs conflict. I can at least think of a few extreme scenarios where a white lie seems justifiable even when you’re not in danger. For example: a dying person showing you their painting, and you complimenting it despite not liking it.
That we can build a sane, rational society.
One could argue that, by historical standards, we’re already closer than not. How sane would you say is sane enough?
I thought that Western-style democratic republics were leading the world toward purely secular forms of government, but yet another group of sociopaths has managed to take power. They have distracted the science-illiterate majority into petty conflicts based on different versions of magical thinking.
So, “sane” would mean that we don’t elevate the least sane among us (sociopaths) into positions of power. “Rational” would mean that public policy decisions are mostly made based on evidence, rather than fundamentally irrational belief systems.
I fear that we are barely-sentient primates doomed to repeat the same awful mistakes, when simple, obvious solutions are within our grasp.