Rationality doesn't stand a chance against emotions
It looks like we are mostly through the Covid19 pandemic, and there has been a lot of blame going around recently. The same is true for the global energy crisis brought about by the war in Ukraine. (The blame going around, that is, not the being through it.)
It has become increasingly clear to me that different nations’ handling of these situations once again revealed something fundamentally at odds with our usual beliefs about rationality and how we act:
It is the basic human operating system to prioritize emotions and rationalize later on.
Emotions rule
It increasingly looks like the differences in our reactions are best explained by differences in what we fear:
When you fear climate change more than anything else, you will be willing to sacrifice all of the economy to combat it. When you are at the end of your working life and expect a pension, changing all of the economy is probably the last thing you want to do. You fear change. Should you weigh the benefits and costs of anything related to energy, the industry that powers all other industries? Certainly! Do we have an “objective” way to do it? Nope. Even translating policy effects into money doesn’t help, because, OBVIOUSLY, in this special case I care about, there is some externality that is not captured. And some negative aspect is certainly not weighed with the “right” discount factor. It’s easy to find post-hoc rationalizations to do what we wanted to do anyway. And it takes a lot of discipline to even try to overcome such tendencies.
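To make the discount-factor point concrete with made-up numbers: a damage of 100 (in whatever unit) that arrives 50 years from now is worth about 100 / 1.07^50 ≈ 3 today at a 7% discount rate, but about 100 / 1.01^50 ≈ 61 at a 1% rate. One hard-to-justify parameter swings the “objective” cost-benefit result by more than an order of magnitude, which is exactly the kind of lever a motivated analyst can lean on.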
Take Covid19. When you are 70, your risk profile looks very different from when you are 7 or 27. But odds are that people around 70 are holding the reins of power.
It makes sense for those in power to want everybody else to behave in ways that minimize risk for exactly those in power. So when the options are “Everybody has to wear masks” and “Only the vulnerable have to wear masks”, people in the “vulnerable” and “mighty” categories might opt for the former. They don’t have anything to lose from that (it’s masks for them either way), but there might be a benefit from other people wearing masks. The costs of that decision, if any, will be delayed and distributed.
The same goes for vaccinations; given my current reading (which should mean little to you), it was probably a good decision for people older than 60 to get vaccinated against the original and Alpha variants while these were dominant. So it’s vaccinations for them anyway; why not make everybody else do it as well? It might give the powerful yet vulnerable group some advantage (maybe even a monetary one, although I of course have no way of proving that) if younger people are vaccinated as well. Maybe their viral load is lower during an infection, which might lower the odds of them spreading the virus. To the powerful, yet more vulnerable, people. This effect exists even if a vaccine does not stop an infected person from spreading the virus. See David Deutsch’s explanation of that:
Suppose person1 meets person2. The probability that this infects p2 depends on whether p1 is infected. If p1 isn't, that probability is zero. https://t.co/USpwgybNcY
— David Deutsch (@DavidDeutschOxf) January 16, 2023
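To spell the argument out with invented numbers: the chance that a single meeting infects p2 is roughly P(p1 is infected) × P(p1 transmits, given p1 is infected). Suppose a vaccine does nothing to the second factor, which stays at 50%, but cuts the first from 10% to 5%. The chance per meeting still drops from 5% to 2.5%. Lowering the probability of being infected at all lowers onward spread, even if the vaccine does not make an infected person any less contagious.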
So who behaves irrationally? The powerful, older people wanting everybody to wear masks and get vaccinated? The younger people being forced to comply, those opting in voluntarily to increase the odds of protecting a vulnerable family member, or those fighting any mandate? Well, neither group. Everybody is being rational FOR THEMSELVES.1 Does that shift risks? Do other people have to bear costs they wouldn’t want to bear? Sure! Is that bad? Depends on who is doing the analysis, I guess.
Analysis
If you “analyze” a problem, you cut off the irrelevant parts. Or at least you try. But that is rather subjective, sometimes even arbitrary. (Like my realization that “rationality” seems downstream of “emotions”. Could be wrong, but it explains a lot for me at the moment.) You might think that some pollution is OK if it reduces poverty. Another person might think you are a criminal for even proposing that, because “you never know” what these pollutants might do in the world. One is afraid of poverty, someone else might be afraid of any impact of humanity on the planet. What we see as relevant, when we see it as relevant, who we think can be sacrificed, who we think is allowed to do the sacrificing: all of these are probably driven by emotions, tradition, culture, and childhood trauma to a far greater degree than by rationality. People might be motivated by love, hate, or fear alike.
Information asymmetry
Do we know how to make decisions in these situations? Not really. We know how an individual actor ought to behave given their beliefs about the state of the world and the effects of their behavior on it. That doesn’t mean they are right. It doesn’t mean that this behavior is good, let alone optimal, for anyone else. So who should rule? Who should be made to obey? And which beliefs and probabilities should you include? Do you model “other people want control over me”? Do you discount it with “but it might be good for me”? Do you include “I only have 5 years left to live. I want those years to be great, I don’t care after that”? Or do you include “my children”? In the same way as “your children”? Does it matter what your relationship with your family is like? Should you generalize your experience (good or bad)? If it’s not your personal experience, whose should you use? How do you know when it is OK to use a proxy for a question (for example the recency proxy for probabilities) and when it pays off to do actual research? How do you even know that there is relevant research on the question at hand? When do you ask an expert?
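A toy example of that individual calculus (all numbers invented): say a precaution costs 1 unit of well-being and the bad outcome costs 1,000 units. For someone whose personal risk of the bad outcome is 5%, the expected loss from doing nothing is 0.05 × 1,000 = 50, so taking the precaution is clearly “rational”. For someone whose risk is 0.01%, the expected loss is 0.1, so skipping it is just as “rational”. Same decision rule, same arithmetic, opposite conclusions, and we haven’t even started arguing about whose costs and whose probabilities belong in the formula in the first place.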
Experts
But which expert should you trust? An expert is someone who has been trained (hopefully by experienced people) on a set of past problems and their solutions. They know how to handle the problems of the past. Great.
What is not so great is that hardly any two problems are the same. There are no two identical viruses. There are no two identical situations. Every new situation poses new challenges. Sometimes we get lucky and past solutions work on new problems (for example, applying established norms to bridges built in areas where you have built before, on ground conditions you have dealt with before, over spans you have crossed before, etc.).
When we are not lucky, we need to find new solutions to new problems. And then all bets are off. We have to grope around in the dark and grab everything that might help us. It’s not necessarily the same type of people who are experts on the old and innovators of the new. The problem is: most of the new stuff fails. It takes a lot of time to build something new. So the innovators will be wrong in more than 9 out of 10 cases. But when they are right, the payoff is gigantic. If you could predict even slightly better than random chance when that is going to happen, you’d be filthy rich.
But we cannot predict that! So is it rational to trust “experts”? The answer, as ALWAYS, is: it depends! Is it something that has been dealt with before? They can probably handle it. Is it closely related? That’s dicier, but they can still probably handle it. Is it new? Odds are they don’t know the solution. They will have to find it. That means “experts” will be wrong. A lot. But so will anybody else. Because it is new. We don’t know what we don’t know. So we cannot predict how to handle it. There could be a rogue doctor, a reclusive genius, etc. coming up with a solution. Experts are also not free of biases, for financial, political, ideological, religious, and all the other reasons that “normal” people have to deal with themselves.
Who are you to tell whether their solution is good and their approach unbiased? How do you even know whether the problem is “like an old one”? How do you think you can judge the quality of their work? Have you worked in that field before? Or on related problems? If not, you will have a really hard time judging who is right. And an even harder time predicting whose approach might turn out to be right.
Predictions
It is hard to distinguish a wrong prediction from one where the circumstances or context changed. Is Paul Ehrlich consistently wrong (as his critics claim), or is he heroically raising the alarm early enough to change our ways? Is he merely warning? “If this trend persists, then doom” is a prediction, and as a conditional it is probably often true as a matter of logic, even though it is not necessarily factually true. If the trend does not persist, then “not necessarily doom”. But then the prediction is not wrong either; it has just become an irrelevant hypothetical. That makes true predictions of the kind “If it rains enough, the ground will be wet for a period of time under normal circumstances” often trivial.
The rub is that because we are interested in the latter part, we often assume that the predictor is guaranteeing that the conditions will hold, or that they really believe the “then” part is going to happen. But that is probably a fault on the part of the audience… or skillful manipulation by the predictor.
It’s even harder to prove probabilistic predictions wrong. “There is a 99.99999% chance that X will become president in the next election.” How would you even begin to disprove such a prediction? There is only one election. If “Y” becomes president, then what? Did the one-in-ten-million event happen? Was the reasoning process of the predictor faulty? No way to tell from the result.
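A sketch of why (as I read it): a probability is a claim about how a forecaster performs across many comparable forecasts, and a single election gives you exactly one resolved case. If X wins, the outcome is consistent with almost any forecast above 50%, so it barely separates good reasoning from a lucky guess. If Y wins, the miss looks damning, but strictly it cannot distinguish a faulty process from the one-in-ten-million event actually occurring. Forecasters can only really be graded over many predictions, with something like a calibration check or a Brier score, and dramatic one-off events don’t come in bulk.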
It’s not even easy to tell who WAS right afterwards!
Data
Evidence from data is also tricky. Given the same set of data, you can slice it in many ways. You cannot even say what is relevant, because relevance (in the common usage) is not a well-defined statistical term. Is it the correct data? Is it capable of answering the question that is being asked? Does it test the right people in the right way at the right time? Does it administer the right dosage? Does it follow the participants long enough? And who is to say what the right way even is?
Science is the process of putting forth a hypothesis and having it criticized. Everything “science” knows is tentative. When a better explanation comes along, it supersedes what was known before. Scientists accept that they don’t have access to a final, capital-T Truth. Real scientists try to find small pockets of reality that they can prod and probe until they get a handle on them.
Some audacious ones might venture to generalize the results. But good scientists typically know that this is an assumption that has to stand the test of reality. But who wants that degree of humility? Of “probabilities” and “current knowledge”? We want the Truth, now. With 100% certainty.
Priests
And we often act as if some people, bestowed by some institution with degrees, titles, and honors, were “qualified” to deliver it. That relieves our anxiety. We are no longer in the world of uncertainty that we are responsible for living and working with. Paaah! That is only for dumb folks.
No, we are “following the science”! Coincidentally, it will often conform to our preconceived notions or amplify the danger of something that we “intuitively” knew to be dangerous. Competing ideas are “dumb”, “conspiracy”, “evil”, “selfish”, “shortsighted”, “irrational”. Shamans and priests have made a great living out of this model of creating certainty out of their behinds for thousands of years.
Today, they have access to globe-spanning platforms to transport their certitude to you between breaks filled with ads for sponsors, which are certainly not biasing their views.
It is a tired trope, but Gell-Mann amnesia is a real phenomenon. Unfortunately, we are not knowledgeable enough to call journalists and other peddlers of false certainty on their BS on more than a handful of topics. They often substitute their own moral ideas and their own preferences for “objective” or “majority” views.
How often do you get a really balanced view on a topic? You will probably not even be able to tell, because you are just not sophisticated enough to detect it on a lot of topics. Neither am I. Hardly anyone is. If anyone at all.
I have no idea what is going on inside Iran: no idea about its cultural traditions, how they relate to the Islamic Revolution, how much of the violence is driven by the internal security forces, or how much of it, if any, is being paid for by its regional or global adversaries, because I am not knowledgeable enough about the motives, the internal balance of power, the factions, and so on. I don’t know who blew up centrifuges or drone factories, or for what reasons.
Or take the current war in Ukraine. How much do I know about the actions and intentions of either side? Of the stuff I think I know, how much of that was spoon-fed to me by the propaganda arm of one of the countries? Which propaganda department is more reliable? It’s hard to tell from afar.
And afar is exactly where the journalists are who are “compiling” information to present to you as fact. In reality, neither they nor you know how many dead there are on the ground, or which side’s strategy is currently working out just as planned, because neither they nor you have a clue what either side’s plan looks like. We can’t even tell who blew up Nord Stream 2, if anyone at all.
You can guess, sure, but you have to face the truth that your view of the situation is based on your guess, which you made based on the reporting and storytelling of journalists, who are guessing their stories based on state propaganda and anonymous and/or activist accounts on social media. But that won’t keep any of us from having a firm moral opinion, will it?
And it gets darker still.
The really perverse thing is: if your adversary manages to “brainwash”, “propagandize”, “influence”, “inform”, or “educate” their troops and people into a coherent force, while “your” side is in doubt, “on the fence”, “discussing”, it’s very likely that your side will lose.
This asymmetry between what is optimal for individuals and what is optimal for groups is a real and often problematic tension.2 Too often, the individual gets crushed by a “group”. The dynamics of groups, and how beliefs, even those easily proven false, become a badge of honor and a sign of group membership, are quite interesting.3 And more than a bit depressing.
What do you want to do now? What do you do when “experts” don’t know the answer, data doesn’t help you (yet?), and you don’t know the answer either?
Living
You cannot not act.
But what do you want to do? You start to “f” around, like everyone else, like all the experts, like all the researchers, like all the entrepreneurs.
Maybe, eventually, someone will stumble upon something that works. With a lot of luck and access to markets, you might be able to buy their solution for outrageously high prices. But at least you don’t have to find a working solution yourself. Our best bet seems to be that kind of freedom.
We let people choose for themselves according to their own risk profiles. We challenge and criticize people with different views to get to a better place. That freedom of speech is something that helps us stay sane collectively.
But when these “locally optimal” decisions conflict, it seems we have to find a mode of choosing a path. Some societies opt for a single person to force a solution on all; in the best case that person works like an unbiased coin flip, in the worst case they are a vindictive tyrant more concerned with punishing enemies than with solving problems. Other societies have opted for representative democracy and try to federate (decision) power broadly. We call a social structure where decisions are made as close to the people as possible a democratic republic. (It’s probably not hard to tell that I think this form of government is a great invention.) Is it perfect and free of conflict? Obviously not. As irritating as that might be, Churchill might have had it right:
“Many forms of Government have been tried, and will be tried in this world of sin and woe. No one pretends that democracy is perfect or all-wise. Indeed it has been said that democracy is the worst form of Government except for all those other forms that have been tried from time to time.…” - Winston S. Churchill, 11 November 1947
Religion
A great problem remains and seems to be getting ever more challenging: what do we do when our fundamental beliefs differ? What if your God is Gaia? Or the Russian master race? Or the Prophet? Or Vishnu? Or some cult leader? What if all your thinking and action is oriented towards that, but mine is not? Then we will have religious wars.
Challenging views will cease to be an important corrective to get us to something better and become heresy that has to be canceled. And the heretics have to be ostracized.
For a short period of time, a phase we now call the Enlightenment and the Industrial Revolution, religion served as a Schelling point: everybody in the Western world knew roughly what everybody else there was looking for, at least in the moral realm.4 Without such a unifying belief (or direction), cataclysmic strife seems to lie ahead, and the terrors of the 20th century, the two World Wars, the Holocaust, the Great Leap Forward, all of that might be just a prelude. We certainly have far more horrible weapons at our disposal now: nuclear weapons in the megaton range, viruses, chemical weapons, drones, long-range artillery, cyber warfare.
What is the alternative to religion? Maybe full-blown nihilism?
Maybe all the people manipulating their way through life are right. They act in their own reality, improve what they can improve locally (for themselves). They try to bully, pressure, bribe, coax, coerce people to do their bidding. And they probably think they are right to do so.
Maybe that is the best we can do while also preventing cataclysm? Maybe all you can do is formulate a positive vision of the future? Or find something that people are afraid of and abuse them by scaring the s* out of them until they do “the right thing”? That seems to be what most activists and political messages are doing today, that seems to be where “internet celebrities” are making their fortunes, and that is where companies are making windfall profits.
Welcome to 2023.