Three Kinds of Concluding: Logic, Intuition, Authority

From Rasmapedia

Bryan Caplan

"Being Normal," Bryan Caplan, blog (2021).

The Principle of Normality: A normal person says what others say, but does what others do.

Notice that this principle captures two distinct features of normality.

First, conformism. People dislike expressing views or taking actions unless other people express the same views and take the same actions.

Second, the chasm between words and actions. Normal people lack integrity. They feel little need to bring their actions in harmony with their words – or their words in harmony with their actions.

Example: A normal person will say, “We should do everything possible to fight global warming” – yet donate zero to environmental charities. How can they cope with the cognitive dissonance? Because this psychological experience is alien to them. They speak environmentalist words to echo the environmentalist words they hear other people say. They donate zero to environmental charities to mimic what they see other people do. What is this “dissonance” of which you speak, weird one?

For normal people, Social Desirability Bias is far more than a bias; it is their way of life.

His example can be improved. The problem with it is that nobody knows whether you donate or not, and you don't know whether other people are donating. Here's a better example, though a narrower one:

Example: A normal economist will say, “Economics articles should be short, use only the necessary math, be well written, and address interesting topics rather than just following what the literature is doing” – yet when sent such articles to referee, they will reject them, and reject them with scorn, saying the articles are too short, ignore special cases or robustness checks, read like they were written by a child, and are on a topic nobody else is writing on. How can they cope with the cognitive dissonance? Because this psychological experience is alien to them. They speak methodological words to echo the methodological words they hear other people say. They reject articles with that methodology to mimic what they see other people do. What is this “dissonance” of which you speak, weird one?

This, of course, is inspired by personal frustration. See “Why Firms Reduce Business Risk Revisited: Projects with Risky Cash Flows Are Harder To Evaluate," which I've given up on publishing.

Arnold Kling

"Why we need a new scoring system," Arnold Kling, blog (2021).

"1. We learn socially, so that most of our beliefs come from other people.

2. This makes the problem of choosing which people to trust the central problem in epistemology.

3. What Eric Weinstein calls our “sense-making apparatus” can be thought of as a set of prestige hierarchies, at the top of which are the people who are most widely trusted.

4. Our prestige hierarchies are based largely on credentials: professor at Harvard; writer for the New York Times; public health official.

5. The incentive systems and selection mechanisms in the credential-based hierarchies have become corrupted over time, allowing people to rise to the top who lack wisdom and intellectual rigor.

I think of electrons in an atom as occupying orbits relative to a nucleus. I have never observed this. I have never done any experiments that would verify this. I believe it because that is what I was taught fifty years ago by my high school chemistry teacher, Dr. Frank Quiring. I have not kept up with chemistry or physics since then.

In The Secret of Our Success, Joseph Henrich drives home the point that almost all of the knowledge that we possess comes from culture rather than from personal experience. ...

Philosophers typically view the problem of knowledge, or epistemology, as one of aligning the beliefs in your mind with the “reality out there.” But because our beliefs about reality come from other people, I think that the choice of which people to trust is the core issue in epistemology. I have made this point to academic philosophers, and they blow me off, insisting that the issue of aligning beliefs to reality is the nub of the problem. Relying on “testimony” (other people’s beliefs) is just one method for trying to solve it. I think that they would say that I choose a person to believe based on how well I think that person’s beliefs align with “reality out there.” But I would counter that I choose who to believe first, and then I choose what to believe. ...

Henrich points out that humans have two types of hierarchies. In a dominance hierarchy, the people at the top gain authority by force, and the people at the bottom reluctantly obey. In a prestige hierarchy, the people at the top gain authority by earning respect, and the people at the bottom willingly try to copy and learn from those at the top. Prestige hierarchies work through competitive mechanisms. ...

The value of competition in choosing the people to trust was driven home to me years ago by David Brin, in his essay on Disputation Arenas. In that essay, he offered a proposal for structured competition on the Internet to improve what I call social epistemology. Better ideas would win.

...

The process of getting ahead in a prestige hierarchy is analogous to the process of earning a bonus in a firm’s compensation system. If the bonus criteria align with the firm’s goals, people who do productive work will earn bonuses and the firm will be successful. If the bonus criteria are not well considered, workers who are not particularly productive will obtain bonuses, and the firm’s performance will suffer.

Bonus systems are like a game. The firm wants to get the most (useful) effort from its workers for the least compensation. Workers want to get the most compensation with the least effort.

My observation is that the longer a specific bonus system is in place, the better workers become at figuring out how to get more compensation for less effort. Incentive systems naturally degrade over time. Management has to revise the bonus systems every few years if the firm is to prosper.

Most incentive systems use a combination of formal measures (“metrics”) and informal judgment (“what your boss thinks”). Neither is perfect. Jerry Muller’s The Tyranny of Metrics describes how the formal approach often goes wrong. Informal judgment can be used as a corrective for imperfect metrics. But judgment also can introduce bias and cronyism.

Our prestige hierarchies of academia and legacy media rely heavily on credentials. Think of the process of obtaining tenure as a professor or the process of obtaining a prestigious position for a newspaper or TV network. Such credentials are awarded on the basis of judgment by incumbents. They reward conformity rather than excellence. Why this has emerged as a problem now more than in the past is a question that I am still pondering for a subsequent essay.

In any case, popular trust in our sense-making institutions has fallen dramatically over the past 70 years. The relationship between elites and the public at large in 2021 is somewhere between troubled and dysfunctional.

Many elites cannot understand why people do not trust “the science,” government officials, leading academics, or the news as reported in legacy media. But more detached observers, such as Martin Gurri in The Revolt of the Public and Yuval Levin in A Time to Build, understand that elite misconduct contributes heavily to the problem."