Excellent study material for all civil services aspirants - begin learning - We will do it and show!
CRUCIAL CONCEPTS OF LIFE (1/4)
- Causal Reductionism: Things rarely happen for just one reason. Mostly, outcomes result from many causes coming together in unexpected or unimagined ways. But the human mind cannot process such a complex arrangement, so it tends to ascribe outcomes to a single cause, reducing the web of causality to a mere thread.
- Ergodicity: A die rolled 100 times yields the same probability distribution as 100 dice each rolled once; rolling a die is “ergodic”. But if the die gets chipped after 10 throws so that it’s likelier to roll a 4, then one die rolled 100 times is no longer equivalent to 100 dice rolled once (non-ergodic). Many of us treat non-ergodic systems as if they were ergodic.
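The die example can be checked with expected values rather than random rolls. A minimal Python sketch, assuming a hypothetical “chipped” die that rolls a 4 half the time:

```python
# Fair die: each face 1-6 with probability 1/6.
fair = {face: 1 / 6 for face in range(1, 7)}
# Hypothetical chipped die: rolls a 4 half the time, other faces equally likely.
chipped = {4: 0.5, **{face: 0.1 for face in (1, 2, 3, 5, 6)}}

def expected(die):
    """Expected value of a single roll of the given die."""
    return sum(face * p for face, p in die.items())

# Ensemble average: 100 fresh dice, one roll each -> every roll is fair.
ensemble_avg = expected(fair)  # 3.5
# Time average: one die rolled 100 times, chipping after the 10th roll.
time_avg = (10 * expected(fair) + 90 * expected(chipped)) / 100  # 3.68
# ensemble_avg != time_avg: the chipped-die process is non-ergodic.
```

For a truly fair die the two averages coincide; once the die can change mid-sequence, the one-die-over-time average drifts away from the many-dice-at-once average.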
- Dunning-Kruger Effect: Awareness of the limitations of cognition (thinking) requires a proficiency in metacognition (thinking about thinking). In other words, being stupid makes a person too stupid to realize how stupid he or she may be!
- Emergence: When many simple objects interact with each other, they can form a system with qualities that the objects themselves don’t have. Examples: neurons in our brains creating consciousness, individual traders creating the stock market, simple mathematical rules creating “living” patterns (as in Conway’s Game of Life).
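The “simple maths rules” example is best known from Conway’s Game of Life, where two rules about neighbouring cells produce patterns that move and oscillate. A minimal sketch (a live cell survives with 2 or 3 live neighbours; an empty cell comes alive with exactly 3):

```python
from collections import Counter

def step(live):
    """One generation of Conway's Game of Life; `live` is a set of (x, y) cells."""
    # Count how many live neighbours every nearby cell has.
    neighbours = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in neighbours.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker": three cells in a column oscillate between vertical and horizontal,
# a "behaviour" nowhere stated in the rules themselves.
blinker = {(1, 0), (1, 1), (1, 2)}
```

Nothing in the two rules mentions oscillation, yet the blinker flips back and forth forever: the behaviour emerges from the interaction.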
- Cultural Parasitism: An ideology parasitizes the mind, changing its hosts’ behavior so that they spread it to other people. A successful ideology (the only kind we hear about) is therefore not configured to be true; it is configured only to be easily transmitted and easily believed. Infectivity becomes more important than truth.
- Cumulative Error: Mistakes grow. Beliefs are built on beliefs, so one wrong thought can snowball into a delusional worldview. Likewise, as an inaccuracy is reposted on the web, more is added to it, creating fake news. In the networked age of the 21st century, cumulative errors are a regular feature of society.
- Survivorship Bias: We overemphasize the examples that pass a visibility threshold, e.g. our understanding of serial killers is based on the ones who got caught. Equally, news is only news if it’s the exception rather than the rule; but since it’s what we see, we treat it as the rule.
- Simpson’s Paradox: A trend can appear in separate groups of data but disappear or reverse when the groups are combined. This effect can easily be exploited by limiting a dataset so that it shows exactly what one wants it to show. Thus, beware of even the strongest correlations.
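The reversal can be seen in the textbook kidney-stone treatment data, where one treatment wins within every subgroup yet loses overall because the groups are of very different sizes:

```python
# (successes, patients) for each treatment within each stone-size group,
# from the well-known kidney-stone example.
data = {
    "small stones": {"A": (81, 87),   "B": (234, 270)},
    "large stones": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, total):
    """Success rate as a fraction."""
    return successes / total

# Within EACH group, treatment A has the higher success rate...
for group, arms in data.items():
    assert rate(*arms["A"]) > rate(*arms["B"])

# ...yet pooling the groups reverses the trend:
overall = {
    arm: (sum(data[g][arm][0] for g in data),
          sum(data[g][arm][1] for g in data))
    for arm in ("A", "B")
}
assert rate(*overall["A"]) < rate(*overall["B"])  # B now looks better
```

The trick is that A was given mostly the hard (large-stone) cases and B mostly the easy ones, so the pooled numbers compare unlike with unlike.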
- Limited Hangout: A common tactic by journalists & politicians of revealing intriguing but relatively innocent info to satisfy curiosity and prevent discovery of more incriminating info. E.g. a politician accused of murder may confess to having stolen documents from a government office.
CRUCIAL CONCEPTS OF LIFE (2/4)
- Focusing Illusion: Nothing is ever as important as what you’re thinking about while you’re thinking about it. Example: worrying about a thing makes it seem worse than it actually is. Thus suffering exists more in imagination than in reality.
- Concept Creep: As a social issue such as racism or sexual harassment becomes rarer, people react by expanding their definition of it, creating the illusion that the issue is getting worse when it may actually be becoming less severe over time. Harvard psychologist Steven Pinker has noted a tendency for people to become selectively blind to social progress, particularly in Western nations.
- Streetlight Effect: People tend to get their information from wherever it’s easiest to look. Example: much research draws only on the sources that appear on the first page of Google search results, regardless of how reliable they are. Over a long period, this can skew an entire field and make its results wrong!
- Belief Bias: Arguments we would normally reject for being idiotic suddenly seem perfectly logical if they lead to conclusions we approve of. In other words, we judge an argument’s strength not by how strongly it supports the conclusion but by how strongly we support the conclusion.
- Pluralistic Ignorance: Phenomenon where a group goes along with a norm, even though all of the group members secretly hate it, because each mistakenly believes that the others approve of it.
- The Petrie Multiplier: In fields where men outnumber women by a ratio r, such as STEM, each woman receives roughly r² times as much cross-gender harassment as each man, even if both genders contain the same proportion of harassers, simply because there are more potential senders and fewer potential receivers.
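The arithmetic behind the Petrie Multiplier can be sketched directly. The numbers below are hypothetical: a field of 40 men and 10 women where the same small fraction of each gender directs remarks at random members of the other gender:

```python
men, women = 40, 10        # men outnumber women 4:1
harasser_fraction = 0.05   # same fraction of harassers in BOTH genders
remarks_each = 10          # remarks made per harasser

# Remarks aimed at women all come from male harassers, and vice versa.
remarks_per_woman = harasser_fraction * men * remarks_each / women    # 2.0
remarks_per_man = harasser_fraction * women * remarks_each / men      # 0.125

ratio = remarks_per_woman / remarks_per_man  # (40/10)**2 == 16
```

The per-person imbalance scales with the *square* of the gender ratio: a 4:1 field means each woman gets 16 times the harassment each man does, with no difference at all in how badly the two genders behave.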
- Woozle Effect: An article makes a claim without evidence, is then cited by another, which is cited by another, and so on, until the chain of citations creates the impression that the claim has evidence, when really all the articles are citing the same uncorroborated source. This is how much of the fake-news industry operates.
- Tocqueville Paradox: As living standards in a society rise, people’s expectations of the society rise with them. The rise in expectations eventually outpaces the rise in living standards, inevitably resulting in disaffection (and sometimes populist uprisings). A classic example is that of Chinese industrial workers.
- Ultimate Attribution Error: We tend to attribute good acts by our allies to their character, and bad acts by our allies to situational factors. For opponents, we reverse this: good acts are attributed to situational factors, and bad acts to character.
CRUCIAL CONCEPTS OF LIFE (3/4)
- Golden Hammer: “If all you have is a hammer, everything looks like a nail.” When someone, usually an intellectual who has gained a cultish following for popularizing a concept, becomes so drunk with power that he thinks he can apply that concept to everything. A classic example is that of Nassim Taleb.
- Pareto Principle: It is a pattern of nature in which ~80% of effects result from ~20% of causes. Example: 80% of wealth is held by 20% of people, 80% of computer errors result from 20% of bugs, 80% of crimes are committed by 20% of criminals, 80% of box office revenue comes from 20% of films and so on.
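The 80/20 split can be checked on a toy population. A sketch with hypothetical numbers chosen to land exactly on the ratio:

```python
# Hypothetical population of 100 people: 80 hold 1 unit of wealth each,
# 20 hold 16 units each (a deliberately skewed distribution).
wealth = sorted([1] * 80 + [16] * 20, reverse=True)

top_fifth = wealth[: len(wealth) // 5]   # the richest 20% of people
share = sum(top_fifth) / sum(wealth)     # fraction of total wealth they hold
# share == 0.8: the top 20% hold 80% of the wealth.
```

Real distributions rarely hit 80/20 exactly; the principle is the general pattern that a small minority of causes accounts for the large majority of effects.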
- Nirvana Fallacy: When people reject a thing because it compares unfavorably to an ideal that in reality is unattainable. E.g. Condemning capitalism due to the superiority of imagined socialism, condemning ruthlessness in war due to imagining humane (but unrealistic) ways to win.
- Emotive Conjugation: Synonyms can yield positive or negative impressions without changing the basic meaning of a word. Example: Someone who is obstinate (neutral term) can be “headstrong” (positive) or “pig-headed” (negative). This is the basis for much bias in journalism.
- Enantiodromia: An excess of something can give rise to its opposite. Example: A society that is too liberal will be tolerant of tyrants, who will eventually make it illiberal. Carl Jung defined enantiodromia as "the emergence of the unconscious opposite in the course of time”.
- Halo Effect: When a person sees an agreeable characteristic in something or someone, they assume other agreeable characteristics. Example: If a Modi supporter sees someone’s Twitter profile with Modi-favouring tweets, he’s likely to think that person is also nationalistic, pro-Hindu, pro-business etc.
- Outgroup Homogeneity Effect: We tend to view members of an outgroup as all the same, while recognizing rich variety within our own group, e.g. believing that all Trump supporters (or all of their opponents) share the same traits and motivations.
- Matthew Principle: Advantage begets advantage, leading to social, economic, and cultural oligopolies. The richer you are, the easier it is to get even richer; the more recognition a scientist receives for a discovery, the more recognition he’ll receive for future discoveries; and so on. This compounding can be unfair, and can entrench inequality!
- Peter Principle: People in a hierarchy such as a business or government will be promoted until they suck at their jobs, at which point they will remain where they are. As a result, the world is filled with people who suck at their jobs (i.e. are inefficient).
- Loki’s Wager: Fallacy where someone tries to defend a concept from criticism, or dismiss it as a myth, by unduly claiming it cannot be defined. Examples: “God works in mysterious ways” (god of the gaps), “race is biologically meaningless” (Lewontin’s fallacy).
- Subselves: We use different mental processes in different situations, so each of us is not a single character but a collection of different characters, who take turns to commandeer the body depending on the situation. There is an office “you”, a lover “you”, an online “you”, etc.
- Goodhart’s Law: When a measure becomes a target, it ceases to be a good measure. Example: British colonialists tried to control snakes in India. They measured progress by the number of snakes killed, offering money for snake corpses. People responded by breeding snakes & killing them.
CRUCIAL CONCEPTS OF LIFE (4/4)
- Predictive Coding: There is no actual movement on a TV screen, but the human brain invents it. There are no actual spaces between spoken words, but the brain inserts them. Human perception is like predictive text, replacing the unknown with the expected. Predictive Coding leads to Apophenia.
- Apophenia: We impose our imaginations on arrangements of data, seeing patterns where no such patterns exist. A common form of Apophenia is the Narrative Fallacy. Evolution has shaped our brains to do this (to ease survival).
- Narrative Fallacy: When we see a sequence of facts, we interpret them as a story by threading them together into an imagined chain of cause & effect. If a drug addict commits suicide, we assume the drug habit led to the suicide, even if it didn’t. Much mass media, fake news, and propaganda is built around this human weakness.
- Pareidolia: For lakhs of years, predators stalked humans in undergrowth & shadow. In such times survival favoured the paranoid, those who could discern a wolf from the vaguest of outlines. This paranoia preserved the human species, but cursed it with pareidolia, so we now see wolves even in the skies. It is also a form of Apophenia.
- Reactance Theory: When someone is restricted from expressing a point of view (POV), or pressured to adopt a different POV, they usually react by believing their original POV even more. So a neo-Nazi or a fascist should not be countered with facts, but with emotions, if one is to convert them.
- Availability Cascade: When a new concept enters the arena of ideas, people react to it, thereby amplifying it. The idea thus becomes more popular, causing even more people to amplify it by reacting to it, until everyone feels the need to talk about it.
- Shifting Baseline Syndrome: Frog says to Fish, “how’s the water?” Fish replies, “what’s water?” We become blind to what we’re familiar with. And since the world is always changing, and we're always getting used to it, we can even become blind to the slow march of catastrophe. Climate change is the best example!
- Legibility: Humans see a complex natural system, assume that because it looks messy it must be disordered, and then impose their own order on it to make it “legible”. But in removing the messiness we remove essential components of the system that we couldn’t grasp, and it fails.
- Radical Phase Transition: Extremist movements can behave like solids (tyrannies), liquids (insurgencies), and gases (conspiracy theories). Pressuring them causes them to go from solid => liquid => gas. Leaving them alone causes them to go from gas => liquid => solid.
- Occam's razor: It is the law of parsimony. It's a problem-solving principle that "entities should not be multiplied without necessity" or that the simplest explanation is usually the right one (for a given problem). Occam's Razor is not a proof but just a useful guideline. The term "razor" refers to the "shaving away" of unnecessary assumptions when distinguishing between two theories.
* Content sourced from free internet sources (publications, PIB site, international sites, etc.). Take your own subscriptions.
Copyrights acknowledged.