Cognitive biases that are opposites of each other

The fundamental attribution error and the typical mind fallacy are basically opposites. So if you exhibit one bias rather than the other, that just means your priors are slightly off, which isn't a huge problem in my opinion. It's similar to what Critch said about the bystander effect and the unilateralist's curse being opposites.

I guess this fits into that SSC post about equal and opposite advice, and the observation that proverbs often have inverses.

I wonder what fraction of cognitive biases are like this: there is some quantity X with a "just right" amount, and there are two biases for "too little X" and "too much X". While the callout that the amount of X is at a suboptimal level lands, it's still not a satisfying bias, because some people have too little, some have too much, and it's hard for bounded reasoners to hit the right amount. It would be as if there were a perfect height and we called the short people too short and the tall people too tall: yes, that's true, but calling human height "biased" seems wrong; there's just a distribution of abilities.

Halo effect vs positive manifold? This seems like another case where it's called a "bias" if it falls either above or below some optimal threshold, but it's unrealistic to hit that optimal level exactly.

Confirmation bias vs the "probability vs likelihoods" thing, where people confuse the posterior with the likelihood and then change their views too much? I feel like this might be another instance where you have two extremes and the ideal is in the middle.
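A minimal sketch of that confusion, with made-up diagnostic-test numbers (the sensitivity, false positive rate, and base rate below are all assumptions for illustration): reading the likelihood P(positive | disease) as if it were the posterior P(disease | positive) massively over-updates when the base rate is low.

<syntaxhighlight lang="python">
# Made-up numbers for illustration only.
sensitivity = 0.9   # P(positive | disease), a likelihood
false_pos = 0.09    # P(positive | no disease), also assumed
base_rate = 0.01    # P(disease), the prior

# The mistake: treating the likelihood as if it were the posterior.
naive = sensitivity  # "the test is 90% accurate, so I'm 90% sure"

# Bayes' rule: P(disease | positive) = P(pos | disease) P(disease) / P(pos)
p_pos = sensitivity * base_rate + false_pos * (1 - base_rate)
posterior = sensitivity * base_rate / p_pos

print(naive, round(posterior, 3))  # 0.9 vs ~0.092
</syntaxhighlight>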

See also https://en.wikipedia.org/wiki/Conservatism_(belief_revision), which actually gives an even better example! I should post this example to Abram's probabilities vs likelihoods post, but I should work out the math in the red/blue chips example first.
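Working out that math: the sketch below assumes the standard bookbag-and-poker-chips setup usually cited for conservatism (two bags, one 70% red and one 30% red, equal priors, 12 draws with replacement yielding 8 red and 4 blue); I haven't verified that these exactly match the numbers in the Wikipedia article.

<syntaxhighlight lang="python">
from math import comb

# Bag A is 70% red chips, Bag B is 30% red; each bag equally likely a priori.
# We draw 12 chips with replacement and observe 8 red, 4 blue.
p_red_A, p_red_B = 0.7, 0.3
reds, blues = 8, 4

# Binomial likelihood of the observed draw under each bag (the comb()
# factor cancels in the posterior, but is included for completeness).
lik_A = comb(reds + blues, reds) * p_red_A**reds * (1 - p_red_A)**blues
lik_B = comb(reds + blues, reds) * p_red_B**reds * (1 - p_red_B)**blues

# Posterior on Bag A, with equal priors.
posterior_A = lik_A / (lik_A + lik_B)
print(round(posterior_A, 3))  # ~0.967, while subjects typically report ~0.7
</syntaxhighlight>

So conservatism is under-updating, i.e. the opposite failure from the over-updating one above.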

The law of the instrument is just overlearning, right? It seems like a _good_ thing! "I call it the law of the instrument, and it may be formulated as follows: Give a small boy a hammer, and he will find that everything he encounters needs pounding."

https://en.wikipedia.org/wiki/Escalation_of_commitment -- this and the sunk cost fallacy are another case where some people give up too early and other people never give up at all. Sure, you can call both of these biases, but it's more that the right balance is a single point along some axis, and you're almost always going to land off it.

"false consensus effect" too -- this is just the opposite of SSC's "bravery debates"...

"satisfaction of search" too -- this is the opposite of the thing with the colliders making the other less likely. like if you find out sprinklers are on, it makes it less likely that it was raining, or whatever the example was.

[[Critch]]'s bystander effect vs unilateralist's curse is another example of opposite biases.

But then there are biases like Dunning-Kruger, which give _differential diagnoses_ depending on some attribute, like skill level. I think these ones _are_ useful to point out. More generally, a result like "in situation X, people have bias A, and in situation Y, people have bias B, even though A and B are opposites" would be useful; it's the blanket "people have this bias" claims that feel unfair to humans.

[[Category:Rationality]]