Interacting with copies of myself

From Issawiki

Latest revision as of 20:40, 12 April 2021
(this is really crazy; don't read this)

how should i interact with other copies of myself? sometimes there are people who think so similarly to you, who are value-aligned, etc., that you sort of want to treat them as different versions of who-you-could-have-been rather than different people. i.e. any differences they have with you are things you can chalk up to differences in circumstances, luck, etc. rather than any sort of fundamental personal/temperamental difference. (even personalities can evolve differently depending on life circumstances, so maybe you can decide to "split" the copies when they turn 10 years old or something. i.e. you determine the age at which you developed your unique personality, then take all people-like-that-person and treat them as copies of you-at-that-age. or maybe you think hard and try to figure out the kind of person you would have been if you had grown up in a different society, e.g. find the russian issa.)

what if some of them become rich? what if some become influential/high status? what if some become poor, mentally ill, etc. etc.? how would different copies of yourself interact with each other? another thing you'd want to do is to effectively locate your other copies, by calculating Schelling points and so forth. (the fact that i haven't found people similar to myself is evidence that such people don't exist, or that the copies that split off from 10-year-old-me turned out so poorly that they couldn't even find LW/EA/the idea of a Schelling point. this is my "mini Fermi paradox") some potential problems:

  • you can't even find your copies due to poor coordination (e.g. maybe your copies can't even find LW/EA/the idea of a Schelling point)
  • you are so unique that there aren't people who are even close to you in thinking style etc so you have nobody to coordinate with
  • the other versions of you are "updateful" in the sense that once they see where they've ended up, they feel no urge to cooperate with the other copies. e.g. the rich version of you might want to keep all of the wealth to himself.
  • the poor/low status versions of "you" might tend to have an overly lax/loose way of identifying other copies, in order to capture more of the resources that have been obtained by the rich/high status versions. this is sort of the reverse problem of the previous point. an extreme version of this: the thought of coordinating with other copies only even occurs to the low status versions. (hey, did i just insult myself :P?)
  • maybe it's not very efficient to try to coordinate with people similar to you. maybe you should just broadly aim to coordinate with anyone who seems aligned with you in relevant ways (e.g. has read some similar books, knows about EA, and so forth). this feels kind of disappointing to me, and seems like a "i don't want to bother thinking about this, so it's probably not important" kind of response.
  • one of the big problems seems to be the differing level of smartness between your copies. a smarter copy more reliably locates other copies. it also more reliably gains more resources (or maybe not! if you look at me you'll see that smartness doesn't necessarily imply greater resources). it also more reliably even thinks about decision theory for long enough to stumble across weird ideas like this one...

note1: this is going to sound really crazy etc etc. i hope you will go along with it, if only as an interesting thought experiment.

note2: yes i am aware that wei dai does not endorse using UDT in everyday life situations. again, please just go along with this because i find the idea fascinating!

i think this kind of thinking is not completely unprecedented, e.g.

  • someone might want to offer scholarships to people with similar life circumstances/ethnicity/whatever.
  • people in general seem more empathetic to other people who are suffering problems they themselves went through, like a specific disease, or school-induced suffering, or whatever.
  • rawlsian veil of ignorance: "This is interesting because it gets us most of the way to Rawls’ veil of ignorance. We imagine a poor person coming up to a rich person and saying “God decided which of us should be rich and which of us should be poor. Before that happened, I resolved that if I were rich and you were poor, I would give you charity if and only if I predicted, in the opposite situation, that you would give me charity. Well, turns out you’re rich and I’m poor and the other situation is counterfactual, but will you give me money anyway?”" https://slatestarcodex.com/2018/04/01/the-hour-i-first-believed/

i want to think more about related ideas like, if i were the kind of organism that had evolved in an environment where minds were regularly copied and merged and so forth, what kinds of intuitions would i have wrt this kind of question?