EA - Odds of recovering values after collapse? by Will Aldred
The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Odds of recovering values after collapse?, published by Will Aldred on July 24, 2022 on The Effective Altruism Forum.

(This question is inspired by conversations with Haydn Belfield and Hannah Erlebach, though I'm not certain both would endorse the full version of my question.)

Question

Let's say we roll the dice 100 times with respect to values. In other words, let's say civilization collapses in 100 worlds, each very similar to our current world, and let's say full tech recovery follows collapse in all 100 of these worlds. In how many of these 100 worlds do you think that, relative to pre-collapse humanity, the post-recovery version of humanity has:

- worse values?
- similar values?
- better values?

I encourage the reader to try answering the question before looking at the comments section, so as not to become anchored.

Context

Components of recovery

It seems to me that there are two broad components to recovery following civilizational collapse:

- P(Tech Recovery|Collapse), i.e., the probability of tech recovery given collapse, where I define "tech recovery" as scientific, technological, and economic recovery.
- P(Values Recovery|Tech Recovery), i.e., the probability of values recovery given tech recovery, where I define "values recovery" as recovery of political systems and values systems (where "good" on the values axis would be things like democracy, individualism, equality, and secularism, and "bad" would be things like totalitarianism).

It also seems to me that P(Tech Recovery|Collapse) ≈ 1, which is why the question I've asked is essentially "P(Values Recovery|Tech Recovery) = ?", just in a little more detail.
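For readers who want that reasoning spelled out, here is a minimal sketch of the implicit decomposition. The notation (C for collapse, T for tech recovery, V for values recovery) is mine, and I assume, as the post's framing suggests, that values recovery presupposes tech recovery:

```latex
% Sketch of the decomposition, assuming values recovery
% presupposes tech recovery, so P(V | not T, C) = 0 and the
% second branch of the law of total probability drops out.
\begin{align*}
P(V \mid C)
  &= P(T \mid C)\,P(V \mid T, C) + P(\lnot T \mid C)\,P(V \mid \lnot T, C) \\
  &= P(T \mid C)\,P(V \mid T, C)
     && \text{(values recovery requires tech recovery)} \\
  &\approx P(V \mid T, C)
     && \text{(since } P(T \mid C) \approx 1\text{)}
\end{align*}
```

Under these assumptions, estimating P(Values Recovery|Collapse) reduces to estimating P(Values Recovery|Tech Recovery), which is exactly the quantity the 100-worlds question elicits.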
Existing discussion

I ask this question about values recovery because there's less discussion of it than I would expect. Toby Ord, in The Precipice, mentions values only briefly, in his "Dystopian Scenarios" section:

A second kind of unrecoverable dystopia is a stable civilization that is desired by few (if any) people. [...] Well-known examples include market forces creating a race to the bottom, Malthusian population dynamics pushing down the average quality of life, or evolution optimizing us toward the spreading of our genes, regardless of the effects on what we value. These are all dynamics that push humanity toward a new equilibrium, where these forces are finally in balance. But there is no guarantee this equilibrium will be good. (p. 152)

The third possibility is the “desired dystopia.” [...] Some plausible examples include: [...] worlds that forever fail to recognize some key form of harm or injustice (and thus perpetuate it blindly), worlds that lock in a single fundamentalist religion, and worlds where we deliberately replace ourselves with something that we didn’t realize was much less valuable (such as machines incapable of feeling). (pp. 153-154)

Luisa Rodriguez, who has produced arguably the best work on civilizational collapse (see "What is the likelihood that civilizational collapse would directly lead to human extinction (within decades)?"), also only very briefly touches on values:

Values is the other one. Yeah. Making sure that if we do last for a really long time, we don’t do so with really horrible values or that we at least don’t miss out on some amazing ones. (Rodriguez, Wiblin & Harris, 2021, 2:55:00-2:55:10)

Nick Beckstead and Michael Aird come the closest, as far as I've seen, to pointing to the question of values recovery. Beckstead (2015):

Negative cultural trajectory: It seems possible that just as some societies reinforce openness, toleration, and equality, other societies might reinforce alternative sets of values. [...] Especially if culture continues to become increasingly global, it may become easier for one kind of culture to dominate the world. A culture opposed to open society values, or otherwise problematic for utilitarian-type values,...
