Evolution

Last edit: 2 Oct 2020 18:31 UTC by Ruby

Evolution is “change in the heritable characteristics of biological populations over successive generations” (Wikipedia). For posts about machine learning, look here.

Related: Biology, Evolutionary Psychology

The sequence, The Simple Math of Evolution, provides a good introduction to LessWrong thinking about evolution.

Why be interested in evolution?

Firstly, evolution is a useful case study of humans’ ability (or inability) to model the real world. This is because it has a single clear criterion (“relative reproductive fitness”) which is selected (optimized) for:

“If we can’t see clearly the result of a single monotone optimization criterion—if we can’t even train ourselves to hear a single pure note—then how will we listen to an orchestra? How will we see that “Always be selfish” or “Always obey the government” are poor guiding principles for human beings to adopt—if we think that even optimizing genes for inclusive fitness will yield organisms which sacrifice reproductive opportunities in the name of social resource conservation?

To train ourselves to see clearly, we need simple practice cases”—Eliezer Yudkowsky, Fake Optimization Criteria

Secondly, much of rationality necessarily revolves around the human brain (for now). An understanding of how the brain came into being can be very helpful both for understanding ‘bugs’ in the system (like superstimuli) and for explaining the Complexity of Value, among other things.

A candy bar is a superstimulus: it contains more concentrated sugar, salt, and fat than anything that exists in the ancestral environment. A candy bar matches taste buds that evolved in a hunter-gatherer environment, but it matches those taste buds much more strongly than anything that actually existed in the hunter-gatherer environment. The signal that once reliably correlated to healthy food has been hijacked, blotted out with a point in tastespace that wasn’t in the training dataset—an impossibly distant outlier on the old ancestral graphs.
-- Eliezer Yudkowsky, Superstimuli and the Collapse of Western Civilization

Summaries of Sequence Posts on Evolution

The following are summaries of posts concerning evolution in Eliezer Yudkowsky’s sequences:

An Alien God
Eliezer Yudkowsky · 2 Nov 2007 6:57 UTC · 131 points · 148 comments · 8 min read · LW link

Problems in evolutionary psychology
Kaj_Sotala · 13 Aug 2010 18:57 UTC · 78 points · 102 comments · 8 min read · LW link

Adaptation-Executers, not Fitness-Maximizers
Eliezer Yudkowsky · 11 Nov 2007 6:39 UTC · 85 points · 32 comments · 3 min read · LW link

The Tragedy of Group Selectionism
Eliezer Yudkowsky · 7 Nov 2007 7:47 UTC · 80 points · 88 comments · 5 min read · LW link

Evolving to Extinction
Eliezer Yudkowsky · 16 Nov 2007 7:18 UTC · 85 points · 41 comments · 6 min read · LW link

Evolution of Modularity
johnswentworth · 14 Nov 2019 6:49 UTC · 123 points · 9 comments · 2 min read · LW link · 2 nominations · 1 review

Conjuring An Evolution To Serve You
Eliezer Yudkowsky · 19 Nov 2007 5:55 UTC · 56 points · 26 comments · 4 min read · LW link

Studies On Slack
Scott Alexander · 13 May 2020 5:00 UTC · 126 points · 27 comments · 24 min read · LW link · (slatestarcodex.com)

The innocent gene
Joe Carlsmith · 5 Apr 2021 3:31 UTC · 37 points · 1 comment · 9 min read · LW link

Evolutions Are Stupid (But Work Anyway)
Eliezer Yudkowsky · 3 Nov 2007 15:45 UTC · 68 points · 66 comments · 4 min read · LW link

Evolutionary Psychology
Eliezer Yudkowsky · 11 Nov 2007 20:41 UTC · 76 points · 40 comments · 5 min read · LW link

Beware of Stephen J. Gould
Eliezer Yudkowsky · 6 Nov 2007 5:22 UTC · 48 points · 76 comments · 6 min read · LW link

Superstimuli and the Collapse of Western Civilization
Eliezer Yudkowsky · 16 Mar 2007 18:10 UTC · 95 points · 88 comments · 4 min read · LW link

Coordination Problems in Evolution: Eigen’s Paradox
Martin Sustrik · 12 Oct 2018 12:40 UTC · 96 points · 6 comments · 8 min read · LW link · (250bpm.com)

Computer bugs and evolution
PhilGoetz · 26 Oct 2009 22:06 UTC · 55 points · 10 comments · 1 min read · LW link

No Evolutions for Corporations or Nanodevices
Eliezer Yudkowsky · 17 Nov 2007 2:24 UTC · 65 points · 32 comments · 6 min read · LW link

Coordination Problems in Evolution: The Rise of Eukaryotes
Martin Sustrik · 15 Oct 2018 6:18 UTC · 46 points · 8 comments · 8 min read · LW link

The Wonder of Evolution
Eliezer Yudkowsky · 2 Nov 2007 20:49 UTC · 61 points · 83 comments · 4 min read · LW link

Assortative Mating And Autism
Scott Alexander · 28 Jan 2020 18:20 UTC · 45 points · 2 comments · 4 min read · LW link · (slatestarcodex.com)

[link] Back to the trees
[deleted] · 4 Nov 2011 22:06 UTC · 121 points · 47 comments · 2 min read · LW link

The Octopus, the Dolphin and Us: a Great Filter tale
Stuart_Armstrong · 3 Sep 2014 21:37 UTC · 73 points · 236 comments · 3 min read · LW link

Your Evolved Intuitions
lukeprog · 5 May 2011 16:21 UTC · 20 points · 106 comments · 10 min read · LW link

Why would evolution favor more bad?
KatjaGrace · 6 Oct 2013 18:26 UTC · 1 point · 0 comments · 3 min read · LW link

The Psychological Diversity of Mankind
Kaj_Sotala · 9 May 2010 5:53 UTC · 122 points · 162 comments · 7 min read · LW link

The Psychological Unity of Humankind
Eliezer Yudkowsky · 24 Jun 2008 7:12 UTC · 42 points · 21 comments · 4 min read · LW link

Thou Art Godshatter
Eliezer Yudkowsky · 13 Nov 2007 19:38 UTC · 126 points · 75 comments · 5 min read · LW link

In Search of Slack
Martin Sustrik · 23 May 2020 11:20 UTC · 47 points · 3 comments · 6 min read · LW link · (250bpm.com)

Growing Up is Hard
Eliezer Yudkowsky · 4 Jan 2009 3:55 UTC · 38 points · 41 comments · 7 min read · LW link

Attention to snakes not fear of snakes: evolution encoding environmental knowledge in peripheral systems
Kaj_Sotala · 2 Oct 2020 11:50 UTC · 39 points · 1 comment · 3 min read · LW link · (kajsotala.fi)

The Darwin Game
lsusr · 9 Oct 2020 10:19 UTC · 88 points · 136 comments · 3 min read · LW link

Snyder-Beattie, Sandberg, Drexler & Bonsall (2020): The Timing of Evolutionary Transitions Suggests Intelligent Life Is Rare
Kaj_Sotala · 24 Nov 2020 10:36 UTC · 79 points · 15 comments · 2 min read · LW link · (www.liebertpub.com)

Against evolution as an analogy for how humans will create AGI
Steven Byrnes · 23 Mar 2021 12:29 UTC · 38 points · 25 comments · 25 min read · LW link

The Dark Miracle of Optics
Suspended Reason · 24 Jun 2020 3:09 UTC · 26 points · 5 comments · 8 min read · LW link

References & Resources for LessWrong
XiXiDu · 10 Oct 2010 14:54 UTC · 146 points · 106 comments · 20 min read · LW link

Hedonic asymmetries
paulfchristiano · 26 Jan 2020 2:10 UTC · 89 points · 22 comments · 2 min read · LW link · (sideways-view.com)

Reframing the evolutionary benefit of sex
paulfchristiano · 14 Sep 2019 17:00 UTC · 81 points · 15 comments · 2 min read · LW link · 1 nomination · (sideways-view.com)

Winning is for Losers
Jacob Falkovich · 11 Oct 2017 4:01 UTC · 29 points · 15 comments · 18 min read · LW link · (putanumonit.com)

Notes From an Apocalypse
Toggle · 22 Sep 2017 5:10 UTC · 56 points · 25 comments · 14 min read · LW link

You’re Entitled to Arguments, But Not (That Particular) Proof
Eliezer Yudkowsky · 15 Feb 2010 7:58 UTC · 71 points · 228 comments · 8 min read · LW link

You’re in Newcomb’s Box
HonoreDB · 5 Feb 2011 20:46 UTC · 59 points · 176 comments · 4 min read · LW link

Humans in Funny Suits
Eliezer Yudkowsky · 30 Jul 2008 23:54 UTC · 45 points · 130 comments · 7 min read · LW link

Anthropomorphic Optimism
Eliezer Yudkowsky · 4 Aug 2008 20:17 UTC · 55 points · 58 comments · 5 min read · LW link

Group selection update
PhilGoetz · 1 Nov 2010 16:51 UTC · 48 points · 66 comments · 5 min read · LW link

Protein Reinforcement and DNA Consequentialism
Eliezer Yudkowsky · 13 Nov 2007 1:34 UTC · 48 points · 20 comments · 4 min read · LW link

Three Fallacies of Teleology
Eliezer Yudkowsky · 25 Aug 2008 22:27 UTC · 36 points · 16 comments · 9 min read · LW link

[Question] Why do humans not have built-in neural i/o channels?
Richard_Ngo · 8 Aug 2019 13:09 UTC · 25 points · 24 comments · 1 min read · LW link

What strange and ancient things might we find beneath the ice?
Benquo · 15 Jan 2018 10:10 UTC · 15 points · 2 comments · 2 min read · LW link · (benjaminrosshoffman.com)

Is That Your True Rejection? by Eliezer Yudkowsky @ Cato Unbound
XiXiDu · 7 Sep 2011 18:27 UTC · 44 points · 83 comments · 1 min read · LW link

What is the group selection debate?
Academian · 2 Nov 2010 2:02 UTC · 37 points · 16 comments · 3 min read · LW link

Natural Selection’s Speed Limit and Complexity Bound
Eliezer Yudkowsky · 4 Nov 2007 16:54 UTC · 8 points · 105 comments · 5 min read · LW link

More Questions about Trees
digital_carver · 9 Oct 2020 8:35 UTC · 2 points · 5 comments · 1 min read · LW link · (bit-player.org)

Modularity and Buzzy
Kaj_Sotala · 4 Aug 2011 11:35 UTC · 33 points · 27 comments · 9 min read · LW link

Why mathematics works
Douglas_Reay · 8 Mar 2018 18:00 UTC · 7 points · 4 comments · 5 min read · LW link

A Failed Just-So Story
Eliezer Yudkowsky · 5 Jan 2008 6:35 UTC · 17 points · 49 comments · 2 min read · LW link

“Inner Alignment Failures” Which Are Actually Outer Alignment Failures
johnswentworth · 31 Oct 2020 20:18 UTC · 51 points · 38 comments · 5 min read · LW link

Observing Optimization
Eliezer Yudkowsky · 21 Nov 2008 5:39 UTC · 9 points · 28 comments · 6 min read · LW link

Building Something Smarter
Eliezer Yudkowsky · 2 Nov 2008 17:00 UTC · 19 points · 57 comments · 4 min read · LW link

Musings on Cumulative Cultural Evolution and AI
calebo · 7 Jul 2019 16:46 UTC · 19 points · 5 comments · 7 min read · LW link

An appeal for vitamin D supplementation as a prophylactic for coronaviruses and influenza and a simple evolutionary theory for why this is plausible.
Michael A · 22 Dec 2020 19:40 UTC · 11 points · 1 comment · 9 min read · LW link

Evolution and fitness vs self-awareness and memetics
FractalParrot · 29 Nov 2020 21:07 UTC · 3 points · 2 comments · 3 min read · LW link

Are we all misaligned?
Mateusz Mazurkiewicz · 3 Jan 2021 2:42 UTC · 10 points · 0 comments · 5 min read · LW link

On the nature of purpose
Nora_Ammann · 22 Jan 2021 8:30 UTC · 28 points · 15 comments · 9 min read · LW link

Evolutions Building Evolutions: Layers of Generate and Test
plex · 5 Feb 2021 18:21 UTC · 11 points · 1 comment · 6 min read · LW link

Idea selection
krbouchard · 1 Mar 2021 14:07 UTC · 1 point · 0 comments · 2 min read · LW link

Fisherian Runaway as a decision-theoretic problem
Bunthut · 20 Mar 2021 16:34 UTC · 10 points · 0 comments · 3 min read · LW link