
Black Marble


A Black Marble is a technology that, by default, destroys the civilization that invents it. It is one type of Existential Risk. AGI may be such an invention, but it is not the only candidate.

The name comes from a thought experiment by Nick Bostrom, in which he described invention as pulling marbles out of an urn. Most are white (beneficial), some are gray or red (dangerous or a mixed blessing), and some are black (fatal).
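
The urn model has a simple quantitative reading: if some fixed fraction of draws are black, a civilization that keeps drawing will eventually hit one with probability approaching 1, and the expected number of safe draws is finite (about 1/p). Below is a minimal Python sketch of this point; the 1-in-1,000 rate is an arbitrary illustration, not an estimate from Bostrom.

```python
import random

def draws_until_black(p_black: float, rng: random.Random) -> int:
    """Simulate drawing marbles until the first black one appears."""
    draws = 0
    while True:
        draws += 1
        if rng.random() < p_black:  # this draw came up black
            return draws

rng = random.Random(0)
p_black = 1 / 1000  # hypothetical black-marble rate; purely illustrative
samples = sorted(draws_until_black(p_black, rng) for _ in range(10_000))
print("median inventions before the first black marble:",
      samples[len(samples) // 2])
```

The takeaway of the model is that survival cannot come from luck on individual draws; it has to come from changing the game (stopping drawing, or changing what happens when a black marble is drawn).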

The experiment is also sometimes phrased in terms of a "black ball", including by Bostrom himself, but that term already means something else when used by itself.

As a hypothetical example, Bostrom asked what would happen to civilization if a weapon of mass destruction comparable to an atomic bomb were much easier to make, say on the level of cooking sand in a microwave. The natural answer is that once knowledge of the technique spreads, any random psychopath can build one, some do, and they soon bomb us back into the stone age. Civilization then can't rebuild past the point of microwaves without being destroyed again, as long as the knowledge persists (making black marbles an extreme type of information hazard).

But this scenario is path dependent. One could imagine a different civilization, with different capabilities, that could survive such knowledge: perhaps one with a world government (hence no wars) that screens for psychopathy, or a dystopian world panopticon that prevents use. Or consider a space-faring civilization that mostly lives in small independent orbital colonies, where everybody already has similarly destructive kinetic attack capabilities (and civilization is somehow surviving anyway), so the sand-nukes might not change much.

Take 6: CAIS is actually Orwellian.
Charlie Steiner, 7 Dec 2022 13:50 UTC
14 points · 8 comments · 2 min read

A Pin and a Balloon: Anthropic Fragility Increases Chances of Runaway Global Warming
avturchin, 11 Sep 2022 10:25 UTC
33 points · 23 comments · 52 min read

The Dumbest Possible Gets There First
Artaxerxes, 13 Aug 2022 10:20 UTC
44 points · 7 comments · 2 min read

Enlightenment Values in a Vulnerable World
Maxwell Tabarrok, 20 Jul 2022 19:52 UTC
15 points · 6 comments · 31 min read
(maximumprogress.substack.com)

My thoughts on nanotechnology strategy research as an EA cause area
Ben_Snodin, 2 May 2022 17:57 UTC
34 points · 0 comments · 42 min read

Nuclear Strategy in a Semi-Vulnerable World
Jackson Wagner, 27 Jun 2021 8:17 UTC
7 points · 1 comment · 17 min read

misc raw responses to a tract of Critical Rationalism
mako yass, 14 Aug 2020 11:53 UTC
21 points · 52 comments · 3 min read

Absent coordination, future technology will cause human extinction
Jeffrey Ladish, 3 Feb 2020 21:52 UTC
21 points · 12 comments · 5 min read

The Transparent Society: A radical transformation that we should probably undergo
mako yass, 3 Sep 2019 2:27 UTC
14 points · 25 comments · 8 min read

The Vulnerable World Hypothesis (by Bostrom)
Ben Pace, 6 Nov 2018 20:05 UTC
50 points · 17 comments · 4 min read
(nickbostrom.com)