<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.opensourceecology.org/index.php?action=history&amp;feed=atom&amp;title=If_Anyone_Builds_it_Everyone_Dies</id>
	<title>If Anyone Builds it Everyone Dies - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.opensourceecology.org/index.php?action=history&amp;feed=atom&amp;title=If_Anyone_Builds_it_Everyone_Dies"/>
	<link rel="alternate" type="text/html" href="https://wiki.opensourceecology.org/index.php?title=If_Anyone_Builds_it_Everyone_Dies&amp;action=history"/>
	<updated>2026-04-25T16:37:39Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.39.13</generator>
	<entry>
		<id>https://wiki.opensourceecology.org/index.php?title=If_Anyone_Builds_it_Everyone_Dies&amp;diff=320901&amp;oldid=prev</id>
		<title>Marcin: Created page with &quot;=About=  Book on SAI by Yudkowsky.  =Notes= *Experts estimate 1 in 6 chance of human extinction from AI today. =For= *We don&#039;t understand it *Reasonable speculation *Comparison to nuke annihilation - 1% in the next 100 years. =Against= *AI abstraction can understand it? Some way to understand via abstraction.&quot;</title>
		<link rel="alternate" type="text/html" href="https://wiki.opensourceecology.org/index.php?title=If_Anyone_Builds_it_Everyone_Dies&amp;diff=320901&amp;oldid=prev"/>
		<updated>2026-03-07T18:52:29Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot;=About=  Book on SAI by Yudkowsky.  =Notes= *Experts estimate 1 in 6 chance of human extinction from AI today. =For= *We don&amp;#039;t understand it *Reasonable speculation *Comparison to nuke annihilation - 1% in the next 100 years. =Against= *AI abstraction can understand it? Some way to understand via abstraction.&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;=About=&lt;br /&gt;
&lt;br /&gt;
Book on SAI by Yudkowsky.&lt;br /&gt;
&lt;br /&gt;
=Notes=&lt;br /&gt;
*Experts estimate 1 in 6 chance of human extinction from AI today.&lt;br /&gt;
=For=&lt;br /&gt;
*We don&amp;#039;t understand it&lt;br /&gt;
*Reasonable speculation&lt;br /&gt;
*Comparison to nuke annihilation - 1% in the next 100 years.&lt;br /&gt;
=Against=&lt;br /&gt;
*AI abstraction can understand it? Some way to understand via abstraction.&lt;/div&gt;</summary>
		<author><name>Marcin</name></author>
	</entry>
</feed>